
Microgrants

LIVE Microgrants are currently closed. More details on the next microgrant cycle will be released soon. This is a great time to start developing new ideas and forming new collaborations at the intersection of learning and technology.

LIVE microgrants are designed to promote faculty and student research on consequential learning technology, to help build the LIVE community, and to support the development of partnerships with community and corporate partners. Grants have a quick turnaround: awards are made within two months of submission, projects are expected to produce results within one semester, and the maximum project timeframe is one year.

All grant recipients are expected to report the results of their project at our LIVE project day presentations (students are welcome to be presenters), to contribute to the LIVE community by attending LIVE events during the year, and, in the case of technology development projects, to document technological innovations. We strongly encourage follow-on grant proposals and continued partner interactions based on these projects. LIVE research, development, and pre-award grant support will be available to all microgrant recipients.

The four microgrant types support complementary aims, so a microgrant application can combine more than one project type; in such cases the total award may be the sum of the applied-for grant types. For example, a Research Project Grant that includes LIVE tool development has a maximum funding amount of $3,500 ($2,000 + $1,500).


Microgrant Types

Research Project Grant ($2,000 maximum). Research projects should focus on the interaction between technology and learning, broadly construed. Typically, these will be small faculty-led projects in which pilot data or proof-of-concept technology can help to attract larger federal awards. Funds can be used for the costs of running participants, the purchase of specialized technological resources, or hourly pay for student assistants. Application


Partner Grant ($2,000 maximum). For partner development grants, applicants should consult with LIVE before applying to ensure effective coordination of partner relationships. Partner development grants must include a commitment letter. The letter should clearly articulate the expectations for the partner’s contribution towards the project. Application


LIVE Tool Development Grant ($1,500 maximum). Tool development grants are intended to develop LIVE technical capabilities and support the formative use of LIVE-developed software. Characteristic uses of these funds are a) hourly pay for students and/or staff to adapt LIVE software for a project, b) funds to purchase or extend existing LIVE hardware resources, and c) funds to develop generalized procedures, or procedures in a novel domain, for the application of learning technology developed by LIVE.

In cases where graduate students serve as development partners, it is very important to consult with the contact person listed for the relevant LIVE tool BEFORE submitting a microgrant application, to ensure that the applicant understands the tool and that the project is plausible within the constraints of the support provided. Application


LIVE Tool Development Student Grant ($1,500 maximum). Graduate students (PhD or Masters) working on student-led research projects, or undergraduate students doing Thesis, Immersion, or VUSRP summer projects, can apply for funding to use or adapt LIVE tools for research projects. Funding can support hourly pay for LIVE staff or students to adapt LIVE tools for projects, payment of in-person or online participants to complete experiments or technology evaluations, and additional software tools and resources.

It is very important to consult with the contact person listed for the relevant LIVE tool BEFORE submitting a microgrant application, to ensure that the applicant understands the tool and that the project is plausible within the constraints of the support provided. Application


Application Process

Applications that involve LIVE tools should start with a consultation with the listed contact person to verify the plausibility of the project and to give the proposer a clear idea of the tool's capabilities. Applications are due Oct 3, 2022, and funding decisions will be made by Oct 14, 2022. LIVE project day will be in April 2023.

Applications include the following sections, each of which is in the forms linked above: background, project plan, budget, and expected products. Products can include research publications, grant applications, technology development, user evaluations, and partner relationships.


LIVE Tool Development Applications. These applications should include a brief email from the tool development partner verifying that the applicant has consulted with the partner, and that the partner agrees to help with the project in return for hourly pay within the limits afforded by the project budget and, in the case of partners who are graduate students, within the limits set by the partner's home department and educational program. Typical hourly pay for graduate student tool development partners ranges from $30 to $40 per hour.

Student Applications. Student applications must include an email from a faculty immersion, thesis, or lab supervisor stating that they will support the project, help the student prepare for the LIVE spring project day, and prepare a project report for the LIVE tool archive.


LIVE Tools

  1. NetsBlox. A block-based educational programming environment designed to give novice programmers access to advanced CS concepts such as distributed computing and the Internet of Things (IoT). Student programs have instant access to a wide range of online data and services, such as Google Maps; climate, weather, financial, movie, and music data; plotting; and cloud storage, among many others. Programs running on different computers can communicate with each other, enabling social applications like chatrooms and collaboration tools as well as online multiplayer games. The goal is to motivate and engage students with relevant and interesting programming projects. Contact: Akos Ledeczi (akos.ledeczi@vanderbilt.edu)
  2. PhoneIoT. A mobile app for iOS and Android that gives NetsBlox programs running on a computer access to the mobile device’s sensors and screen. Basically, PhoneIoT turns a phone into a configurable, programmable IoT device for students to experiment with. Sensor values can be requested on demand by a NetsBlox program, or they can be streamed. The phone screen can be configured to display buttons, images, text, and more advanced elements such as soft joysticks. An example use case is turning the phone into a remote controller for games running on the computer. Contact: Devin Jean (devin.c.jean@vanderbilt.edu)
  3. RoboScape Online. A 3D robot simulator that multiple users can join remotely. Users can program their robots through NetsBlox and collaborate or compete in the shared space regardless of their physical location. Robot control programs can also communicate with each other via NetsBlox, enabling collaborative robotics. RoboScape Online supports advanced simulated robots with sensors, such as GPS and lidar, that would be too expensive to have in schools. Contact: Gordon Stein (gordon.stein@vanderbilt.edu)
  4. PyBlox. A Python Integrated Development Environment (IDE) that preserves as much of the NetsBlox IDE as possible, including sprites, scripts, the concurrency model, and all NetsBlox features related to distributed computing. It has syntax highlighting and autocompletion as well as context-sensitive help and documentation. The configurable block palette lets users drag and drop blocks that turn into the corresponding Python code. PyBlox is designed to ease the transition from block-based to text-based programming. Contact: Devin Jean (devin.c.jean@vanderbilt.edu)
  5. Online video response-collection software. It is relatively easy to show online participants videos and collect their responses after a video, but collecting responses during a video is technologically challenging to do reliably and is not part of most online experiment-building software. This software allows participants to give responses time-locked to specific video frames in the context of Qualtrics questionnaires. Responses can be keypresses or button presses on customizable graphic objects. Responses can be integrated back into the experiment, affecting how upcoming stimuli are presented. Additional features, such as response timelines, visual feedback, and alerts, can be added to enhance the user experience. Programming languages, services, and toolkits: Vimeo, Qualtrics, JavaScript, HTML, CSS. Contact: Madison Lee (madison.j.lee@vanderbilt.edu)
  6. Betty’s Brain. Betty’s Brain is a learning-by-teaching environment in which middle school students learn about scientific processes (e.g., climate change, human body thermoregulation) by constructing causal models. They do this in the guise of teaching an agent, generically called Betty. Currently, Betty’s Brain is being redeveloped as a web-based environment under an OECD project. Students and teachers around the world will use Betty’s Brain as an example of how one can learn in a digital world, by looking up reliable scientific resources and simulation tools and learning from them to construct causal models of chosen scientific processes. Student projects can include contributing to the development of more conversational feedback to support student learning; the use of embodied conversational agents for more natural dialog between human students and virtual agents that can play multiple roles, such as mentors, peers, and teachable agents; and the use of machine learning and advanced analytics to study students’ learning and self-regulation processes in order to support their learning. (Gautam Biswas will direct interested applicants to a contact person)
  7. C2STEM and SPICE. Collaborative, Computational STEM (C2STEM) and Science Projects Integrating Computation and Engineering (SPICE) are two related projects built on a common web-based platform, NetsBlox, which uses a block-structured programming environment to help students construct computational models of scientific processes while simultaneously learning primary computing concepts and practices (Computational Thinking, or CT). In both systems, we are improving the front-end (student-facing) system to make it easier for students to access web resources that help them build models, adding visualization and plotting tools to help students understand the behaviors generated by their models, and adding intelligent agents that can interact with students in a conversational manner to support their learning and problem-solving tasks. (Gautam Biswas will direct interested applicants to a contact person)
  8. Analysis of Multimodal Data. In all of our educational projects directed toward K-12 STEM education, and in training projects directed at workforce learning (e.g., nurse training, soldier training), we are collecting multimodal data using video, microphones, log data from computer screens, and eye tracking to understand how learners work together to learn and solve problems in our simulation-based or mixed-reality environments. Computer Science students have the opportunity to work with our team on developing and refining deep learning algorithms for multimodal data analysis. Across various projects, we are developing a multimodal data analysis pipeline to support online data collection and analysis across multiple systems. (Gautam Biswas will direct interested applicants to a contact person)
  9. GEM-STEP. A scripting language that allows users to create agent-based models controllable by either tag-based (UWB) or vision-based tracking systems. Users can define agents, their properties, and agent interactions to model most scientific systems (and presumably non-scientific systems as well). (Noel Enyedy will direct interested applicants to a contact person)
  10. Group-Based Cloud Computing (GbCC). Support for collaborative modeling at the classroom level. The GbCC system integrates the NetLogo agent-based modeling environment and the GeoGebra dynamic mathematics environment into an easy-to-author, easy-to-deploy tool for classroom activity design. The communication features of GbCC allow interaction patterns that include (a) real-time interaction in a shared model space (permitting Participatory Simulations, distributed mathematical enactment activities, etc.); (b) collective explorations, in which students pursue individual explorations and publish their results (including full model state) to a shared and interactive gallery; and (c) a variety of structures in between (e.g., aggregating models across two or more students, joining subsets of the class in real-time interaction, etc.). GbCC is open source, uses Node.js, and deploys easily to Heroku. I have given grad students a half-day workshop that enabled them to turn any NetLogo model into a GbCC model and deploy it on their own Heroku account. Contact: Corey Brady (corey.brady@vanderbilt.edu)
  11. Pozyx Extension for NetLogo. The Pozyx real-time location system (RTLS) gives high-accuracy real-time locations of wireless tags. The Pozyx extension for NetLogo connects this data stream to the NetLogo agent-based modeling environment. As a result, it is possible to set up agent-based models in which students’ movements are incorporated in real time as inputs to the model. A common strategy is to map a student to a NetLogo agent, but there are many other possibilities. This system is featured in two 2022 conference publications describing its use with 9th graders from SSMV, who designed activities with the system for use by middle schoolers in the Day of Discovery program. Contact: Corey Brady (corey.brady@vanderbilt.edu)
  12. NetLogo.  Though NetLogo is distributed and maintained by the Center for Connected Learning at Northwestern, I (CB) am very familiar with the platform and have created multiple extensions to enable research using NetLogo.  These include support for Arduino microcontrollers, Bluetooth, 2.5D rendering of agent worlds, and the Pozyx system mentioned above.  NetLogo’s power and flexibility make it a good platform for PhD student projects.  I have used NetLogo with our secondary teacher education students for the past several years, with units on agent-based modeling in Human Geography (Social Studies) and Scientific Modeling (Science) as well as regular use in Math Visualization (Mathematics) courses.  NetLogo is easy to learn and powerful for modeling complexity. Contact: Corey Brady (corey.brady@vanderbilt.edu)
  13. Gallery Server for persistent publishing to a shared space. The Gallery Server can be adapted to support any application that can communicate over the web and load its state (or other information) via a post message. There are currently implementations for GeoGebra Web, for NetsBlox, for NetLogo, and for other purpose-built web apps that I have created. This gallery differs from GbCC in that (a) it is persistent (whereas GbCC data lasts only for the session); (b) it does not require authoring; but (c) it is less flexible, permitting only publishing, loading, and commenting, with no real-time communication between students’ software. The gallery is used, via the GeoGebra connection, by many students who have graduated from our Secondary Ed program in Math, and the NetsBlox connection is featured in a 2022 article. Contact: Corey Brady (corey.brady@vanderbilt.edu)
  14. Wearable, hardware-hackable, programmable computer “badges”. In partnership with Parallax Inc, a leading robotics company, we created these flexible “badges” for research. The badges have an 8-core CPU (Parallax “Propeller”), WiFi communications, infrared communications for peer-to-peer connections, an OLED screen, tricolor LEDs, buttons, audio out, and more. They are powered by rechargeable batteries and can be worn as badges on a lanyard. As Vanderbilt’s WiFi begins to open up to IoT devices, it will become increasingly easy to develop and deploy badge applications and activities. Contact: Corey Brady (corey.brady@vanderbilt.edu)
  15. Teacher Observation Tools (TOTs).  This environment was developed to support construct-based teacher observation (akin to standards-based observation) in a project with Rich Lehrer and Leona Schauble.  It is a webapp that supports collecting and annotating student artifacts, classifying them as evidence of the students’ development along the milestones indicated by the “construct map.”  The current project is focused on elementary school and the development of the Math of Measurement (Length, Angle, Area, Volume, and rational number constructs related to these).  But the construct map can be “swapped” out, so that TOTs could support any project with a standards-like hierarchy of observation-relevant standards.  Making the swap and deploying a server with the new structures would be straightforward but would require some server administration work and a small amount of development, for each new structure and server. Contact: Corey Brady (corey.brady@vanderbilt.edu)
  16. Interactive Voice Coach (more purpose-built). A tool for presenting and visually editing text-to-speech (TTS) performances. Beginning from a computer-generated TTS rendering, the student can modify phonemes, pacing/prosody, and pitch. The Voice Coach has been built for modular expansion, so student projects to expand it may be feasible. Contact: Corey Brady (corey.brady@vanderbilt.edu)
  17. MathNet (more purpose-built). A tool for pair and small-group work in mathematics. Mathematical objects, tools, or roles are distributed across the members of small groups. For instance, one activity has pairs of students, each controlling one geometric point; together they control a line. The activity asks them to create lines that have given properties. Aggregation at both the small-group level and the whole-class level is supported. Contact: Corey Brady (corey.brady@vanderbilt.edu)
  18. Sweeping Area (more purpose-built). A tool for the dynamic generation of area for elementary schoolers. Students can SWEEP line segments to create rectangles or parallelograms. They can dissect the result, rearranging areas to see arithmetic relations. There are also settings that allow creating an initial figure with a “geoboard” interface, enabling a broader exploration of dissection-based geometry (a key Greek approach). Contact: Corey Brady (corey.brady@vanderbilt.edu)
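To give applicants a feel for the frame-locking idea behind the online video response-collection software (item 5 above), here is a minimal sketch of the core arithmetic: converting a response timestamp into a video frame index and testing it against a target window. The function names and the 30 fps frame rate are illustrative assumptions, not part of the LIVE software; in a real Qualtrics deployment these helpers would be called from keypress or button handlers using the video player's current playback time.

```javascript
// Convert a playback timestamp (seconds) to a frame index.
// The 30 fps default is an illustrative assumption.
function timeToFrame(seconds, fps = 30) {
  return Math.floor(seconds * fps);
}

// Record a response time-locked to the current video frame.
// In the browser this would be called from a keydown or click
// handler with the player's current time; here it is a pure function.
function recordResponse(responses, key, currentTime, fps = 30) {
  responses.push({ key: key, frame: timeToFrame(currentTime, fps), time: currentTime });
  return responses;
}

// Check whether a response landed inside a target frame window,
// e.g. to decide how an upcoming stimulus should be presented.
function inWindow(response, startFrame, endFrame) {
  return response.frame >= startFrame && response.frame <= endFrame;
}

// Example: a keypress at 2.5 s falls on frame 75 at 30 fps.
const responses = recordResponse([], "f", 2.5);
```

This only illustrates the time-to-frame bookkeeping; the hard part the tool solves, reliable timing inside an embedded Vimeo player within Qualtrics, is not shown here.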
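A core step in tracking-driven tools such as the Pozyx Extension for NetLogo and GEM-STEP (items 9 and 11 above) is mapping a tag's physical position in the room onto the model's coordinate system. The sketch below shows that linear mapping in isolation; the room dimensions, world bounds, and function name are made-up illustrations, not actual Pozyx or NetLogo values.

```javascript
// Map a physical position in a tracked room (meters) onto a
// model world's coordinate bounds via a linear transform.
// Room size and world bounds are illustrative assumptions.
function roomToWorld(xMeters, yMeters, room, world) {
  const scaleX = (world.maxX - world.minX) / room.width;
  const scaleY = (world.maxY - world.minY) / room.height;
  return {
    x: world.minX + xMeters * scaleX,
    y: world.minY + yMeters * scaleY,
  };
}

// Example: a 16 m x 8 m room mapped onto a world spanning -16..16
// in both axes, so the room's center lands at the world's origin.
const room = { width: 16, height: 8 };
const world = { minX: -16, maxX: 16, minY: -16, maxY: 16 };
```

In a system like the Pozyx extension, a transform of this kind would run on each incoming tag update so that a student's movement drives the corresponding agent's position in real time.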