In the first three parts of our series, we discussed how high-quality edtech research demonstrates a product's impact on student outcomes and guides evidence-based decisions about data-driven solutions and product adoption. Beyond outcomes research, a product's research and development team should also continually seek data to help refine and iterate on their products.
By harnessing the power of data, we can create more effective, engaging, and scalable edtech solutions, ultimately leading to better educational outcomes for all learners. In this article, we discuss how ExploreLearning uses performance data and user feedback to continuously improve its award-winning programs, which are built on research-based instructional strategies.
Importance of data in edtech tools and development
Educational research must drive not only the development of edtech tools but also their ongoing improvement. A critical component of that research is gathering information about authentic experiences with educational technology through user research. A data-driven approach allows edtech companies to measure and analyze student performance in real time, which is vital for understanding how different students learn and which instructional methods are most effective.
By analyzing how users interact with their tools, companies can identify what works well and what needs improvement. In addition, regular feedback loops involving user surveys, focus groups, and interviews provide vital information about user experience, ensuring that the product evolves in line with user needs. Together, these insights drive the iterative product improvement process, leading to more effective and user-friendly edtech tools.
Interpreting student data for product enhancement
A great deal of ongoing, behind-the-scenes work by our Product Development, Research, and Data and Analysis teams goes into interpreting student usage data in ways that drive meaningful product enhancements, improving both the user experience and student learning outcomes. We identify key engagement metrics, such as session duration, frequency of use, and interaction with specific features, that give us a window into the student experience.
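As a rough illustration of what such metrics can look like in practice, the sketch below computes session duration, frequency of use, and feature-interaction counts from a hypothetical usage-log export. The file name, column names, and pandas-based approach are assumptions made for this example only, not a description of ExploreLearning's actual data pipeline.

```python
import pandas as pd

# Hypothetical usage-log export: one row per student session, with columns for
# student and session IDs, session start/end timestamps, and the feature used.
# File name and column names are illustrative, not an actual ExploreLearning schema.
sessions = pd.read_csv(
    "usage_log.csv", parse_dates=["session_start", "session_end"]
)

# Engagement metric 1: session duration in minutes
sessions["duration_min"] = (
    sessions["session_end"] - sessions["session_start"]
).dt.total_seconds() / 60

# Aggregate per student: duration, frequency of use, and breadth of feature interaction
engagement = sessions.groupby("student_id").agg(
    avg_session_minutes=("duration_min", "mean"),
    sessions_per_student=("session_id", "nunique"),
    distinct_features_used=("feature", "nunique"),
)

print(engagement.describe())
```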
We also constantly analyze the effectiveness of the technology behind our adaptive programs to ensure that the problems students are asked to solve offer just the right level of challenge, decreasing frustration and increasing motivation.
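To make the idea concrete, here is a minimal, hypothetical sketch of a difficulty-adjustment rule that keeps a student's recent success rate within a target band. The function name, thresholds, and logic are illustrative assumptions, not the adaptive engine behind our programs.

```python
def adjust_difficulty(current_level: int, recent_correct: list[bool],
                      target_low: float = 0.6, target_high: float = 0.8) -> int:
    """Nudge problem difficulty so a student's recent success rate stays in a
    target band: hard enough to challenge, easy enough to limit frustration.

    A simplified, hypothetical rule for illustration only; it is not the
    algorithm used in ExploreLearning's adaptive programs.
    """
    if not recent_correct:
        return current_level
    success_rate = sum(recent_correct) / len(recent_correct)
    if success_rate > target_high:        # too easy: raise the challenge
        return current_level + 1
    if success_rate < target_low:         # too hard: lower the challenge
        return max(1, current_level - 1)
    return current_level                  # in the target band: hold steady

# Example: 7 of the last 10 answers correct keeps the student at level 3.
print(adjust_difficulty(3, [True] * 7 + [False] * 3))
```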
Collecting and analyzing user feedback
However, data patterns alone can only take us so far in understanding the teaching and learning experience. Gathering user insights is an important next step that allows us to create data-driven technology that addresses teachers' most important needs and preferences.
User surveys, focus groups, 1:1 interviews, and classroom observations are routinely conducted to learn more about teachers’ and students’ experiences.
Continuous iteration and innovation
We thoughtfully analyze and act on the data to improve our products and support all learners. For example, user data and teacher focus groups led to a recent change to the Gizmos STEM Cases heatmap. This tool allows teachers to see student progress and responses in real time and assess their students' knowledge and skill growth over time. The upgraded heatmap now better reflects the ungraded nature of conversational questions.
Student responses to these questions offer valuable insight into the learning process without needing to be scored. The questions engage students in critical thinking and help them connect new information to prior knowledge, and the update better reflects how teachers typically use them in the classroom: as prompts for meaningful conversations during learning rather than as graded assessments.
In another instance, a suggestion from a teacher on the Gizmos Support Form led to a complete revision of our Cell Structure Gizmo, including everything from the supplementary materials to the assessment. Every user comment and suggestion is carefully considered.
Role of the Collab Crew
Integrating edtech tools into the classroom can bring challenges, and educators understand what it takes to make these resources work for teachers and support students. To connect with users more easily, we created a co-design and feedback program called the Collab Crew.
Collab Crew members have the opportunity to work 1:1 with our Researchers, Product Managers, and Learning Designers to help develop new edtech product concepts from the ground up, test them with students in authentic settings, and provide regular feedback about classroom experiences.
In return for their critical feedback, Crew members get early access to new content, a direct channel for product input, and other classroom rewards. Collab Crew members directly impact the future of our math and science products and help us connect to user needs from the earliest stages of product design.
Future directions for ExploreLearning
Partnering with academic institutions and vendors in the educational technology industry is critical for driving innovation and building knowledge about how students learn in the classroom. Using feedback from those collaborations, providers can ensure their edtech tools are more accessible and engaging for all students.
As we grow our product line to meet the current needs of teachers and students and to tackle the toughest problems in math and science education, we continue to rely on user input in the design and improvement of our edtech tools. Through our collaborations with districts, schools, teachers, and external research partners, we are excited to keep supporting serious fun in K-12 classrooms.
Find out how one teacher partnered with ExploreLearning’s Collab Crew to provide input for Gizmos STEM Cases.
What about you? Are you ready to participate in behind-the-scenes research and product development? Just click to learn the details.
About the Author
ExploreLearning Senior Researcher Megan Conrad, Ph.D.
Dr. Megan Conrad, Senior Researcher for ExploreLearning, shares her insights in our Measuring Impact Series. In her current role, she works with district administrators, curriculum coordinators, and teachers to uncover evidence of student success from product usage and helps districts make evidence-based decisions regarding product implementation.
She earned her Ph.D. in Psychology (specializing in Developmental and Cognitive Science) from Rutgers University and has 15 years of experience designing and conducting research in child development, learning environments, and youth & technology. She previously worked in higher education as an Assistant Professor of Psychology and founded a research lab that explored children’s STEM learning in informal learning settings.