Milestone Reached in Effort to Measure Effective Education Technology


Audrey Breen

The EdTech Evidence Exchange’s new system will enable teachers nationwide to share what technology works where and why.

Though education technology was thrust into the national spotlight when the coronavirus forced millions of students to learn virtually, the U.S. was already spending between $26 billion and $41 billion per year on edtech prior to the pandemic. Yet to date, there is very little evidence clarifying which education technology products are effective and where they are likely to succeed. Currently, a significant amount of technology is misused, used to little effect, or not used at all, wasting precious resources.

To address this issue, the nonprofit EdTech Evidence Exchange and the University of Virginia School of Education and Human Development launched the EdTech Genome Project, a multi-year initiative designed to better understand school and district edtech implementation contexts. Ultimately, the EdTech Evidence Exchange and School of Education and Human Development aim to put an end to the billions wasted on ineffective technology by helping educators make more informed decisions about education technology and fulfill the potential of edtech as a means to help more students succeed.

A new report published last month marked a major milestone in this ambitious endeavor.

“Before we focused our efforts on education technologies themselves, we needed to map the education contexts where technologies are used,” said Emily Barton, research assistant professor at the UVA School of Education and Human Development, a leading researcher on the EdTech Genome Project. “This report is the culmination of several years of iterative efforts to define and measure key variables of school and district contexts likely to influence the success of technologies.”

A collaborative effort of representatives from across the education sector, the report announces the creation of the EdTech Context Framework, which identifies and defines the ten variables most likely to influence the selection and implementation of education technology. The variables were chosen by consensus among a diverse cross-section of education researchers, nonprofit leaders, and educators themselves. A sample of the variables the team ultimately crafted into the framework includes technology selection processes, professional learning, strategic leadership support, and staff culture. The report also includes sample items from the EdTech Context Inventory, which will measure each of the ten variables.

“Our hope with these tools is to understand where educators are using edtech and help educators reflect on their own implementation contexts,” Barton said. “If a school has great success with a specific technology it is important to understand the context. For example, a technology may be quite effective when teachers feel positively about it and the technology is not in competition with other new initiatives, but not as effective when teachers are skeptical about the value and other new initiatives are competing for teachers’ attention.”

“For too long, the education sector has needed someone to do the hard work of documenting and understanding what technology actually works, where, and why,” said Bob Pianta, dean of the School of Education and Human Development. “That’s why the EdTech Genome Project is such a critical endeavor. This is about bringing together the education sector, for the first time ever, to ensure that educators around the country have the information they need to make the right decisions about what technology tools will work best for them and their students.”

Creating the EdTech Context Framework and Inventory to describe and measure these contexts was an intensely collaborative effort. Over 140 researchers, practitioners, policymakers, and industry representatives participated in the EdTech Genome Project. A central steering committee initially selected ten variables, and ten working groups each tackled one variable in depth. According to Barton, one unique element of this process was the iterative engagement of the diverse set of stakeholders at each step in the process.

“These tools were built by the education field for the education field,” Barton said. “A national steering committee and the working groups made substantive content decisions to select and define the variables based on the literature and their professional expertise. We also gathered feedback at each step in the process, listening to educators, and then returning to refine our work with the steering committee and working groups.”

The report includes a thorough explanation of the step-by-step process.

After each working group finalized the name and definition of its variable, it began a second phase of work: creating an instrument to measure the presence and nature of that variable. Anecdotal accounts of teacher involvement in decision-making are helpful, but systematically documenting the extent to which diverse educator perspectives shape the selection of a new technology is even more valuable. The measures are designed to capture exactly that.

“For example, we heard from education leaders their desire for help in taking things off their teachers’ plates due to their limited bandwidth,” Barton said. “Now, we can evaluate the extent to which a district has leadership that is conscious of this problem.”

To date, the research team has piloted the measures, making revisions based on substantive feedback from educators. According to Barton, they are in the process of a second round of instrument revisions.

Though the pandemic has delayed the release of the full instruments, the team will continue to validate the instruments this school year.

“Our focus right now is to measure the presence of these variables in schools and validate the instruments for predominantly in-person implementation contexts,” Barton said. “Instead of asking teachers to remember their experiences pre-COVID, we plan to ask educators about their current contexts this fall and winter.”

Barton is excited about what is coming next for the EdTech Genome Project.

The EdTech Context Framework and Inventory will serve as the structure for the new EdTech Evidence Exchange Platform. The platform will guide educators to systematically document their contexts, as well as their insights and experiences selecting and implementing education technology. The platform will match district implementation contexts based on the ten variables and key demographic features, giving educators an opportunity to learn from the successes and failures of others across the country working in similar schools or districts. Ultimately, these insights can support schools’ effective use of edtech, leading to improved learning outcomes and potentially saving school divisions significant funding.

The EdTech Evidence Exchange and School of Education and Human Development are launching an ambitious data collection effort during the 2021-2022 school year to systematically document the use of education technology in mathematics in three states across the country. Researchers are aiming to collect data on the experiences of up to 10,000 educators.

“In addition to producing the tools, a goal of the EdTech Genome Project was to start a conversation about the importance of context for edtech implementation using a shared language,” Barton said. “We have been successful in reaching that goal, and we’re excited about the impact of the next phases of the project. As a researcher, I’m excited to see how what we created will spur much needed additional research to support educators’ decisions about education technology.”