EdTech Impact Buyers’ Guide 2025 – Part 2: Evaluating Evidence

This is the second instalment in our five-part series designed to help schools take a strategic, evidence-informed approach to selecting and implementing education technology.

Having defined your EdTech strategy (Part 1 here), the next step is learning how to critically assess the evidence behind EdTech solutions – so you can make confident, context-aware decisions.

In a fast-growing and competitive EdTech marketplace, claims about impact are everywhere — but robust, contextualised evidence is often harder to find. What works in one school may not work in another, and products used by pathfinders or enthusiasts may have different outcomes to those used by other staff and students.

This article will guide you through how to approach product claims critically, ask the right questions, and ensure your decision-making is informed by more than just marketing.

Understanding the Types of Evidence

Not all evidence is created equal. When reviewing EdTech products, it’s important to understand the different forms that evidence can take – and the strengths and limitations of each.

It’s also worth noting that perceptions of what constitutes “strong” or “reliable” evidence are often shaped by standards from outside the education sector – such as medicine or technology. While these frameworks can be helpful, they don’t always account for the complexity and variability of educational settings. That’s why it’s essential to consider evidence within its educational context, where factors like pedagogy, learner diversity, and implementation conditions significantly affect outcomes.

Below, we break down the main categories of evidence you’re likely to encounter, so you can better assess how much weight to give them in your decision-making.

Anecdotal evidence

Examples: User Testimonials, Reflections, Opinions

Anecdotal evidence is the most readily available type of evidence. It is typically based on impressions and informal observations, like those found in blog posts, product endorsements, promotional videos, and personal recommendations.

This form of evidence rarely provides enough contextual information to robustly attribute success to a specific action, strategy, or product.

Descriptive evidence

Examples: Surveys, Case Studies, Interviews, Recorded Observations

Descriptive evidence offers narrative descriptions or snapshots of conditions at a specific point in time. In academic contexts, descriptive evidence can be insightful, as it acknowledges the importance of context and that tools may be used differently by individual teachers and students.

However, descriptive evidence is frequently repurposed for marketing, meaning it can lack sufficient context about the product, school, or users involved. As a result, while it’s a step above anecdotal evidence, it may still omit critical details.

Correlational evidence

Examples: Comparative Studies, Randomised Controlled Trials (RCTs), Data Analytics

Correlational evidence is used to explore relationships between two or more variables – such as product usage and changes in attainment, engagement, or workload.

It helps identify patterns and can suggest where impact may be occurring, but it does not confirm causation: changes in one variable can occur independently of, or even in spite of, the other. For example, a positive correlation between use of an app and increased teacher satisfaction indicates that the two are related; it cannot tell you that one caused the other.
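To make that limitation concrete, here is a minimal, hypothetical Python sketch (the scenario and numbers are invented for illustration, not drawn from any real study). A hidden school-level factor – overall resourcing – drives both app usage and teacher satisfaction, producing a strong correlation between the two even though neither causes the other:

```python
import random

random.seed(42)

# Hypothetical scenario: a hidden school-level factor (overall
# resourcing) drives BOTH app usage and teacher satisfaction.
# Neither variable causes the other, yet they correlate strongly.
schools = []
for _ in range(200):
    resourcing = random.uniform(0, 1)               # hidden confounder
    app_usage = resourcing + random.gauss(0, 0.1)   # tracks resourcing
    satisfaction = resourcing + random.gauss(0, 0.1)
    schools.append((app_usage, satisfaction))

def pearson(pairs):
    """Plain Pearson correlation coefficient, no libraries needed."""
    n = len(pairs)
    xs, ys = [p[0] for p in pairs], [p[1] for p in pairs]
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in pairs)
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Prints a strong correlation (around +0.9) created entirely
# by the hidden third variable, with no causal link at all.
print(f"correlation: {pearson(schools):.2f}")
```

The same pattern plays out in real evaluations: enthusiastic, well-resourced schools tend both to adopt tools and to report good outcomes, which is exactly why correlational findings need careful reading.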

Randomised Controlled Trials (RCTs) are a more formalised type of correlational study. They aim to control for confounding variables by randomly assigning participants to intervention or control groups. In clinical settings – where environments are highly controlled – RCTs are often considered the gold standard for establishing causality.

In education, however, the picture is more complex. RCTs in this context often can’t account for the full range of factors influencing teaching and learning — such as teacher beliefs, implementation quality, student-teacher relationships, or school culture. Studies may also face practical constraints like limited resources, smaller sample sizes, or time pressures.

This is why, in EdTech and educational research more broadly, RCTs are best understood as a form of correlational evidence. While they can suggest strong associations and help reduce bias, they typically don’t reach the same level of rigour as clinical trials, and their findings should be interpreted with context in mind.

For schools and trusts, the key is to ask how closely the conditions of the study match your own – and to avoid assuming that an RCT automatically means a product will deliver the same results in your setting.

The Rising Influence of Peer Reviews 

Peer reviews have become a critical component in today’s e-commerce world, offering authentic, experience-based insight into how a product performs in real classrooms, how it evolves over time, and how it compares to alternative options.

On EdTech Impact, we have published over 10,000 independent peer reviews on education technology solutions. Using structured impact metrics developed in collaboration with University College London, the platform prompts reviewers to go beyond simple ratings and comments – encouraging them to reflect critically on key areas a product aims to improve.

This approach captures valuable perceptions of impact and how products are being used across different school contexts.

Structured review prompts help draw out rich insights across key impact areas.

Reviews on EdTech Impact can be filtered by school type and phase. After all, a review from a small school in the countryside may not be that helpful if you work in a large inner-city school!

Trialling a product in your own context

While existing evidence can guide your decision on whether a solution might work, the next step is to test it in your own setting. A useful way to do so is through a trial period. 

If you adopt a product for a trial period, make sure you have a clear intention for its use that aligns with your school priorities, and a clear plan for how you will use it. This will make it easier to think about how, when and where you will use the product, and whether it has met your needs once the trial period ends.

An important consideration is how easy it will be to stop using a product after the trial period. Are lesson plans or student access reliant on the product, even though it doesn’t fully satisfy your goals?

Though most trials last 30 days, many suppliers will extend the trial period if you are willing to share your feedback with them. Just beware of online trials that require a credit card or financial sign-up, as they may automatically convert to a paid monthly or annual contract at the end of the trial!

Consider using the PICRAT framework

Developed by Royce Kimmons and colleagues in 2020, PICRAT combines two dimensions of technology integration: the student’s relationship with the technology (passive, interactive, creative) and the pedagogical impact of the technology (replacement, amplification, transformation). This model provides a structured way for leaders and educators to reflect on how technology is being used and where improvements could be made.

For example, you might consider whether a tool simply replaces traditional methods with digital ones, or whether it transforms the learning experience by enabling tasks that were previously impossible. By evaluating activities on both axes, you can identify small, achievable steps to move from a lower quadrant (e.g. passive-replacement) to a higher one (e.g. interactive-amplification), encouraging continuous improvement.
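As a concrete sketch of how you might record this during a trial, the short Python snippet below models the two PICRAT axes and tags example classroom activities with a quadrant label. The activities and code are purely illustrative assumptions for this guide; they are not part of Kimmons and colleagues’ framework itself:

```python
from enum import Enum

class StudentRole(Enum):
    """The student's relationship with the technology (P-I-C axis)."""
    PASSIVE = 1
    INTERACTIVE = 2
    CREATIVE = 3

class PedagogicalImpact(Enum):
    """The technology's effect on practice (R-A-T axis)."""
    REPLACEMENT = 1
    AMPLIFICATION = 2
    TRANSFORMATION = 3

def picrat_label(role: StudentRole, impact: PedagogicalImpact) -> str:
    """Combine the two axes into a quadrant label, e.g. 'PASSIVE-REPLACEMENT'."""
    return f"{role.name}-{impact.name}"

# Hypothetical trial log: classify each observed activity on both axes.
trial_log = [
    ("Students watch a slideshow version of the worksheet",
     StudentRole.PASSIVE, PedagogicalImpact.REPLACEMENT),
    ("Students answer adaptive quiz questions with instant feedback",
     StudentRole.INTERACTIVE, PedagogicalImpact.AMPLIFICATION),
    ("Students build and publish their own interactive simulations",
     StudentRole.CREATIVE, PedagogicalImpact.TRANSFORMATION),
]

for activity, role, impact in trial_log:
    print(f"{picrat_label(role, impact):30s} {activity}")
```

Logging trial activities this way makes it easy to see at a glance whether use of a product is clustering in the passive-replacement corner, and where the next incremental step might be.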

By recording your own experiences and applying the PICRAT framework, you can better determine how a solution fits your school’s specific needs and goals. This approach demystifies technology integration and makes it easier to focus on meaningful, incremental changes that improve learning, rather than adopting flashy but ineffective tools. For more information on using PICRAT as an evaluation tool, visit here.

[Image: PICRAT framework diagram]

Be wary of (very new) AI startups

While many new AI companies offer exciting innovations, it’s important to evaluate their long-term goals.

When considering a startup, dig deeper into their vision and long-term commitment to education. Make sure their priorities are aligned with improving the student experience, and be wary of companies whose primary goal may be to attract venture capital and scale or sell quickly – leaving you with a tool that may not be supported, or aligned with your school’s needs, in the future. Don’t forget to find out exactly how they will handle any data you enter into the tool, and where and how it might be aggregated.

If you are considering the use of Artificial Intelligence (AI) in your school, you may wish to refer to this set of insights and recommendations published by a pathfinding group of 23 Multi-Academy Trusts in September 2024.  

👀 What’s Next?

This article is Part 2 of the EdTech Impact Buyers’ Guide 2025 – a five-part series designed to help schools make confident, evidence-informed decisions around EdTech.

In this instalment, we explored how to evaluate the evidence behind solutions – from understanding different types of claims, to weighing peer reviews, running purposeful trials, and using frameworks like PICRAT to guide reflection.

Up next, we’ll take a closer look at the small print – from hidden costs and training packages to contract lengths and cancellation terms – so you can avoid costly surprises and make smarter procurement decisions.

You can catch up or follow along with each instalment below:

➡️ Part 1: Defining Your Strategy – Available here
➡️ Part 2: Evaluating Evidence – You are here
➡️ Part 3: Navigating Contracts and Compliance – Coming soon
➡️ Part 4: Prioritising Privacy and Accessibility – Coming soon
➡️ Part 5: Engaging Suppliers Strategically – Coming soon


Updated on: 15 May 2025

