This is the fourth instalment in our five-part series designed to help schools take a strategic, evidence-informed approach to selecting and implementing education technology.
After defining your EdTech strategy (Part 1), evaluating product evidence (Part 2), and navigating contracts and compliance (Part 3), it’s time to focus on something equally vital – privacy, accessibility, and ethical design.
In an ideal world, every EdTech solution would meet essential software standards and prioritise user wellbeing. Sadly, that’s not always the case.
EdTech tools that overlook privacy, accessibility, or ethical design not only put schools at risk – they also fail to serve the diverse needs of their users.
This part of the guide outlines the key benchmarks and questions every school should consider before adopting a new solution, helping you dig deeper than a basic Data Protection Impact Assessment (DPIA) or checkbox compliance form.
Even the most well-designed tools will raise questions or issues once they’re in regular use. A provider’s ability to respond to feedback – from technical problems to user concerns – is a strong indicator of their long-term reliability and commitment to continuous improvement.
Look for suppliers that make it easy to access help, resolve problems quickly, and involve users in shaping future development. Avoid tools that leave staff chasing generic email addresses or jumping between support systems.
Every learner deserves equitable access to technology – but accessibility isn’t just about compliance checklists. It’s about inclusive design that considers the diverse needs of your school community from the outset. Whether it’s supporting students with special educational needs and disabilities (SEND), learners with English as an additional language (EAL), or those using older devices or assistive tech, a good EdTech solution should reduce barriers, not introduce new ones.
Don’t assume that a product is accessible just because it says so. Ask how it performs across devices, how it adapts to different user needs, and which frameworks it actually aligns with. True accessibility means putting usability and inclusion at the heart of the product experience.
When EdTech is used by children, design choices matter – not just in terms of usability, but also in how they protect, empower, and respect young users. The Age-Appropriate Design Code (AADC) sets a clear benchmark for products that serve children online, emphasising privacy-by-design, transparency, and developmentally appropriate experiences.
Unfortunately, not all EdTech providers meet these expectations. Some use overly complex language in policies, or make design choices that prioritise engagement over safety. Make sure any solution used in your school aligns with the AADC or similar frameworks, and places the needs and rights of children at the heart of its design.
AI and algorithm-driven tools are becoming more common in EdTech, from personalised learning paths to content generation and student analytics. But with that power comes a need for transparency, fairness, and accountability.
Poorly designed algorithms can perpetuate bias, make opaque decisions, or produce content that’s misleading or inappropriate for learners.
If an EdTech provider uses AI, dig deeper. Ask how the systems are trained, tested, and explained. For solutions that use Generative AI (GenAI), it’s especially important to understand the source of training data, the safeguards in place, and whether outputs are age-appropriate and auditable. Remember: just because something is powered by AI doesn’t mean it’s smart, safe, or suitable for your setting.
Any EdTech solution your school adopts will likely collect, process, or store some level of personal data – which means the stakes for responsible data practices are high. From GDPR compliance to transparency around data usage, it’s essential to know exactly what happens to your students’ and staff’s information throughout its lifecycle.
A robust solution should clearly define what data is collected, how it is used, who controls it, and whether it is ever shared or repurposed for secondary purposes like marketing or product development. Don’t settle for vague assurances – request clarity, documentation, and accountability.
Strong cybersecurity isn’t just an IT concern – it’s fundamental to protecting students, staff, and the integrity of your school’s operations. Any EdTech solution you consider should demonstrate clear, proactive security measures that guard against breaches, misuse, and unauthorised access.
Beyond technical protections, it’s important to understand how the tool fits into your existing ecosystem. Does it require access to webcams, chats, or third-party tools? Can it integrate securely with your management information system (MIS) or student information system (SIS)? And is the provider keeping pace with evolving threats and regulatory expectations?
Trust isn’t just built on technical features – it’s rooted in the values, intentions, and cultural awareness of the people behind the product. EdTech tools that ignore ethical design or apply a one-size-fits-all approach risk alienating students and staff, or worse, causing harm through poorly thought-out features.
A truly ethical solution considers the cultural and contextual diversity of schools, involves educators in the development process, and actively reflects on the impact of its design decisions. Don’t hesitate to ask how a vendor incorporates ethics from the ground up – and whether your school community was ever part of that conversation.
Schools have a legal and moral responsibility to safeguard their students – and the tools they use should reflect that same commitment. Any EdTech provider working in the education space must clearly communicate how their product supports student wellbeing, protects vulnerable users, and aligns with age-appropriate use.
Look for visible indicators of accountability: certifications, transparent age guidelines, screen time recommendations, and evidence of independent review. A responsible provider should make it easy for schools to understand how their product supports – not undermines – their safeguarding and wellbeing responsibilities.
Understanding the people behind an EdTech tool can reveal a lot about its purpose, priorities, and long-term vision. A solution built by experienced educators, researchers, or mission-driven teams is more likely to reflect classroom realities and student needs than one created solely for commercial gain.
Dig into the background of the founders and developers. Do they have a track record in education? Are their values aligned with educational outcomes? Choosing tools made by people who care about the same things you do can increase the chances of a positive, lasting impact in your school.
This article is Part 4 of the EdTech Impact Buyers’ Guide 2025 – a five-part series designed to help schools make confident, evidence-informed decisions around EdTech.
In this instalment, we explored how to prioritise privacy, accessibility, and ethical design – from understanding GDPR compliance and cybersecurity risks to asking deeper questions about AI, inclusion, safeguarding, and the values of the people behind a product.
These aren’t just technical details – they’re the foundations of trust, safety, and long-term impact in your school community.
In the final part of the series, we’ll look at how to engage suppliers more strategically – including the critical questions to ask at events like Bett, how to cut through sales jargon, and what to do after the conversation ends. Whether you’re attending a major exhibition or contacting a supplier directly, Part 5 will help you approach those interactions with clarity, confidence, and a clear plan.
You can catch up or follow along with each instalment below:
Updated on: 16 July 2025