EducAItion, educAItion, educAItion – COULD GENERATIVE A.I. POSE A RISK TO EDUCATIONAL STANDARDS? – September 2025

The government has recently issued guidance to schools on the use of AI, but it pays too little attention to the effect of GenAI tools on learning (as opposed to performance). This paper explores emerging research evidence on the negative impacts of GenAI tools, urging the government to exercise greater caution in its advice to schools and colleges on their adoption of AI tools – to protect key learning processes.

KEY POINTS

  • The government has recently issued guidance to schools on the use of AI, which examines in depth issues such as the implications of Generative Artificial Intelligence (GenAI) tools for data protection, keeping children safe and intellectual property law.
  • That guidance recognises several potential benefits of AI in education, such as reducing some of the administrative burdens faced by teachers. However, the effect of GenAI tools on learning (as opposed to performance) has received much less attention from policymakers.
  • Research has long shown that meaningfully processing new pieces of information – in other words, thinking hard about something – is vital for retaining that information in your long-term memory. However, an over-reliance on GenAI tools could lead to ‘cognitive offloading’ (i.e. using these tools to reduce the cognitive / mental demands of a task), which would bypass these fundamental memory processes that underpin how we learn. As a result, if GenAI tools are used uncritically, there is a risk that they could disrupt or even circumvent the acquisition of new knowledge and the enhancement of existing knowledge – leading to poorer outcomes for learners of all ages.
  • Research in this field is still developing, but initial studies have found some evidence of this effect occurring. While AI can make tasks faster and feel easier for students, some studies have found the use of this technology can lead to:
    • Weaker neural (brain) activity among users of GenAI tools, who put less effort into their learning;
    • Poorer critical thinking, problem solving and even collaboration skills, with more frequent usage of AI tools leading to more significant deficits;
    • Students initially showing better performance when using ChatGPT, only for those same students to end up performing worse than their peers when the chatbot was taken away – suggesting that students use ChatGPT as a ‘crutch’ instead of thinking hard about new material;
    • Students becoming dependent on this technology, potentially leading to ‘metacognitive laziness’, which can subsequently hinder their ability to regulate their own learning;
    • Students producing lower-quality reasoning and argumentation than their peers who use traditional search engines.
  • Moreover, the impact of AI tools appears to differ between groups of students. While research has found that high-attaining students are better able to use AI strategically to support their work, younger students and those with lower prior attainment are more likely to use it in ways that are detrimental to their learning (for example, copying and pasting work without reflection).
  • Research has also shown similar effects of GenAI tools among workers, not just students. Studies have found that while GenAI tools can help staff work more quickly, they can hinder critical thinking and problem-solving abilities, as well as making it more likely that colleagues will judge users as lazier, less competent and less diligent.
  • The government should urgently review its guidance for schools and colleges on the use of AI in light of these research findings, to ensure that GenAI tools do not undermine the improvements in school standards recorded in recent years.