Is AI making students learn less? MIT study reveals the neural impact of artificial intelligence use on students

Over the past few years, with the widespread adoption of artificial intelligence (AI) in everyday life, many studies have sought to measure the technology's impact on academia and education. Now, for the first time, a study has measured the effects of these tools directly on brain activity when they are used in educational contexts.
The research was carried out by Nataliya Kosmyna of the MIT Media Lab (one of the most prominent research laboratories at the Massachusetts Institute of Technology, MIT), and sought to determine the cognitive cost of using large language models, or LLMs (artificial intelligence systems such as ChatGPT), in an educational context, particularly in one of the most basic academic activities: writing an essay.
The results were more than revealing, although the researcher insists that this line of research needs to continue: it became clear that using artificial intelligence to write essays involves less brain activity.
To reach this conclusion, the study measured the cognitive load of 54 students from major Boston-area universities (MIT, Harvard, Wellesley College, Tufts, and Northeastern) as they composed an essay, with or without technological assistance.
They were divided into three groups: the first would complete their essay with the help of an LLM (specifically ChatGPT), the second would rely solely on search engines without artificial intelligence (Google searches and similar platforms), and the third, called the Brain Only group, had no technological support at all.

In total, there were three sessions in which participants performed the same task, plus a fourth in which the roles were reversed: the LLM group stopped using technology and the people in the Brain Only group were supported by artificial intelligence.
“We used electroencephalography (EEG) to record participants’ brain activity to assess their cognitive engagement and cognitive load, as well as to better understand neural activations during the essay-writing task,” Kosmyna explained.
The results were clear: brain connectivity systematically decreased as external technological support increased. Those receiving LLM support showed the weakest overall neural coupling, evident in lower activation and connectivity in brain networks associated with working memory, semantic integration, and executive control.
This contrasts with the Brain Only group, which stood out for stronger and more wide-ranging neural networks, especially in the so-called alpha and theta bands, which are most closely related to skills such as creativity, memory, and sustained concentration.
Additionally, during the final session, in which roles were swapped, those who had initially been assisted by ChatGPT and then had to complete their essay without technological help showed greater difficulty with the task, as well as weaker neural connectivity and less activation in the alpha, beta, and theta bands.
In contrast, those who had been working with only their brains showed, when exposed to AI, "greater memory capacity and reactivation of the occipito-parietal and prefrontal nodes, which likely favors visual processing."
The data bear this out: 83 percent of participants in the AI group showed a much more limited ability to quote phrases or arguments from their own essays, compared with 11 percent of those in the Brain Only group, revealing memory difficulties. Furthermore, none of the participants who used ChatGPT could correctly quote from their essay in the first session, while in the other groups the figure was close to 100 percent.
“While these tools offer unprecedented opportunities to enhance learning and access to information, their potential impact on cognitive development, critical thinking, and intellectual independence requires careful consideration and continued research,” the study states.

The study not only measured the participants' brain activity but also evaluated the final result. In interviews, the LLM group reported a low sense of ownership of their essays; in other words, these students did not feel that the essay they had just written with the help of AI was their own.
In contrast, those in the Brain Only group were not only able to identify their texts as their own, but could also highlight the arguments expressed in them as part of their personal thinking.
After completing the essays, the students were interviewed about their performance. ChatGPT users cited the tool's spelling and grammar assistance as an advantage, but also expressed ethical concerns and even a sense of guilt about using these tools. Many admitted that they did not use the technology solely as a support, copying and pasting entire paragraphs and ideas without stopping to review them.
MIT Media Lab researchers also had experienced teachers evaluate the essays. The report states, "These essays, while impeccable in grammar and structure, lack personal nuance and appear machine-written."
Likewise, teachers maintained that the AI-assisted writing was less imaginative and creative and had a homogeneous structure, not only grammatically but also in vocabulary and argumentation, which made the essays repetitive.

Beyond understanding the intellectual and ethical processes behind the use of artificial intelligence in academia, the MIT Media Lab warns that the consequences of these results could be devastating.
Researchers are now talking about a "cognitive debt," as they believe that delegating mental processes previously handled by humans to technology can contribute to a significant decline in critical thinking, generate a passive attitude in people, and limit key capabilities such as autonomous learning, which is crucial not only in academia but also in everyday life.
“What happens with the use of AI is similar to what previous research has called the 'Google effect,' which is that people are able to remember the source of information but not the content. With tools like ChatGPT, this becomes even more difficult, as these technologies deliver results immediately and in simple language, which can lead people to stop questioning sources, discourage continued exploration, and threaten the personal construction of knowledge,” the research concludes.
eltiempo