Every new tool changes the way we think. The calculator, for instance, freed us from tedious arithmetic; the GPS relieved us of remembering directions. Now AI promises to shoulder much of our writing and reasoning. But at what cost?
A recent study suggests that relying on ChatGPT for simple tasks like essay writing may impact the brain’s neural connectivity and take a toll on learning skills. It’s an early signpost for a larger question: What happens to human skills when we let machines do the work for us?
ChatGPT reliance may result in weaker brain engagement
In June 2025, MIT researchers released a study of 54 participants that examined the cognitive costs of using large language models (LLMs), specifically ChatGPT, for essay writing.
Students who tackled the same essay prompt were divided into three groups: LLM users, search engine users and a brain-only group. After the initial sessions, the LLM group and the brain-only group swapped conditions: The LLM group wrote without any tools, while the brain-only group used ChatGPT.
Across all sessions, the brain-only group demonstrated the strongest neural connectivity, memory recall and sense of ownership; the LLM group consistently lagged on those measures, and the search engine group fell in between.
This study is among the first of its kind and, by the authors’ own admission, comes with limitations. Even so, the results align with a broader pattern that’s already been observed in cognitive science.
The cognitive risks of AI reliance
Luke Barr, neurologist and chief medical officer at SensIQ, likens the brain to a muscle that weakens over time if its functions aren’t actively used. “Relying too heavily on tools like AI to think, write or solve problems for us can dull key executive functions like working memory, attention control, language processing and critical reasoning,” he says.
He also notes that these effects have been observed in spatial navigation, where overreliance on GPS can gradually dull our innate sense of direction.
“Similarly, AI reliance may bypass our prefrontal cortex—the seat of high-level reasoning and planning,” he adds. “If we’re outsourcing too much cognitive effort, we reduce the opportunity for synaptic reinforcement, which is essential for memory consolidation and learning.”
Barr’s explanation echoes a broader consensus in the field. One 2022 study showed that synaptic plasticity—the strengthening and remodeling of neural connections—plays a central role in learning. When we learn something new, the memory is first encoded in the hippocampus. But for it to stick, this information must be reinforced in the cortex, the brain’s long-term storage.
Repeatedly strengthening these synapses—through recall and problem-solving while awake and replay during sleep—helps you transform memory into durable knowledge.
AI reliance breeds self-doubt
Overreliance on AI may also contribute to a decline in confidence.
“One thing I’ve noticed more and more is that students are starting to second-guess their own abilities, especially as writers,” says Cindy Chanin, founder and director of Rainbow EDU Consulting and Tutoring. She’s had students acknowledge that they put their essays through ChatGPT and believe that the rewritten version, stripped of their original voice, “sounded better.”
“That’s what concerns me most—not just the use of AI, but the erosion of confidence it can cause in students who are still discovering their voices,” she adds.
Research points in the same direction. A study from Microsoft and Carnegie Mellon University, which involved 319 professionals who used GenAI tools weekly, found that greater confidence in these tools was associated not only with reduced critical thinking but also with greater dependency and diminished independent problem-solving.
Should AI education start early?
Like calculators and search engines before it, AI is becoming irresistible to this generation. And if history is any guide, it’s only a matter of time before it becomes indispensable in classrooms. But not everyone is eager to ride that wave.
“AI hasn’t sparked a renaissance of learning,” observes Jessica Bartnick, CEO of Foundation for C.H.O.I.C.E, a mentoring and college access organization. Instead, “it’s ushered in a culture of shortcuts, laziness and lowered expectations.”
According to Bartnick, AI is dangerous because it provides easy access to instant answers, which tempts learners to cheat the process of discovery. “Real education requires wrestling with words and ideas, forming coherent arguments, organizing thoughts, and engaging in trial and error,” she says.
But Nari Jeter, a therapist and mother of two, argues that each generation must adapt to the tech of its time, as she did with the Internet in her school years. AI, she says, is this generation’s version of that struggle.
“I do think AI may help students tackle more complex problems, especially in the science and math fields,” she says. However, she’s concerned that their writing skills could suffer, especially since digital habits that favor quick reactions—like emojis and shorthand—are already widely used in everyday communication.
As for schoolwork, Jeter believes that children need to learn perseverance, with parents there to guide them through frustration.
AI is everywhere now, so the real safeguard isn’t resistance—it’s balance. In line with this idea, Neil Sahota, a United Nations AI adviser and UC Irvine lecturer, advocates for teaching students how to use AI because trying to ban it would only forfeit opportunities to guide positive behaviors. As a solution, he suggests that classrooms could use AI as a sounding board. For instance, history teachers could have their students fact-check AI-generated essays, or science teachers could assign students to test AI’s ideas in the lab.
“If we treat AI as just another research shortcut, we’ve lost,” he says. “That’s why we need to teach students how to question AI, how to validate its output and how to use it responsibly.”
Better ways to use AI
So how do we use AI to our benefit? Here are five strategies from experts:
1. Rebuild your metacognition
An excellent way to keep your mind sharp while using AI regularly, Barr suggests, is to rebuild your metacognition. That means solidifying knowledge through active recall rather than simply copying what the AI produces.
“After using AI,” he says, “reflect on what you learned. Can you teach it back to yourself or someone else without notes? If not, you likely haven’t processed it deeply enough.”
2. Use ChatGPT to refine your knowledge
Framing ChatGPT as a source for feedback instead of answers is another way to harness AI without losing the skills you’re outsourcing, says clinical social worker Brie Scolaro of Aspire Psychotherapy. That means doing the work yourself before letting AI refine it.
“Even if your attempt is messy, the act of struggling through forces your brain to engage working memory, strengthen neural pathways and build retention,” Scolaro says. This might include drafting an email or outlining on your own before having AI smooth out your writing.
3. Practice delayed gratification
Scolaro is also concerned about declining confidence when users constantly outsource their judgment to AI. This is why you should “use AI sparingly to check facts but practice tolerating uncertainty so you don’t lose trust in your own reasoning,” they advise.
When you feel the urge to outsource an answer, Scolaro recommends delaying the impulse by five or 10 minutes. This practice trains your mind to reason through the problem and thus improves your confidence.
4. Set intentional boundaries
Equally important is setting intentional boundaries, Scolaro says. That means discerning when AI tools help you and when they undermine the skills you must protect.
For instance, you could work through math problems on your own and only use AI to spot mistakes and deepen your understanding. This process is about letting AI help you while staying aware and in control of what the machine is doing.
5. Strengthen your baseline attention
Scolaro also believes that if you use AI tools frequently, it’s important to balance that habit with practices that strengthen your baseline attention, meaning your innate ability to concentrate. Stronger baseline attention means you can focus well and use AI strategically rather than habitually; when your attention is weak, you tend to rush to AI to compensate.
Let’s say you want to understand a document but can’t focus for long, so instead of reading it yourself, you ask AI to summarize it and accept whatever the summary says without checking it against the source. That substitutes AI for your own effort, which only weakens your attentional skills further.
To instead reinforce your baseline attention, Scolaro recommends evidence-based practices like mindfulness, HRV biofeedback and even simple single-tasking.
The cost of surrendering to AI
We’ve all heard the common refrain by now: “AI is taking our jobs.” That’s not entirely wrong—but since AI isn’t going anywhere, we now have to learn to live with it. Otherwise, the next claim to float into the anti-AI discourse might be, “AI is taking our skills.”
Like any convenience, AI is a double-edged sword, so the onus is on you to use it with care. But remember: Your mind is your greatest asset. It was built to create and to solve problems. So if you’re surrendering your thinking to AI, you’re wasting the very gift that makes you human.