Outsourcing Our Minds: AI, Learning, and the Cognitive Cost

Leslie Poston:

Welcome back to PsyberSpace. I'm your host, Leslie Poston. This week, we're examining how artificial intelligence is reshaping our brains, and not necessarily for the better. We're diving into the growing body of research showing that while AI might make life easier, it's also changing how we think, how we remember, how we learn. Specifically, we're going to focus on the developmental impacts, especially on children, and the cognitive trade offs of relying on tools that do our thinking for us.

Leslie Poston:

We'll explore the science on why handwriting notes, struggling to learn new things, and building memory the hard way are still essential. Spoiler alert: the harder path is usually the better one for your brain. Let's get into it. In the past few years, large language model generative AI tools like ChatGPT, Khanmigo, and Grammarly have found their way into classrooms at lightning speed. I'll be referring to large language model generative AI tools simply as AI for the rest of the show.

Leslie Poston:

On paper, the potential for AI in education seems incredible. Personalized learning, twenty four seven homework help, adaptive tutoring systems. But when we look under the hood, the picture isn't all rosy. We're starting to see some troubling patterns, especially among younger students. One of the key issues is what's known as cognitive offloading.

Leslie Poston:

That means we outsource thinking, memory, or decision making to an external tool, in this case, AI. It's similar to using GPS so much that you forget how to navigate your own neighborhood. In classrooms, AI is often used to provide answers without requiring the student to struggle through a problem. This is happening even at the college level at this point, so it's not isolated to children. The result?

Leslie Poston:

Students learn less deeply, remember less, and retain fewer problem solving skills. Research has found a significant negative correlation between frequent AI tool use and critical thinking abilities. One study showed that participants who heavily relied on AI tools performed worse on critical thinking assessments compared to those who used the tools less frequently. This effect was particularly pronounced among younger individuals, while those with higher education levels tended to retain stronger critical thinking skills regardless of AI tool usage. What's happening here is a form of cognitive dependency.

Leslie Poston:

Our brains are remarkably adaptive organs. They're designed to conserve energy when possible. When we repeatedly outsource mental tasks to AI, our brains essentially say, great, I don't need to maintain those neural pathways anymore. It's similar to how a muscle atrophies when it's not used regularly. The cognitive muscles responsible for critical analysis, deep reading, and complex problem solving begin to weaken when AI does that heavy lifting for us.

Leslie Poston:

The more these tools become fixtures in the learning process, the more they risk becoming cognitive crutches. Teachers report students using AI to bypass assignments entirely. A student may use ChatGPT to write an essay or solve a math problem without understanding the content. The cost isn't just academic integrity. It's a decline in the underlying thinking muscles that we rely on for life and for society.

Leslie Poston:

There's also a socio emotional element. Students who rely on AI tools early and often may never develop the confidence that comes from figuring something out on their own. That internal sense of mastery matters. And if we give it away too early, it's difficult to build back later. Psychologists have long recognized the importance of self efficacy, that belief in your own ability to accomplish tasks and overcome challenges, in cognitive development.

Leslie Poston:

When students consistently turn to AI at the first sign of difficulty, they miss essential opportunities to build that self efficacy. Remember notebooks, pens and pencils? There is a mountain of research showing that handwriting is far better for learning than typing or tapping. When we write by hand, we engage motor functions, visual pathways, and memory centers in the brain simultaneously. This multisensory process helps us learn and retain more information.

Leslie Poston:

Multiple neuroimaging studies have documented dramatic differences in brain activation between handwriting and typing. One study found that when students wrote by hand, researchers observed increased connectivity across visual regions, sensory processing areas, and the motor cortex, brain patterns vital for memory formation. Typing, on the other hand, led to minimal activation in the same areas. This isn't just about a nostalgia for pen and paper. The science is clear.

Leslie Poston:

Handwriting activates specialized brain circuits that integrate physical movement, visual perception, and spatial awareness. Specifically, it triggers regions in the brain's parietal and occipital lobes that are less active during typing. These regions are critical for processing information at a deeper level and encoding it into long term memory. It's what scientists call embodied cognition, this idea that physical experience shapes cognitive processing. Typing tends to be more passive.

Leslie Poston:

When you type, especially quickly, you often transcribe rather than process, and that distinction is so important. Multiple studies have found that students who type notes tend to do worse on conceptual questions compared to those who handwrite them. Why? Because handwriting forces you to rephrase, to summarize, and to synthesize in the moment. You can't possibly handwrite as fast as someone speaks, so your brain has to make decisions about what's important.

Leslie Poston:

That extra cognitive effort, deciding what to write, creates stronger memory traces. Consider another study where researchers compared note taking methods among college students. Students who took handwritten notes showed better conceptual understanding and performed better on assessments taken a week later compared to laptop notetakers. The laptop users took more notes verbatim while the handwriters were processing and rephrasing information in real time, which is a more active form of learning. Now we're getting back to that sense of embodied cognition that we talked about a few minutes ago, that physical act of writing, because it encodes the information in a different and more durable way.

Leslie Poston:

Like we said before, it's especially important for younger learners. Their brains are still forming. They're forming those neural pathways tied to fine motor skills and language processing. Take away handwriting, and you take away a developmental tool. And yet knowing all of this, many schools are moving away from handwriting instruction altogether, especially cursive writing.

Leslie Poston:

That's a mistake, plain and simple. We know handwriting supports learning, and we know that it supports memory. It might not be as fast or shiny as AI, but it's much more effective in the long run. Developmental psychologists and neuroscientists are raising red flags about the premature use of AI in classrooms. Recent research out of Harvard and other institutions is showing us that early use of AI tools can short circuit the development of core cognitive skills: attention span, working memory, and executive function.

Leslie Poston:

These are the building blocks of learning. Here's where things can get even more concerning. Kids' brains are especially sensitive to how and what they learn. Introducing AI into early education may shape not just what they know, but how they know. And once those habits form, they're hard to unlearn.

Leslie Poston:

The more a child relies on AI to answer questions or complete tasks, the less opportunity they have to practice curiosity, exploration, and frustration tolerance. These all sound like soft skills, but they're not. They're essential parts of a growing brain that can think critically and independently. Studies suggest that even young children are quick to offload cognitive processes to external tools if given the opportunity. This is worrying because it's happening during critical developmental periods when our neural pathways are being established.

Leslie Poston:

The pattern of immediately turning to AI for answers rather than engaging in hard mental effort may establish lifelong habits that diminish their cognitive abilities. And it's not just cognition that's at stake. There are also concerns about how AI interactions shape social development. Children engaging with AI assistants may develop different communication habits, fewer interpersonal skills, and a reduced ability to read emotional cues. In short, early exposure to AI isn't neutral.

Leslie Poston:

It shapes the brain. And unless we put some guardrails in place, it might shape our kids' brains in ways we come to regret. Let's talk about what happens to thinking when AI fills in the blanks for us. Across all age groups, we're seeing a quiet decline in critical thinking skills. Students and adults for that matter who rely on AI too heavily are less likely to analyze, question, or even to doubt the information that they passively receive.

Leslie Poston:

It's not just that AI is always wrong, though it can be. It's that it flattens complexity. When an answer is generated instantly and without effort, it robs us of the chance to form our own reasoning pathways. Those are the skills that help us evaluate, argue, interpret, and innovate. Researchers have found that frequent AI users increasingly display what's called epistemic dependence.

Leslie Poston:

They begin to doubt their own reasoning when confronted with AI that sounds more authoritative than they do. Instead of asking, is this true? they ask, why bother thinking about it? The AI already knows. Teachers across the country are reporting that students are handing in work that looks polished but lacks depth. It's harder to spot because AI grammar is good and the formatting is clean, but underneath, it's hollow.

Leslie Poston:

There's no depth. And many of the things that are being handed in are the same. This should scare us. A generation of students who can pass assignments but can't form arguments is a problem for democracy as well as education. Even in higher education, professors are struggling with how to balance the benefits of AI tools with the need for genuine intellectual engagement.

Leslie Poston:

It's not an easy line to draw, but we have to draw it. Otherwise, we're just training people to outsource their thinking permanently. So what do we do about all this? One answer is clear: we need stronger guardrails in place, especially before college. K-twelve education, in my opinion, should be focused on helping children develop strong, resilient brains.

Leslie Poston:

That means limiting reliance on tools that give them answers too easily. Some schools are already experimenting with this. They allow limited AI use in high school, but not before. Others are creating AI free zones for certain types of assignments. Policymakers and educators must work together to develop ethical frameworks for AI and education.

Leslie Poston:

These need to be rooted in cognitive science, not tech industry hype. And they need to consider developmental appropriateness. Just because a tool exists doesn't mean it should be used at every age. Educators can build what's called productive struggle into their teaching approach. This concept emphasizes the value of letting students grapple with challenging tasks that are slightly beyond their current abilities.

Leslie Poston:

Neuroscience research shows that this kind of effortful engagement actually produces myelin, a substance that strengthens neural connections in the brain. Struggle, in the right amount, literally builds better brain circuitry. Parents also have a role to play. They can advocate for policies that protect cognitive development. They can choose learning tools carefully and encourage kids to do things the hard way sometimes.

Leslie Poston:

It's good for them, even if it's frustrating in the moment. Guardrails aren't anti tech. They're pro brain, and we need them now. There's something psychologically powerful about struggling with a problem and then figuring it out. That moment of breakthrough wires your brain.

Leslie Poston:

It creates resilience, physically creates resilience. It builds what psychologists call productive failure. AI eliminates that struggle. It gives us answers before we even finish forming the question, and that might feel good in the moment, but it short circuits learning and brain development. And struggle isn't failure.

Leslie Poston:

Struggle is the process by which we grow. In fact, studies show that people remember things better when they've had to work harder to understand them. That friction matters. It's how long term memory is formed. Let's talk about myelin again.

Leslie Poston:

As we mentioned before, educational neuroscience research reveals that when we work through difficult problems, our brains produce myelin, a substance that coats neural pathways and increases the strength and efficiency of brain signals. The myelination process is critical for building faster and more efficient neural pathways, essentially creating better roads in the brain for information to travel. Easy answers from AI skip this necessary brain building process. The concept of desirable difficulties in learning has been well established in cognitive psychology. Coined by researchers Robert and Elizabeth Bjork, this theory suggests that introducing certain difficulties into the learning process, difficulties that are relevant to the material and appropriate to the learner's ability level, can significantly enhance long term retention and transfer of knowledge.

Leslie Poston:

These beneficial challenges force the brain to engage more deeply with the material, forming stronger neural connections in the process. Carol Dweck's research on growth mindset further supports this idea. Her work shows that individuals who embrace challenges and persist through difficulties develop greater cognitive resilience and ultimately achieve more. When students understand that struggle is not a sign of inability but rather a pathway to growth, they're more likely to engage with difficult material and develop deeper understanding. Consider a classic experiment where students were divided into two groups learning the same material.

Leslie Poston:

One group studied under conditions that made learning easy, while the other faced conditions that required more effort. In immediate testing, the easy group performed better. But when tested weeks later, the group that had to struggle showed far superior retention. This pattern has been replicated across various age groups and across various subject areas in several research studies, confirming that shortcuts can often lead to shorter lasting learning. When students rely on AI for these quick answers, they're missing out on the benefits of that friction.

Leslie Poston:

They become more passive. They learn less deeply. They might even lose confidence in their own problem solving abilities, thinking they can't do it without AI. This is a cultural shift and one we need to reverse. We must reframe struggle as strength, not a weakness, and we need to make space for it in our classrooms, our homes, and our workplaces.

Leslie Poston:

One of the lesser understood but increasingly dangerous risks of widespread AI use is the phenomenon known as hallucination, which is when an AI confidently produces information that is factually incorrect, biased, or entirely fabricated. These hallucinations aren't glitches in the traditional sense. They're a feature of how large language models work. They generate plausible sounding responses by predicting patterns in language, not by checking truth. That means the better the model is at sounding authoritative, the harder it can be to spot when it's wrong.

Leslie Poston:

Now combine that with the decline in human critical thinking and cognitive effort that we've been discussing. When students and even adults get used to trusting AI responses without verification, they lose the muscle memory required to fact check, cross reference, and question. This is especially troubling in a world already flooded with misinformation. If our tools are unreliable and our ability to challenge them is weakened, that is a recipe for mass confusion and manipulation. There have already been real world consequences.

Leslie Poston:

AI has been caught fabricating legal citations, misrepresenting research studies, and even inventing quotes from public figures. In academic, government, and professional settings, this has led to embarrassing and sometimes costly errors. But in K through 12 education, it's even more insidious. Young learners may not yet have the knowledge base or skepticism to question what an AI tells them, especially when it sounds confident and fluent. There's also a psychological effect.

Leslie Poston:

When humans repeatedly outsource thinking to a machine that sounds smarter or more authoritative than they do, they start to doubt their own reasoning. This is where learned helplessness and epistemic dependence take hold. Like we said before, instead of asking, is this true? they're asking, why bother thinking about it if the AI already knows? Rebuilding that internal compass, what researchers call epistemic vigilance, is critical.

Leslie Poston:

We need to teach people not just how to use AI, but how to doubt it, how to challenge it, and how to confirm its outputs through their own mental effort. Without that, we're risking creating a future where hallucinations aren't just an AI problem, they're a human one. Now, this doesn't mean we should ban AI from education entirely. That's neither realistic nor helpful. What we need is thoughtful integration.

Leslie Poston:

Used well, AI can be a tool for expanding access, enhancing feedback, and personalizing instruction. But we must use it intentionally. That means designing learning experiences that prioritize cognition, not convenience. For example, AI could be used to give students feedback on drafts, but not to write the drafts. It could suggest questions to explore, but not answer them outright.

Leslie Poston:

Educators can set clear boundaries. They can create assignments that require personal insight, reflection, or synthesis, things AI still struggles to fake. And they can teach students how to use AI as a thinking partner, not a shortcut. This research doesn't suggest that all AI use is harmful. In fact, moderate and strategic use of AI tools can enhance learning when they're used to scaffold more complex cognitive tasks.

Leslie Poston:

The key is ensuring that AI serves as an enhancement to human thinking rather than a replacement for it. This means explicitly teaching metacognitive skills alongside AI use, helping students understand when to use AI and when to rely on their own cognitive resources. When AI is used to provoke thought instead of replace it, it becomes a tool for learning. And that's the sweet spot. That's what we should aim for.

Leslie Poston:

If you're a parent, a teacher, or a policymaker, here are some steps you can take to protect cognitive development in the age of AI. First, limit AI tool use before high school. The younger the brain, the more it needs to engage with the world through real effort and real feedback. Create opportunities for productive struggle where children learn to persist through challenges without immediately turning to AI for solutions. For elementary and middle school students, consider designated tech free learning periods where students must rely on their internal resources to solve problems.

Leslie Poston:

This doesn't mean abandoning technology entirely, but rather being strategic about how and when it's used. Second, reintroduce handwriting wherever possible. Encourage students to write notes, journal, or brainstorm on paper. The benefits are well established. Multiple studies have confirmed that handwriting activates critical brain connections across the visual, motor, and memory regions in ways that typing simply can't replicate.

Leslie Poston:

So even for older students who prefer typing for longer assignments, encouraging handwritten outlines or initial drafts can engage these neural benefits during the conceptual phase of the work. And it can be done even in the age of remote learning and remote work by holding writing sessions on Zoom or in the classroom video software, then having the students scan in the work that they've done in real time. Third, design learning experiences that build in struggle. Let students wrestle with hard questions. Let them fail safely.

Leslie Poston:

Let them grow. Research clearly shows that this kind of productive struggle is not just about building character. It's physically strengthening neural pathways through the production of myelin in the brain. Create what education researchers call low floor, high ceiling tasks: activities that are accessible to all students, but that can be extended to challenge even the most advanced learners.

Leslie Poston:

This approach ensures everyone experiences an appropriate level of productive struggle. Fourth, talk openly about AI. Make it clear that it's a tool and not a crutch. Teach digital literacy and critical thinking alongside any tech instruction. Specifically, teach students about AI hallucinations and how to verify AI generated information as this builds essential fact checking habits that strengthen their own cognitive independence.

Leslie Poston:

This is especially important as we get more sophisticated video AI tools as well. Develop classroom protocols for appropriate AI use. For instance, students might use AI to help brainstorm ideas or check their work, but not to generate complete assignments. These guardrails help students develop discernment about when AI enhances learning versus when it undermines it. Fifth, model what deep thinking looks like.

Leslie Poston:

Whether you're a teacher or a parent, your approach to problem solving sets the tone. Show your kids that thinking hard is worth it. Demonstrate the value of slow, deliberate thought in an age of instant answers. Think out loud as you work through problems, showing students that confusion and false starts are normal parts of the thinking process. When you encounter information online, model how to evaluate its credibility rather than accepting it at face value.

Leslie Poston:

Finally, advocate for balanced technology policies in schools. Get involved in curriculum discussions at your school. Ask questions about how technology is being integrated and what safeguards are in place to ensure it's enhancing rather than replacing critical thinking. Share research on cognitive development with administrators and fellow parents. Many schools are still figuring out how to navigate AI tools in education, and informed parent voices can help shape thoughtful policies that protect cognitive development while embracing beneficial innovation.

Leslie Poston:

And remember, we're not arguing against technology. We're advocating for its thoughtful integration in ways that enhance rather than diminish human cognitive capacities. The goal is not to hold back progress, but to ensure that in our rush to embrace AI's convenience, we don't accidentally outsource the very mental processes that make us human. Thanks for listening to PsyberSpace. I'm your host, Leslie Poston, signing off.

Leslie Poston:

This episode explored the complicated relationship between AI and learning, especially how it's reshaping cognition for the next generation. The science is clear. The brain needs effort, friction, handwriting, and reflection to grow strong. If we hand over all the heavy lifting to AI, we might find that we've traded ease for erosion. A bit of housekeeping.

Leslie Poston:

We were nominated for a Women in Podcasting Award for the second year in a row. We're so excited about it. Thank you so much. The link to vote for the show will be in the show notes, and I would love your vote for Best Science Podcast. As always, until next time, stay curious, and don't forget to subscribe so you never miss a week.