There is an uncomfortable pattern forming in workplaces, classrooms, and creative studios around the world. The people who use AI the most are becoming less capable of the kind of thinking that made them valuable in the first place.

This is not a Luddite take. I use AI every day. I build with it. I consult on it. But the data emerging from multiple research institutions in 2025 and 2026 is pointing in a direction that anyone who works with their brain should pay attention to.

The Research Is Converging

A 2025 study from SBS Swiss Business School by researcher Michael Gerlich surveyed 666 participants and found a significant negative correlation between frequent AI tool use and critical thinking performance. The proposed mechanism is cognitive offloading: the habitual delegation of reasoning tasks to external systems. Younger users (aged 17–25) were the most affected. Higher education provided some buffer, but did not eliminate the effect.

An MIT Media Lab study divided participants into three groups: one writing essays unaided, one using a search engine, and one using ChatGPT. The EEG results were striking. Unaided writers showed strong, distributed brain connectivity, indicating deep engagement. ChatGPT users showed the weakest neural connectivity across all measures. More concerning, when ChatGPT users later switched to writing unaided, their brains still showed persistent under-engagement. The effect carried over.

A 2025 Microsoft and Carnegie Mellon study found the same pattern from a different angle: higher confidence in AI’s ability to perform a task directly correlated with less critical thinking effort from the user. Workers got more efficient but less cognitively engaged.

Why This Matters More Than You Think

The analogy that keeps coming up in the literature is muscle atrophy. If an exoskeleton lifts every heavy object for you, your muscles eventually waste away. When AI systems routinely handle your writing, synthesis, and problem-solving, the neural circuits responsible for those tasks get less stimulation.

The MIT team describes this as “cognitive debt”: borrowed convenience that accrues long-term cognitive interest. Each delegated reasoning task adds to the principal, and just like financial debt, you don’t feel the weight until you try to do something without the borrowed capacity.

Think about your own week. When was the last time you struggled with a problem for more than ten minutes before reaching for an AI tool? When did you last draft something from scratch, revise it three times, and feel the friction of making it better? That friction is where the skill lives.

The Real Split Forming in the Workforce

Harvard faculty have framed this well: if you think you are in school (or at work) to produce outputs, you might be fine with AI producing those outputs. But if you are there to learn and develop capability, the output was always just the vehicle through which that development happened. Confusing the two leads to what one researcher called “metacognitive laziness” — skipping active engagement entirely and offloading the deep layer of thought itself.

This creates a real split. On one side: people who use AI to amplify thinking they have already done. They use it to accelerate, not to replace. They maintain the struggle. On the other side: people who let AI do the thinking entirely. They become skilled at prompting machines but less capable of independent analysis.

The first group will be increasingly rare and increasingly valuable. The second group will grow larger and become more replaceable, because anyone can type a prompt.

What To Do About It

This does not mean abandoning AI. That would be foolish. It means being deliberate about how you use it. Some practical principles:

First, do the hard thinking before you open the AI tool. Draft your argument, form your hypothesis, sketch your solution. Then use AI to pressure-test, expand, or refine. The sequence matters because the cognitive work happens in the first pass, not the second.

Second, build in regular “AI-free” blocks. Write one piece per week without any AI assistance. Solve one problem end-to-end manually. Think of it as cognitive resistance training.

Third, use AI as a sparring partner rather than a ghostwriter. Ask it to challenge your reasoning rather than generate your reasoning. The difference in cognitive engagement is enormous.

Fourth, be honest about the trade-off. Speed and convenience are real benefits. But if you are building a career on your ability to think clearly, make judgments, and solve novel problems, then every time you skip that process, you are borrowing against your future capability.

The Uncomfortable Bottom Line

The people who will thrive in the next decade are not the ones who use AI the most or the least. They are the ones who use it without losing the capacity to function without it. That requires discipline in an environment designed to make discipline unnecessary.

The tool that makes you faster can also make you weaker. Whether it does depends entirely on whether you treat it as an amplifier or a replacement.