As AI tools become a natural part of our daily workflow, it’s getting easier to offload simple tasks, summarise reports, generate content and solve general problems, all with a single prompt.

But as we make things easier for ourselves, are we making ourselves weaker?

Not physically, but cognitively.

Are we outsourcing the mental effort needed to strengthen or even maintain our intelligence?

It’s something I’ve been thinking about a lot. Don’t get me wrong, I’ve had a lot of fun using and exploring these new AI tools, and they’ve seriously boosted my productivity. I now use Cursor daily for writing code, and I’ve leaned into tools like ChatGPT, NotebookLM and Perplexity in my day-to-day. It’s an exciting time (at least for me) and these tools are genuinely powerful.

But they’re also frictionless…and that’s where the concern starts.

Because when something becomes too easy, we risk skipping the hard, necessary work that turns information into understanding.


We’ve Seen This Before: Move Fast, Fix Later

Every time a new technology comes around, we rush to unleash it on the masses and integrate it into everything. Every SaaS tool now has AI integrated. LG built an AI-powered TV. Samsung released an AI-enabled fridge.

Who really needs an AI-powered fridge?

But that’s beside the point.

Let’s be clear, I’m not saying that moving fast with technology is a bad thing, speed can unlock innovation. But there’s a familiar pattern: we roll out the tech, celebrate the convenience, and only later ask, “What is this doing to us?”

Social media gave us unlimited connection, and a crisis of attention. Smartphones gave us portable computing and an always-on culture.

Now AI might be delivering the next unintended consequence: a slow erosion of our ability to think¹.


The Muscle We’re Not Using

Problem-solving, critical thinking and focus are mental muscles. The more we use them, the sharper we get. Think of resistance training in the gym: working against resistance drives hypertrophy (muscle growth). But like any muscle, if we stop using them, they atrophy (weaken and shrink).

When we rely on AI to do all the heavy lifting, is this what we’re risking? I’ve noticed it in myself: a small resistance to thinking something through when I know I can just ask ChatGPT. The temptation to skip the mental workout. To skip the struggle.

Struggle is underrated and I fully agree with the statement “struggle is the best teacher”. It’s where understanding lives. When you figure something out the hard way, you don’t just get the answer, you get the learning. You earn clarity. Think about the last time you struggled to learn or do something and had to use your head to figure it out. What was your experience?

Now imagine a generation growing up rarely having to wrestle with hard ideas on their own.

That’s not a tech issue. That’s a cognitive shift.


From Digital Tools to Digital Dependence

The Financial Times recently asked whether we’ve reached peak brain power, citing OECD data showing a decline in reasoning and problem-solving since 2012. Not just in teens, but in adults too.

It aligns with a broader shift in how we interact with information.

We used to browse. Now we scroll. We used to seek. Now we’re fed. We used to question. Now we consume. The shift is subtle but powerful. What once required effort is now automated and handed to us.

This isn’t just about LLMs. It’s about the wider digital environment that encourages passive interaction. AI just happens to be the most advanced form of that so far.

We’re not incapable. We’re disengaged. And the more we outsource thinking without reflection, the less practice we get in the most human skill of all, thinking.


The Importance of Friction

Friction isn’t a bad thing, it’s usually the catalyst for growth. When you struggle to write, you become a better communicator. When you sit with a difficult question, you sharpen your mental edge. Friction is where skill is forged.

AI removes friction. That’s its superpower and its risk. When everything becomes easy, the hard stuff like resilience, discernment, and focus starts to fade.


So When Do We Feel the Impact?

The short answer: we already are.

The longer answer: it’ll likely show up gradually in our shrinking attention spans, discomfort with uncertainty, and increasing dependence on summaries and “quick takes.”

You’ve probably felt it yourself. The urge to skim, the itch to check your phone, the resistance to sitting with a challenging idea.

I’m not a neuroscientist. This isn’t a peer-reviewed claim. But I’m not the only one sensing the shift.

The tech is still new. But like all powerful tools, the real effects may show up long after the initial hype fades.


What Do We Do?

This isn’t a call to abandon AI. It’s a call to use it with intention.

If you ignore AI, you risk being left behind. But if you use it without reflection, you risk letting it shape how you think without even realising. That trade-off isn’t hypothetical, it’s happening in real time, and for most people, it’s invisible.

The goal isn’t to be anti-AI. It’s to be pro-agency.

Just like we’ve learned to care for our bodies, brushing our teeth, eating (mostly) well and going to the gym, we now need daily habits to protect and sharpen our minds. Especially now, in a world where the default is outsourcing thought.

This might mean:

  • Reading long-form content without skipping to the TL;DR.
  • Making notes to process your own thoughts before outsourcing your thinking to an AI model.
  • Wrestling with a hard concept before prompting for an explanation.

None of this means rejecting help, it just means choosing when to struggle first and when to reach for support.

And let’s be real: not everyone will feel the impact equally. People who aren’t familiar with how these systems work, or who haven’t built strong habits around focus and critical thinking, may be more vulnerable to their downsides. That’s not a personal failing. It’s a systemic challenge.

So here’s a question: is it crazy to suggest that the people building these tools (engineers like myself, designers, product managers) should consider the ethical and cognitive impacts of what they create?

We expect that kind of responsibility from doctors. From educators. Even from architects².

If software is shaping how billions of people think, feel, and behave…shouldn’t builders take some responsibility too?


Cognition Culture

There’s no point raising concern without offering a way forward.

One powerful practice is what Cal Newport calls “deep work” — carving out space for focused, difficult, and meaningful tasks. Research consistently shows this kind of effort boosts concentration, learning, and problem-solving. An article from Asana explores this in more depth.

Other proven habits include:

  • Deliberate practice: pushing your limits to stretch your capability.
  • Physical exercise: especially aerobic movement, shown to boost executive function.
  • Mindful journaling: processing your own thoughts deepens clarity and reflection.
  • Reading physical books: especially challenging material, which forces presence and builds endurance.

These aren’t just wellness tips. They’re mental hygiene. Habits that protect and develop your mind, just like brushing your teeth protects your smile.

We built a fitness culture around the body. Now we need a cognition culture for the mind.

Because AI isn’t going anywhere. But how we meet it, passively or actively, will shape the kind of thinkers, creators, and humans we become.


  1. “Thinking” isn’t a single skill; it’s a spectrum (critical, analytical, creative, etc.). ↩︎

  2. Of course, not all doctors, educators, or architects are responsible for cognitive or ethical outcomes in every context. The point is that in many professions, especially those that impact people at scale, we expect a level of responsibility beyond technical competence. ↩︎