AI Is Stealing People's Brains
I came across a suggested prompt from Copilot today, and it immediately made me think: "Are people thinking less now thanks to AI?"

These types of issues don't show up immediately. Like ADHD, they creep in until you can't pry people away from the very products causing them. Before I explain, let me show you the simple prompt that got me thinking about this:
"At $50 per unit, how many units does my company need to sell to break even if it has $250,000 in fixed costs and $20 per unit in variable costs?"
It seems simple. An honest question for AI, even a harmless one. But I immediately thought: this is the type of question I used to handle manually with a calculator or in Excel in under 60 seconds. The formula is quite easy:
250,000 / (50 - 20) = 8,333.3, so 8,334 units
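If you'd rather not reach for a calculator at all, the same arithmetic is a one-liner in code. A minimal sketch (the function name and signature are mine, not from any particular tool):

```python
import math

def break_even_units(fixed_costs: float, price: float, variable_cost: float) -> int:
    """Units needed to cover fixed costs, rounded up to a whole unit."""
    contribution_margin = price - variable_cost  # profit contributed per unit sold
    return math.ceil(fixed_costs / contribution_margin)

print(break_even_units(250_000, 50, 20))  # 8334
```

The only wrinkle worth noting is the round-up: 8,333 units leaves you $10 short, so you need the 8,334th sale to actually break even.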
But people aren't really thinking for themselves anymore. It reminds me of the ADHD epidemic that's currently ravaging our kids and even adults.
You see, the cause of ADHD is, in most cases, quite easily identified. When I'm coaching Little League or serving on Sunday mornings with the kids, you can always spot them. They're the kids bouncing off the walls, constantly interrupting you, and they almost always talk about the same things: YouTube, TikTok, Roblox, or another video game. It's no surprise these kids can't sit down for 5 minutes and listen. When you live in an environment where your attention is constantly shifting, you never learn how to stop and focus.
Who's to blame for the ADHD epidemic? The businesses doing exactly what businesses are built to do? The parents who never sit their children down and make them focus on anything that lasts more than 10 minutes? That's a debate I'm not going to get into right now. But we're starting to see a very similar problem with AI.
The average user is thinking less, and it's going to cause a K-shaped intelligence issue.
The K-shape is an idea borrowed from economics: things work out really well for some and really poorly for others. AI is going to do the same thing to people's thought processes.
Some AI users will use ChatGPT, Gemini, and Copilot to offload the menial work: formatting emails, boilerplate code, and so on. Others will let AI do their thinking for them.
But you don't have to take my word for it. Research published in the Harvard Gazette and on IE.EDU supports this too.
I don't think there's a solution to this problem. But imagine having 50% or more of the population not thinking for themselves, simply believing whatever their AI of choice tells them to think, or worse, tells them how to vote.