A reflection on AI, power, and the path of least resistance
There’s a quiet unease that settles in after spending too much time with AI. Not because the technology itself is inherently sinister, but because of what it reveals about us. About me.
About the way we’re being shaped, not by the tools we use, but by the hands that control them.
I’ve spent the last year immersed in AI, both as a user and as someone trying to understand its broader implications. What strikes me most isn’t the brilliance of the algorithms or the speed at which they can generate text, images, or insights. It’s the way they feel: how they slip into my workflow, my thoughts, even my decisions, not as a neutral assistant, but as something subtly directive.
The suggestions that autofill my sentences, the recommendations that populate my feeds, the nudges that guide my choices: they don’t just help me. They steer me. And the more I rely on them, the more I wonder.
Who decided this was the direction I should go?
The narrative we’re sold is that AI is a democratizing force, a way to level the playing field by making expertise, creativity, and efficiency accessible to everyone.
And in some ways, it is. I’ve watched friends with no coding experience build apps with AI assistants. I’ve seen small businesses use generative tools to compete with corporate marketing teams.
But democratization, as it turns out, is a messy, uneven process. When a handful of companies control the infrastructure of these tools, “democratization” starts to look less like empowerment and more like a mass experiment in cognitive manipulation.
I believe that, in some way, we are wired to take the easy path. It’s not laziness; it’s survival.
And certain companies are exploiting us.
Our brains conserve energy by defaulting to habits, to patterns, to the familiar. AI, as it’s currently deployed, doesn’t just exploit this tendency; it amplifies it. Every time an algorithm suggests what I should read next, what I should buy, or even how I should phrase an email, it’s not just saving me time. It’s training me to expect convenience, to prefer the frictionless, to outsource my own judgment.
And the more I accept these suggestions, the less I question them.
The real problem isn’t the technology itself. It’s the business models that have grown up around it. AI isn’t just a tool; it’s a product, and like all products, it’s designed to maximize engagement, dependency, and profit.
The companies that build these systems don’t make money by making us smarter or more autonomous. They make money by keeping us clicking, scrolling, consuming.
The easier the path, the more data they collect.
The more data they collect, the better they can predict and influence our behavior.
I’ve noticed this in my own work. When I use AI to draft emails or generate ideas, I’m often struck by how generic the outputs are. Not because the technology lacks sophistication, but because it’s optimized for broad appeal, for the lowest common denominator.
It’s not trying to help me think differently; it’s trying to help me think predictably. And the more I rely on it, the more my own voice starts to sound like everyone else’s.
This isn’t just about creativity. It’s about agency.
When the default settings, the autofill suggestions, and the “recommended for you” options are all designed to keep me within a narrow band of behavior, I start to lose the ability to stray from the path.
I stop asking questions. I stop pushing back. I stop thinking.
The conversation around AI ethics tends to focus on responsibility: how to use these tools in ways that are fair, transparent, and accountable.
But responsibility implies choice, and choice implies control.
The question we’re not asking enough is this.
Who actually gets to decide how AI integrates into our lives?
Right now, that power lies with a small group of tech giants, advertisers, and policymakers who are often more concerned with market dominance than with the well-being of the people using their products.
The result is a world where the easiest path is also the most exploited one. Where convenience isn’t a luxury; it’s a cage.
I don’t think the answer is to reject AI outright. That ship has sailed, and besides, the technology itself has too much potential for good. But I do think we need to be honest about what we’re trading for that convenience.
We need to ask harder questions.
Who benefits when we outsource our thinking?
What do we lose when we let algorithms decide what’s relevant, what’s true, what’s worth our attention?
And perhaps most importantly, we need to reclaim some friction in our lives. Not all convenience is progress. Sometimes, the easy path is just the one someone else paved for us, and it doesn’t always lead where we’d choose to go.
I’ve started to notice the moments when I pause before accepting an AI’s suggestion, when I deliberately choose the harder path just to remind myself that I can. It’s a small act of resistance, but it feels necessary.
Because if we don’t occasionally step off the well-worn trail, we might forget that there’s a world beyond it.
And that, more than any algorithm, is what scares me the most.
About the Author
Tino Almeida is a tech leader, coach, and writer reshaping how we think about leadership in a burnout-driven world. With over 20 years at the intersection of engineering, DevOps, and team culture, he helps humans lead consciously from the inside out. When he’s not challenging outdated norms, he’s plotting how to make work more human, one verb at a time.



Another thoughtful article. Interesting how AI likes to embed itself into one's work.
This is important for society to think deeply about and discuss together. I do feel we're being guided down a path that may not be in our collective best interest. Curiosity, however, is a strong pull. Societal expectations push hard.
I've engaged with AI for personal use more than professional, and while it is highly interesting and occasionally beneficial, I find too often that it is narrower in understanding and responses than I imagined, and as much "miss" as "hit" in helpfulness. I often depart feeling dissatisfied with the experience.
Plus, with any great advancement comes an intense increase in abuse by nefarious people.