You do not need to know how it works. That is exactly the problem.
What the man at the protest did not know. And what we are building into the people who come after us.
This is not a post about AI. It is a post about what happens to people who never ask why.
I was at a protest a few years ago when someone told me the internet should be a human right.
He was holding a four-thousand-pound laptop. A coffee in his other hand. Earbuds in. Passionate, articulate, certain.
I asked him why.
He said because the internet means freedom of speech. Because it gives us free education. Because without it life would be very boring.
I asked him again. Not what the internet offers. What the internet actually is.
The internet was never meant to be a human right. It is not in the original human rights documents. But something shifted.
Now the UN and others are noticing something we already know. You cannot exercise freedom of expression without access. You cannot learn. You cannot build anything. Not anymore. The internet became the medium through which rights are exercised. And then someone cut the wire.
When a government disrupts internet access, they are not just turning off a service. They are silencing people. They are closing schools. They are shutting down the only marketplace where people without privilege can compete. The UN saw this. They named it a violation.
Here is what troubles me. We framed the fight as "is internet access a right?" But the real question is different. We already know what the internet means. We already know what happens when it disappears. So why did it take us this long to say out loud that cutting people off from it is a choice? And every choice has a side.
He described Facebook. YouTube. The ability to speak to friends.
I said: that is the world wide web. What is the internet?
He looked at me. Then he said: what is the world wide web?
I got scared.
Not because he did not know. Because he was screaming about a thing he had never once tried to understand. And he was not alone. He was surrounded by people doing exactly the same thing.
I walked home that evening past a row of bins waiting for collection. The smell of rain on concrete. I kept thinking about that man. About the specific shape of his certainty. How complete it was. How untouched.
What the internet actually is
Here is what the internet is. Simply put, infrastructure. Physical cables, servers, routing systems that cost billions to build and billions to maintain. In most countries it is partly subsidised by taxes. It is not free. It has never been free. The world wide web sits on top of it. That is what you see when you open a browser. The web is the passenger. The internet is the road.
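The layering is concrete enough to sketch in a few lines of code. This is an illustration, not a networking tutorial: the function names are my own, and example.com is a reserved test domain. The point is that the web, HTTP, is just human-readable text, and the internet is the machinery underneath that carries it.

```python
# The "internet" moves raw bytes between machines (DNS, IP, TCP).
# The "web" is one protocol, HTTP, riding on top of it.
import socket


def build_http_request(host: str, path: str = "/") -> bytes:
    # The web layer: an HTTP/1.1 request is plain, readable text.
    return (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        "Connection: close\r\n"
        "\r\n"
    ).encode("ascii")


def fetch(host: str) -> bytes:
    # The internet layer: resolve the name, open a TCP connection,
    # push the web-layer bytes down the wire, read what comes back.
    with socket.create_connection((host, 80), timeout=10) as sock:
        sock.sendall(build_http_request(host))
        chunks = []
        while chunk := sock.recv(4096):
            chunks.append(chunk)
    return b"".join(chunks)


# Print the request itself: the web is the passenger, the socket the road.
print(build_http_request("example.com").decode("ascii"))
```

Everything a browser does starts like this. The part you see is the text at the top; the part you pay for, in taxes and cables and server farms, is everything underneath.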
None of this is complicated. It takes about ten minutes to understand. But most people who depend on it daily, who would describe losing it as losing a human right, have never spent those ten minutes.
I am not saying this to be superior. I am saying it because the gap between using something and understanding it is becoming one of the most dangerous gaps in the world. And we are widening it deliberately.
Not through malice. Through design.
The tyre in the boot
Think about the person who has never changed a tyre.
You have the car. You have the tools in the boot. They came with the vehicle. You have two working hands. But you have never once tried, never once been curious, never once spent the twenty minutes it would take to know. So when the tyre goes on the motorway you wait two hours for someone else to do it.
That is not a technology problem. That is a dependency problem.
And we are building an entire civilisation of people who have never changed the tyre. Who do not even know the tools are in the boot.
The phone you are holding is not a phone. It is an attention system. Engineered to be as frictionless as possible. Not for your benefit. Because friction creates the conditions for thought, and thought creates the conditions for demand, and demand is expensive. The easier the tool is to use, the less you need to understand it. The less you understand it, the more dependent you become. The more dependent you become, the more useful you are to the people who built it.
That is not a coincidence. That is a business model.
The same model runs through most of what we call the digital economy. Make it simple enough that questions never form. Make it rewarding enough that the absence of questions feels like comfort rather than loss. Make the dependency so total that the idea of stepping back feels like deprivation.
And then, when someone finally does step back and ask how it works, make them feel stupid for not already knowing.
The teenagers who laughed
I sometimes tell teenagers I do not know how to use Instagram. How to navigate TikTok.
They laugh. They say, it is so simple, how can you not know, are you that old.
I say, you are right. It is idiot-proof. Give me five minutes and I will be at your level. But let me ask you something first. Do you know how I could access your email through your browser session? Do you know what happens when an app on your phone requests location permission and you press allow? Do you know that the fitness tracker on your wrist is broadcasting data about your heart rate to servers you have never heard of, governed by terms and conditions you clicked through in thirty seconds at 11pm?
They go quiet.
I am not trying to frighten anyone. I am trying to say, the reason these tools feel simple is because someone worked very hard to make sure you never had to think about what is underneath them. That work was not done for your benefit.
It was done so you would never ask.
And if you never ask, you never push back. And if you never push back, the people who built the system can keep building it however they choose. Which brings me to the part that actually worries me.
The people telling you not to think
I have heard tech leaders, people with platforms and influence, say that you no longer need to think for yourself. That AI will do it. That coding will be irrelevant by the end of the year. That some large language model will make human intelligence optional.
I want to be precise about what I mean when I say AI here. Not intelligence. A large language model that predicts the next word based on patterns in human text. Extraordinarily capable. Not thinking. The distinction matters, because the people telling you AI will replace your judgment are often the same people who benefit most from you believing it.
I feel something close to disgust when I hear this.
Not because AI is not powerful. It is. Not because the tools are not useful. They are. But because the people saying these things are not describing a democratic future. They are describing a world where the ability to think critically, to understand what you are holding, to ask why it works and who benefits, becomes the exclusive property of a small group who kept that knowledge for themselves while telling everyone else it was unnecessary.
Intelligence as a sorting mechanism. Understanding as a privilege. Dependency as the default condition of most human beings while a smaller group retains the ability to question, to build, to decide.
That is not progress. That is the oldest power structure in human history with better marketing.
And I have watched enough cycles to recognise the pattern. The more a technology is positioned as inevitable, the more urgently you should ask who benefits from that inevitability feeling settled. The more you are told you do not need to understand something, the more important it becomes to understand it.
This is not paranoia. This is the minimum condition for being a free person in a technical world.
What this has to do with leadership
Here is where it becomes personal for me. And for anyone reading this who is responsible for building teams, building products, building systems that other people will live inside.
The same dependency trap that catches the man at the protest, the teenagers with their phones, the workers who have never once asked how the algorithm that governs their feed actually works, that trap is being built by tech leaders. By us, or at least a large majority of us.
Every time we optimise for engagement over understanding. Every time we make a product simpler in a way that removes agency rather than friction. Every time we design a system that works so smoothly that nobody ever has to think about what is underneath it. We are widening the gap.
I have sat in enough product reviews to know that this rarely feels like a moral decision in the moment. It feels like good UX. It feels like serving the user. It feels like getting out of the way.
But there is a difference between removing unnecessary complexity and removing the conditions for thought. And in my experience, most tech leaders have stopped asking which one they are doing.
This is the Stewardship question. Not, does this product work? But what kind of relationship does this product create between the person using it and their own capacity to understand and question the world?
Most of the products being built right now are answering that question in the same direction. Toward dependency. Toward seamlessness. Toward a world where the tools think for you because you have long since stopped expecting to think for yourself.
And the people building those products are, many of them, people who know exactly how they work. Who can change the tyre. Who understand the difference between the internet and the web. Who are making a choice, consciously or not, to keep that understanding for themselves.
What shared leadership changes
I want to be careful here not to make this abstract. Because there is a practical version of this argument that matters to the people I work with.
Shared leadership, the thing this publication is built around, is partly about distributing authority. About putting the map on the table and letting the room navigate together. But it is also about something more fundamental: it is about building teams where people understand what they are building.
Where the engineer knows not just how to write the code but what the code is doing to the person on the other end. Where the designer knows not just what converts but what the conversion is costing. Where the leader holds the whole picture, including the parts that are uncomfortable, and brings that picture into the room rather than managing it from above.
That kind of team does not optimise blindly for engagement. It asks why. It slows down before it scales. It treats the person using the product as someone who deserves to understand what they are holding, not just someone whose behaviour needs to be shaped.
This is not idealism. This is what responsible technical leadership actually looks like in practice. And it requires, as a foundation, that the people doing the leading are willing to be curious about what they do not yet know. That they model the question-asking they want their teams to practice. That they resist the comfort of the idiot-proof as a professional standard.
Because if the leaders stop asking, the teams stop asking. And if the teams stop asking, the products stop being honest about what they are.
What Gen Z can do today
I want to end with something direct for the younger people in the room. Because this is intergenerational in both directions. We built this world. But you are going to have to live in it longest.
What you can do is not complicated. It does not require becoming an engineer or a security researcher or a policy expert. It requires only this: curiosity about the tools you already hold.
Put the phone down occasionally and let your brain do the work. Be curious about the things you depend on. Ask what the app is actually doing when you give it permission.
The one thing I want you to take from this is not a list of things to learn. It is a single question to carry.
When someone tells you that you do not need to understand something, that the AI will handle it, that thinking for yourself is an inefficiency to be optimised away, ask who benefits from you believing that.
The answer will tell you everything.
Knowledge is not power in the old sense. Not as a weapon or a competitive advantage. But understanding what you are holding is the minimum condition for being a free person in the world we are building together.
And we are building it for you. The least we can do is tell you the truth about what it is.
If this landed, the Tuesday posts come by email. Free. Comment and subscribe below and the next one arrives in your inbox.
About the Author
Tino Almeida is a tech leader, coach, and writer reshaping how we think about leadership in a burnout-driven world. With over 20 years at the intersection of engineering, DevOps, and team culture, he helps humans lead consciously from the inside out. When he’s not challenging outdated norms, he’s plotting how to make work more human, one verb at a time.



During my research for this piece, I found this news from last year:
"According to recent research, Generation Z firmly believes that internet access should be recognised as a 'human right'."
https://www.cambridge-news.co.uk/news/uk-world-news/internet-access-human-right-tendendo-31052406