11 Comments
Diamantino Almeida:

Large Language Models (LLMs) do not hallucinate in the human sense because they lack sentience and conscious thought. The term "hallucination" in AI refers metaphorically to instances where an LLM generates text that is plausible but factually incorrect or nonsensical.

This happens because LLMs operate as advanced statistical predictors, generating the next token based on patterns in vast training data.

These outputs are not errors born of faulty reasoning or intention; instead, they reflect the probabilistic nature of language modeling and limitations in the data or training process.

Hallucinations arise from the model's inability to distinguish truth from fiction because it does not possess understanding or awareness. Thus, what is called hallucination is a byproduct of statistical prediction rather than conscious error or imagination.
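The point about statistical prediction can be sketched with a toy model. The probability table, token names, and numbers below are invented for illustration; a real LLM learns such distributions from data, but the sampling mechanics are the same: it picks likely continuations, with no notion of truth involved.

```python
import random

# Hypothetical hand-made probability table standing in for an LLM's learned
# next-token distribution. The model only knows which continuations are
# *likely* in its training data, not which are *true*.
NEXT_TOKEN_PROBS = {
    ("The", "capital", "of", "Australia", "is"): {
        "Canberra": 0.55,    # correct
        "Sydney": 0.40,      # plausible but wrong: a "hallucination" when sampled
        "Melbourne": 0.05,
    },
}

def sample_next_token(context, rng):
    """Pick the next token by probability alone; truth plays no role."""
    dist = NEXT_TOKEN_PROBS[context]
    tokens = list(dist)
    weights = [dist[t] for t in tokens]
    return rng.choices(tokens, weights=weights, k=1)[0]

rng = random.Random(0)
samples = [sample_next_token(("The", "capital", "of", "Australia", "is"), rng)
           for _ in range(1000)]
# Roughly 40% of completions confidently assert "Sydney": fluent, plausible,
# and factually wrong, without any faulty "reasoning" taking place.
print(samples.count("Sydney"))
```

Under these made-up numbers, the wrong answer appears in a large fraction of outputs purely as a consequence of sampling, which is the sense in which "hallucination" is a byproduct of prediction rather than an error of thought.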

Diamantino Almeida:

Large Language Models don’t “hallucinate” like humans; they simply predict words statistically, so factually wrong outputs are a byproduct of pattern-matching, not conscious thought or intent.

Diamantino Almeida:

Some great articles for understanding what is going on. Several people and institutions are working to protect us from the misuse of AI.

https://www.theverge.com/ai-artificial-intelligence/782752/ai-global-red-lines-extreme-risk-united-nations

Diamantino Almeida:

Great study and a good reminder of what can happen if we delegate too much.

https://arxiv.org/abs/2506.08872

Kathleen:

Another excellent article!

Many worry about the lack of governance concerning AI (LLM) use and production. And, as you pointed out, the motivation is monetary gain and monopolization, which then increases the monetary return. Data privacy is paramount, as data is power. The pace of change driving AI acceleration is bypassing concerns for standards and legal obligations. It will be ugly until governments erect guardrail policies. The obscene consumption of electricity and water is accelerating pushback from entire neighbourhoods. Virginia currently has over 340 data centres. That's significant and drawing considerable attention.

Diamantino Almeida:

Wow, 340 data centres, that's a lot. But true, my concern is the narrative from big tech and a few AI companies that this cannot be stopped and will go forward no matter what. I believe this is no ordinary technology, and society must be informed and educated about it. Leaving it in the hands of tech companies is, in my view, a big mistake. And I work in technology.

The quantity of energy, water, and land required to run these machines is obscene; scale is not the solution. In tech we tend to minimise and make things as efficient as possible.

The Cosmic Onion:

Falling asleep can kill. 🐺

That’s the real trap of the candy house—polite sentences and glowing screens rocking people into a trance while the business model fattens them up. An LLM is just a dice-roller, but the men behind the curtain aren’t rolling for fun. They roll for power, profit, and quiet control.

The danger isn’t that machines will wake—it’s that humans won’t. Stay awake. Question every output. Keep your claws sharp.

—RIB

Diamantino Almeida:

We need to stay awake, educate ourselves, experiment and keep asking why.

Thank you for your comment.

It's not the technology that scares me, it's the people behind it, or at least a few...

Colette Molteni:

The "god-complex" part had me reflecting. I, too, have observed this, unfortunately. And yes, the most successful leaders I have witnessed are not just tech first, but human first as well in their approach.

Diamantino Almeida:

Indeed Colette, there are many good leaders out there; unfortunately, our media keep focusing on the questionable ones. I keep wondering: when you get so much money that fast, all from your "technical skills" and other less diplomatic means, it's not difficult to imagine that becoming their moral compass and true north. Especially in tech, where most of the time we are pushed to deliver at all costs, and the people side gets forgotten.