Fake news and artificial intelligence, the new weapon of mass manipulation

There is an increasingly widespread, silent feeling that is difficult to pin down. We open our phones, scroll through images, read sentences, watch videos that seem real, credible, almost familiar. Everything appears real. Everything seems possible. Yet something is out of place, like an off-key note in a song we know by heart.

Fake news and artificial intelligence are changing the way we perceive the world, without fanfare or warning. They do it by entering everyday life: social media, the news we read while waiting for coffee, the videos we watch in the evening on the sofa. The point is no longer the individual hoax. The point is the mental climate in which we live.

Fake news and artificial intelligence in everyday life

Falsehood today no longer has the crude face of a lie. It has the reassuring quality of normality. A quote attributed to a famous person. A video of wild animals that looks like a documentary. A photo that tells an emotional, immediate story. It all works because it speaks the language of emotions, not the language of verification.

For years, scholars and psychologists have explained that people tend to believe what “rings true”, what aligns with what they already think or feel. This mechanism, called truthiness, existed long before social media. Technology has made it faster, more visual, more persistent.

A recent study published by Oxford University Press, False: Why an Untruth Is More Influential Than the Truth, makes a simple and disturbing point. A well-constructed falsehood sticks in the mind more than an accurate truth. The human mind better remembers what excites, what is repeated, what appears familiar. Artificial intelligence works exactly on this level. It produces fluid, coherent, convincing content. It multiplies it. It adapts it. It makes it difficult to distinguish from everything else. So trust wears out little by little: it frays like a thread pulled too many times.

Artificial intelligence as a factory of plausible realities

Deepfakes represent the most visible step in this process. Videos in which real people appear to say or do things they have never said or done. Images edited with such precision that they fool a distracted glance. Texts written by automated systems that perfectly imitate human language.

The problem does not only concern politics or major international events. It concerns normality. The habit of trusting images. The fatigue of checking. The desire to believe something that makes immediate sense of what we see.

Even the academic world has begun to deal with this scenario. Scientific articles with invented citations. Apparently solid texts produced by artificial intelligence systems that “hallucinate” data and sources. Everything appears orderly, logical, credible. The form holds up. The substance slips away.

In this space, what many scholars call a true “parade of the unreal” takes shape. Content that attracts attention, generates reactions, steers emotions. Truth ceases to be central. What counts is what works.

When doubt becomes mental tiredness

Many years ago Hannah Arendt described a profound risk. A society exposed to continuous distortion of reality loses the ability to distinguish, judge and choose. Not because it believes everything, but because it stops believing in anything. It is a form of cognitive fatigue. A silent renunciation. When everything can be false, everything weighs less. Even what really matters.

AI accelerates this process because it works at scale. It produces in quantity. It produces quickly. It produces in a personalized way. Each information flow becomes a tailor-made environment that confirms emotions, strengthens beliefs and avoids friction. Disinformation does not shout: it accompanies, it camouflages itself, it becomes the background.

Living immersed in this scenario requires a new form of attention. Not the obsessive attention of constant checking, but the subtler attention of noticing how we feel when faced with content: drawn in, angered, comfortably reassured. Fake news and artificial intelligence tell us a lot about ourselves. They speak of our need for simple stories, of the effort of staying in doubt, of the desire to trust.

Keeping technology, emotions and critical sense together becomes a daily practice, like learning to recognize a familiar smell in the air. No magic formula. Just a slightly more attentive gaze.
