“The real opposition is the media,” declared Steve Bannon in 2018, after serving as White House chief strategist during Donald Trump’s first term. “And the way to deal with them is to flood the zone with shit.” Bannon showed the way, and now Trump is returning to the presidency riding that same wave, this time fueled by his new chief strategist, Elon Musk, owner of the social media platform X. “Now you are the press,” Musk told his loyal followers on the platform. Like Bannon, Musk holds the media in disdain and is content to muddy the waters: more than half of his posts during the campaign were deemed “misleading,” according to CBS. Both understand that in today’s world, the only thing that matters is the narrative, the culture war.
There is one key difference, however: the information chaos of that era caught the world by surprise. From the U.K. to the Philippines, the political use of tools like Facebook and WhatsApp led to an unexpected political disruption. But that disruption is no longer unexpected — it is now the norm. “I don’t think misinformation is going away,” says Sander van der Linden, a researcher at the University of Cambridge. “Unfortunately, it is the new normal.”
The flood of toxic misinformation that swept through Spain following the more than 200 deaths in the Valencia flood disaster is a stark confirmation of the world envisioned by Bannon and Musk. Disinformation spreads unchecked across cell phones, social networks have become toxic, and the media has lost its credibility. Citizens, polarized and disoriented, point the finger at the other side, accusing them of lying. The information ecosystem, more fragmented than ever, has left society without a shared reality to build consensus or meaningful discourse.
As Renée DiResta from Georgetown University explains, we now have a truth tailored to each person: “The collision between the propaganda machine and the rumor mill has created a ‘choose your own adventure’ epistemology: some media have already written the story you want to believe; some influencer is demonizing the group you hate.”
Disinformation experts agree that what happened in Spain after the deadly floods is no coincidence, but rather the inevitable outcome of the new information system that has been reshaped by the Bannon doctrine over the past two decades. After the floods, Spain saw as many lies in two weeks as it did over the two years of the Covid-19 pandemic.
“We’ve never seen something so explicit and coordinated, but we are going to see it more often,” warns Clara Jiménez, who has been combating misinformation for over a decade as the head of the Spanish fact-checking site Maldita. “The disinformation machinery now has more muscle, but it also has more followers — more people who are used to consuming this kind of content,” the journalist explains.
The overwhelming support Spanish TV host Iker Jiménez received after years of spreading conspiracy theories is a disheartening testament to this trend. “In the last decade, we’ve witnessed the normalization of disinformation in our society,” says Raúl Magallón from Carlos III University. “It began in politics, then with immigration, and later during the pandemic with anti-scientific discourses. Now, everything has converged with the DANA [an autumn weather event typical of eastern Spain that caused the flash flooding in Valencia], which has been a perfect storm. What’s more, disinformation narratives have even reached teenagers,” he adds. The relationship of young people with reality and information is increasingly shaped by this confusing, fractured landscape.
It’s impossible to pinpoint the exact moment when this new distorted universe began, but it started to take shape long before people naively spoke of “post-truth.” Bannon, the head of the ultra-right website Breitbart, recognized that there was an audience hungry for alternative realities. He drew from the lessons of 2014’s Gamergate, when a sexist mob harassed women in the video game world via social networks. The man who would later become chief executive of Trump’s campaign realized that political battles could be waged on the internet by activating tribal behavior with hatred and flooding networks with armies of trolls. Joan Donovan from Boston University explains: “Bannon discovered how to link the superficial with the profound in an unprecedented way, which gave him enormous influence in U.S. politics.” The media, however, didn’t know how to handle Trump or what his rise represented.
During these years, digital platforms — ranging from Google and YouTube to Facebook and Twitter — won the battle for attention against the media. They also claimed nearly all the advertising revenue. While the press was bleeding to death from closures and mass layoffs, the few surviving newspapers, desperate to stay afloat, surrendered to producing viral content for social networks. Meanwhile, those same tech companies unleashed the power of algorithms on users to fuel their exponential growth — without considering the consequences. And then, the incomprehensible began to happen.
One in four Americans believed the Sandy Hook massacre, in which 26 people were shot dead, was staged after agitator Alex Jones began promoting this conspiracy theory in 2014 to boost his income. In 2016, a man armed with a gun stormed a pizzeria in Washington D.C., convinced that a pedophilia ring run by Democratic politicians was hidden there — the infamous Pizzagate. The flags of QAnon followers, a conspiracy theory that spread and evolved into a sectarian cult through social networks, were waved triumphantly during the January 2021 assault on the U.S. Capitol. “QAnon would not have existed as it does today without Facebook’s inadvertent algorithmic recruitment […]. At its worst, Twitter made mobs — and Facebook grew cults,” writes Renée DiResta in her book Invisible Rulers.
The uproar caused by these scandals — including the attempted vote manipulation by Cambridge Analytica, where Bannon sat on the board — is now a thing of the past, as is Silicon Valley’s attempt to mend its ways. After the social media platforms’ terrible reputation crisis, their owners apologized and promised reforms to politicians around the world. But that era is over. “I am done apologizing,” Meta CEO Mark Zuckerberg declared in September. Meta, X, TikTok, and YouTube have rolled back the policies that banned Covid-19 disinformation, as well as the rules against extremist speech adopted after the Capitol assault. After the attack, “the main digital platforms and social networks did not implement relevant and specific actions to deal with the disinformation crisis,” warned Maldita in a report. Fake news and hate are once again spreading unchecked.
“The information diet of TikToks and headlines on social networks, with videos devoid of context, documentation, or authoritative voices, has no substance. This makes it harder to assess the veracity of information. Good journalism must be immunized so as not to be infected by bad practices and haste,” summarizes Loreto Corredoira, head of the Complutense Observatory of Disinformation.
A striking phenomenon during the aftermath of the floods in Spain was that among the thousands of testimonies gathered by television stations, victims sometimes repeated conspiracy theories that originated on platforms like Telegram. These videos were then circulated by disinformation channels as “proof” of the fabricated stories. “In a context of maximum uncertainty and fear, hoaxes, cultural battles, and alternative narratives emerge,” explains Magallón. “The DANA has activated these dysfunctions due to the lack of trust in the media and because social networks have become political actors with their own agenda.”
The latter point is crucial: Big Tech has been cozying up to Trump for months. Jeff Bezos, Amazon’s CEO and owner of The Washington Post, prevented his newspaper from endorsing Democratic candidate Kamala Harris. Musk, meanwhile, wanted to make a scandal out of the fact that Twitter’s previous management had worked with the U.S. government to curb misinformation during the pandemic. Now, he has handed his platform over to the Republican electoral machine without hesitation. “The richest man in the world, owner of his own communication network that reaches hundreds of millions instantly, is a threat that governments must keep an eye on,” says Joan Donovan, founder of the Critical Internet Studies Institute and author of Meme Wars.
“They tell you, ‘We have the truth and the media is lying to you.’ Musk says it openly, but so do many others. In Spain, political parties, celebrities, and influencers repeat this. Drop by drop, year after year, they hammer home this message,” says Clara Jiménez. She warns: “This story has become deeply ingrained in many people. It took hold a long time ago: we are much more affected than we think.”
The ground was fertile, but there are also culprits: individuals with vested interests who plant the seeds for the mass production of lies. Their goal? In many cases, money. As an editorial in Nature pointed out: “[The] advert-funded model of much of the web has boosted its production. For example, automated advertising exchanges auction off ad space to companies according to which sites — including misinformation sites — people are looking at, and the sites receive a cut if users look at and click on ads.”
There’s fierce competition among misinformation influencers — whether through streaming or social networks: the more outrageous the content, the more relevance it gains, the more visibility it receives, and the higher the income. In many cases, greed and political agendas align. “The current model of influencers and algorithms creates perverse incentives for the circulation of disinformation,” warns Van der Linden, author of The Psychology of Misinformation and Foolproof.
The deception factory operates non-stop, launching memes and lies to see which one sticks. This happened after the murder of several girls in Southport in the United Kingdom this summer. An account on X falsely claimed that the criminal was a Muslim refugee named Ali Al-Shakati. The well-oiled machinery of hate spread the lie quickly. The British Parliament has summoned Musk to testify about the dissemination of these falsehoods, which stirred up public sentiment and incited unrest before the truth came to light.
The disinformation industry is always eager to fill the gaps in information with its own narratives. Sometimes it doesn’t work: after the death of a child in Mocejón in the Spanish province of Toledo, far-right politician Alvise Pérez tried to spread a similar falsehood, but it didn’t gain traction. In the U.S. elections, millions of falsehoods were circulated, and one peculiar claim took hold: that Haitian immigrants were eating the dogs and cats of the residents of Springfield, Ohio.
“Disinformation can launch a thousand pieces of content without much effort and wait to see which one sticks, which narrative catches on in the discourse: it’s like sending a thousand soldiers into battle and seeing which ones reach the goal,” says Jiménez. As DiResta explains, it can take journalists days to investigate and publish a refutation of such lies: “In the age of social media, that’s an eternity. By the time their version of events is published, the invisible forces behind these lies have already moved on to something else.”
“In short, don’t let the crybabies in the media dissuade you, fellow patriots. Keep the cat memes flowing.”
— JD Vance (@JDVance), September 10, 2024
But for lies to reach their goal, there is one decisive factor: the elites. As Rasmus Kleis Nielsen, a specialist at the Oxford Reuters Institute, repeatedly says: “Disinformation often comes from the top.” The misinformation industry releases falsehoods non-stop, but an absurd claim like the one about the Ohio cats only truly gained traction when it was appropriated by the likes of Musk, Trump, and vice president-elect J.D. Vance. “Studies show that most disinformation comes from super-spreaders who, in the political sphere, tend to be party elites,” says Sander van der Linden.
In more analog times, the false allegation that the 2004 Madrid train bombings were carried out by the Basque terror group ETA took hold among the right-wing community because the management of the conservative newspaper El Mundo and the conservative Popular Party (PP) supported it. Trump, during his first campaign, appeared on Alex Jones’ show — the one promoting the Sandy Hook conspiracy — and praised Jones for his “amazing reputation.” Kamala Harris laughed at Trump’s claims about Haitian migrants during the presidential debate, though, at a certain level, it didn’t matter if it was a lie. Vance acknowledged it was probably false but stated that the important thing was spreading the (xenophobic) narrative. “Don’t let the crybabies in the media dissuade you,” the future vice president shared on X. “Keep the cat memes flowing.” This served as a metaphor to convey the underlying idea: immigrants are dangerous, and their customs disrupt the American way of life. Ultimately, such cases hijack the public debate.
Why do these narratives work, even when we know they are false? Because the platforms, whether intentionally or unintentionally, exploit human psychology to perfection. We’re still trying to fully understand the mechanisms, but recent studies show that, even without an algorithm, social networks are toxic — and that we even knowingly spread disinformation. The desire to belong to the group is more powerful: when we see a dubious claim that benefits our tribe, our social brain kicks in, not our rational one. We stop thinking about whether it’s true, and instead focus on how it will make “our people” feel. When we spread the fake photo of a Haitian with a pet, our side cheers, and the other side is outraged: it’s a win-win situation.
But the phenomenon of misinformation is still incredibly complex, and even experts cannot agree on how to define it. The Chicago Tribune published this headline during the pandemic: “A ‘healthy’ doctor died two weeks after getting a Covid-19 vaccine; CDC is investigating why.” The statement was factually correct and appeared in a reputable newspaper. But the misinformation industry seized on it, taking it out of context on Facebook and using it to fuel anti-vaccine discourse during a time of great uncertainty. That headline was viewed more than 50 million times in the U.S., and similar posts led to three million Americans opting out of getting vaccinated, according to a study published in Science. The key insight came from researcher Kate Starbird of the University of Washington, who explained: “Misinformation is not a piece of content. It is a strategy.”
There is still much to learn. While countless studies on the phenomenon have been published in recent years, only 1% have been conducted in real-world settings and analyzed the tangible behavior of individuals. It doesn’t help that politicians continue to polarize the issue and criticize experts, often hijacking the disinformation agenda for their own purposes. This happened years ago with the term “fake news,” which Trump weaponized against journalists. Spanish Prime Minister Pedro Sánchez spoke of the “sludge machine,” referring solely to right-wing media and only when it personally affected him — such as when preliminary proceedings were opened against his wife for influence peddling. A staggering 80% of Spaniards view disinformation as a problem, and Spain’s National Security Council lists it among the main threats.
On the very same day that the floods hit Valencia, October 29, Steve Bannon was released from prison after four months behind bars for contempt of Congress. He said he felt more “empowered” than ever, and was focused “on [Trump’s] victory” in the November 5 election. He spread more lies on his podcast, and a week later, Trump won again. But this time, it wasn’t a surprise — he won without a hitch. Just another symptom of the new normal.