Lately, I’ve caught myself just sitting and watching my feed melt down in real time, thread after thread, comment after comment, people piling on with insults, outrage, and the occasional all caps conspiracy theory.
And I keep wondering: are these just random outbursts from angry people with smartphones? Or are we looking at something much bigger, more intentional, and frankly, more dangerous?
Turns out, it’s most probably the second one.
According to real research, not just vibes, the hate we see online isn’t just emotional noise, it’s a system. A pretty well-oiled one (Frontiers in Psychology, 2021).
This kind of content is choreographed, boosted by algorithms, and deployed with precision.
Especially around elections. Because nothing supercharges online conflict like a countdown to power.
Researchers have actually tracked this (npj Complexity, Nature Portfolio, 2024).
In one study, connections between hate groups online jumped over 40 percent on the day of the 2020 U.S. presidential election (npj Complexity, Nature Portfolio, 2024).
When the results came in, that number climbed to nearly 70 percent. That’s not just people venting. That’s organization in motion.
Telegram, one of the least moderated platforms, more or less turned into a digital war room.
And it gets worse.
See, the outrage? It doesn’t just explode. It performs. Makes money.
Every heated reply, every viral insult, every meme with that “just asking questions” energy, it all feeds the same machine. More engagement equals more ad revenue.
What we think of as social media drama is, functionally, a business model.
These platforms aren’t broken, they’re working exactly as designed.
Outrage sells. And when it’s extreme, it gets pushed harder.
That’s not speculation, psychologists have mapped it, social scientists have measured it (Frontiers in Psychology, 2021; PNAS Nexus, 2023).
And here’s the part that’s stuck in my head: it doesn’t even matter if the hate is real.
Sincerity isn’t part of the equation.
The algorithm isn’t checking vibes, it’s counting clicks. So whether someone’s being serious or trolling or throwing a digital tantrum, it’s all the same.
Behind every nasty comment, every dog whistle video, every trending smear, there’s often more at work than some bored dude with a burner account.
There are coordinated teams. PR playbooks. Even entire campaigns designed to confuse, manipulate, and turn people against each other (UNESCO Report, 2022; OHCHR, 2021).
Sound extreme? Pretty much every major agency now confirms it.
UNESCO, the UN, the OHCHR, they’ve all warned that online hate doesn’t just sway opinions.
It suppresses votes. Especially from marginalized and minority communities (UNESCO Report, 2022; OHCHR, 2021).
We’re not talking about a few trolls. We’re talking about voter suppression in meme form.
What used to be political discourse has turned into identity warfare.
And debate?
That’s gone. It’s not what you say, it’s who you are.
You’re either “one of us” or “the enemy”. No room in between.
Studies even map the timing.
One from PNAS found spikes in online aggression directly tied to real world political events, especially elections (PNAS Nexus, 2023).
It’s not random. It’s engineered like a campaign launch.
Of course, we all tell ourselves it’s just people expressing opinions online.
The digital barroom debate.
But take a breath. Zoom out a little.
The patterns are too neat, too consistent, way too effective.
I remember when we thought the internet would bring everyone closer, give a voice to the voiceless, democratize everything.
And it did, in a way.
For about five minutes.
Then the same tools that let people speak also let the worst voices be heard louder, faster, and endlessly repeated at scale.
Right now, we have more than enough evidence (peer-reviewed studies, election reports, data going back a decade) showing that if you want to mess with a democracy, you don't need to win hearts and minds.
You just need to drown out the signal (European Commission, 2023; UNESCO Report, 2022).
Flood the zone. Confuse, divide, enrage.
Then log off and let the algorithm finish the job.
And honestly, we’ve seen this movie before.
Weren’t the Romans the ones who figured it out first?
Divide et impera (divide and rule).
Solid strategy, right?
Keep the people arguing and distracted, and you can govern from behind the curtain without too much trouble.
But… history didn’t exactly prove that tactic to be a success.
Sure, they ruled. For a while.
Then came the decay. The fractures. The barbarians at the gate.
And eventually, the collapse of the empire that thought it had control over the chaos it created.
References & Further Reading
- How U.S. Presidential Elections Strengthen Global Hate Networks – npj Complexity, Nature Portfolio, 2024
- The Event-Driven Nature of Online Political Hostility – PNAS Nexus, 2023
- Defining Online Hating: A Systematic Review – Frontiers in Psychology, 2021
- Disinformation and Hate Speech in Global Elections – UNESCO Report, 2022
- Hate Speech and Elections – OHCHR, 2021
- New Technologies and Electoral Integrity in Europe – European Commission, 2023