

China is joining in with AI
Last month, the New York Times reported on a new disinformation campaign: “Spamouflage,” a Chinese effort that combines AI-generated content with real images of the United States to exacerbate American political and social tensions. The goal appears to be to make Americans lose hope, by promoting exaggerated stories, with fabricated photos, about homelessness, violence, and the risk of civil war.
As Ladislav Bittman, a former Czechoslovakian secret police operative, explained about Soviet disinformation, the strategy is not to invent something totally fake. Rather, it is to act like an evil doctor who expertly diagnoses the patient’s vulnerabilities and exploits them, “prolongs his illness and speeds him to an early grave instead of curing him.”
The influence networks are vastly more effective than platforms admit
Russia now runs its most sophisticated online influence efforts through a network called Fabrika. Fabrika’s operators have bragged that social media platforms catch only 1% of their fake accounts across YouTube, Twitter, TikTok, Telegram, and other platforms.
But how effective are these efforts? By 2020, Facebook’s most popular pages for Christian and Black American content were run by Eastern European troll farms tied to the Kremlin. And Russia doesn’t just target angry Boomers on Facebook. Russian trolls are enormously active on Twitter. And, even, on Reddit.
It’s not just false facts
The term “disinformation” undersells the problem, because much of Russia’s social media activity is not trying to spread fake news. Instead, the goal is to divide and conquer by making Western audiences depressed and extreme.
Sometimes, through brigading and trolling. Other times, by posting hyper-negative or extremist posts or opinions about the U.S. and the West over and over, until readers assume that’s how most people feel. And sometimes, by using trolls to disrupt threads that advance Western unity.
As the RAND think tank explained, the Russian strategy is volume and repetition, from numerous accounts, to overwhelm real social media users and create the appearance that everyone disagrees with, or even hates, them. And it’s not just low-quality bots. Per RAND,
Russian propaganda is produced in incredibly large volumes and is broadcast or otherwise distributed via a large number of channels. … According to a former paid Russian Internet troll, the trolls are on duty 24 hours a day, in 12-hour shifts, and each has a daily quota of 135 posted comments of at least 200 characters.
What this means for you
You are being targeted by a sophisticated PR campaign meant to make you more resentful, bitter, and depressed. It’s not just disinformation; it’s also real-life human writers and advanced bot networks working hard to shift the conversation to the most negative and divisive topics and opinions.
It’s why some topics seem to go from non-issues to constant controversy and discussion, with no clear reason, across social media platforms. And a lot of those trolls are actual, “professional” writers whose job is to sound real.
So what can you do? To quote WarGames: The only winning move is not to play. The reality is that you cannot distinguish disinformation accounts from real social media users. Unless you know whom you’re talking to, there is a genuine chance that the post, tweet, or comment you are reading is an attempt to manipulate you – politically or emotionally.
Here are some thoughts:
- Don’t accept facts from social media accounts you don’t know. Russian, Chinese, and other manipulation efforts are not uniform. Some will make deranged claims, but others will tell half-truths. Or they’ll spin facts about a complicated subject, be it the war in Ukraine or loneliness in young men, to give you a warped view of reality and spread division in the West.
- Resist groupthink. A key element of manipulation networks is volume. People are naturally inclined to believe statements that have broad support. When a post gets 5,000 upvotes, it’s easy to think the crowd is right. But “the crowd” could be fake accounts, and even if they’re not, the brilliance of government manipulation campaigns is that they say things people are already predisposed to think. They’ll tell conservative audiences something misleading about a Democrat, or make up a lie about Republicans that catches fire on a liberal server or subreddit.
- Don’t let social media warp your view of society. This is harder than it seems, but you need to accept that the facts – and the opinions – you see across social media are not reliable. If you want the news, do what everyone online says not to: look at serious, mainstream media. It is not always right. Sometimes, it screws up. But social media narratives are heavily manipulated by networks whose job is to ensure you are deceived, angry, and divided.