Disinformation for Profit: Scammers Profit from Conspiracy Theories
When Facebook removed dozens of groups dedicated to Canada’s ‘Freedom Convoy’ anti-government protests earlier this month, it didn’t do so because of the extremism or conspiracy theories rampant within the demonstrations. It was because the groups were run by scammers.
Networks of spammers and profiteers, some based as far away as Vietnam or Romania, had created groups using fake or hacked Facebook accounts in a bid to make money from political unrest.
That foreign networks of social media scammers have taken over a controversial political issue can feel like a throwback. Before investigations into Russian troll factories during the US presidential election and culture war disputes over content moderation, one of the biggest challenges facing social media platforms was profiteers spreading fake and spam articles for easy money. Hundreds of websites impersonating US news outlets pushed their content on social media, reaping advertising revenue from the traffic they generated.
Platforms like Facebook have cracked down on such “inauthentic activity” since 2016, but the global misinformation industry remains. In recent years, these for-profit disinformation networks have taken advantage of the popularity of conspiracy movements and far-right groups online, creating content aimed at anti-vaccine protesters and QAnon followers.
“It can be a hugely lucrative industry for people in other parts of the world to watch the political climates in the United States and Canada very closely and then capitalize on current trends,” Emerson Brooking, senior research fellow at the Digital Forensic Research Lab at the Atlantic Council, told the Guardian. “If you’re looking for money and you measure success not by sowing discord in a country but by maximizing ad revenue, there’s still a lot of upside to these operations.”
Disinformation for profit
According to researchers, it is difficult to know the exact scale of the for-profit disinformation industry, as it operates as part of an underground economy and comes in various forms. In addition to content mills and ad revenue schemes, there are private companies around the world that are hired to create fake engagement or push political propaganda. In 2021 alone, Facebook said it removed 52 coordinated influence networks in 32 countries that attempted to direct or corrupt public discourse for strategic purposes, according to a company report on inauthentic behavior.
Additionally, small networks can have an outsized impact if they effectively use online groups to organize and fundraise en masse. In the case of the Freedom Convoy accounts, many of the larger Facebook groups involved appeared to be run by fake accounts or content factories originating from numerous countries. Facebook removed the groups this month, but not before Convoy supporters raised more than $7 million in crowdfunding and garnered mainstream attention. (GoFundMe later deactivated the campaign.)
A Bangladeshi digital marketing firm ran two of Facebook’s largest anti-vaccine trucker groups, according to Grid News, which had more than 170,000 combined members before the platform removed them. A Missouri woman’s hacked Facebook account created a network of several other pro-protest groups, collectively gaining more than 340,000 members within weeks. Other groups promoting American spin-offs of the Canadian protests came from Facebook accounts and networks based in Vietnam, Romania and other countries, Facebook officials told NBC News.
But recent research has shed light on how some of these for-profit disinformation operations work. A series of case studies from the Institute for Strategic Dialogue (ISD), a London-based think tank, detailed what it takes to run a lucrative online scam. One example was a DIY website called The US Military News.
The headlines on The US Military News read much like those on any number of far-right outlets, such as “Trump Wrecks Pence In Awesome Statement”, alongside articles praising the Canadian trucker protests. A shop on the site sells Trump-related merchandise, including free US flags and Trump 2024 “Revenge Tour” commemorative coins. Appeals for donations appear repeatedly across the front page and attached to every article.
But despite the wall-to-wall American name and branding, the site has no connection to the US military, or to the United States for that matter. Its domain is registered in Vietnam, and it’s unclear whether it employs any writers or whether the products it advertises even exist. The articles themselves consist only of stock footage, with an automated voice reading plagiarized content.
A number of articles and headlines published on sites linked to the network veered squarely into QAnon conspiracy content, presenting lies about military tribunals and Biden officials being sentenced to death. One site’s homepage highlights a range of anti-vaccine and pro-Trump conspiracy content, while also promoting an Amazon affiliate link to Trump’s Art Of The Deal book.
The Guardian contacted the email address under which The US Military News is registered, but received no response. According to ISD, The US Military News is just one of many sites that appear to be linked to the same Vietnam-based network.
In another ISD report, researcher Elise Thomas uncovered a network of dozens of Facebook groups and pages — which also appear to be linked to a small group of people in Vietnam — that were sharing plagiarized pro-Trump content aimed at conservative social media users. Taking articles from far-right conspiracy sites like The Gateway Pundit, the network created Facebook groups with names like “Conservative Voices” and amassed large followings, sometimes tens of thousands of users.
Although for-profit disinformation networks often monetize their audience by running ads on their websites, the network discovered by ISD appeared to grow its Facebook group members in order to potentially resell the groups themselves.
“It was the initial threat that the platforms were worried about,” Brooking said. “It wasn’t misinformation, you would characterize it as some kind of ad fraud or ad farming.”
The original “fake news”
In many cases, including the ISD case studies, inauthentic Facebook groups and conspiracy sites do not bring in large sums of money. But for many operators based in countries with lower per capita incomes than the United States, earning a few hundred dollars a month churning out conspiratorial content is a significant payoff. One of the most lucrative Vietnam-linked sites that ISD analyzed brought in about $1,800 a month from advertising alone, roughly 10 times the country’s monthly per capita income.
These scams have strong echoes of the surge in commercial misinformation online in 2016. Many of the people behind posts containing false claims such as “Pope Francis endorses Donald Trump” also operated from outside the United States, often from a single small town in North Macedonia called Veles, which was responsible for over 140 copycat news websites.
These early “fake news” websites capitalized on salacious headlines and social media algorithms that promoted posts with high engagement regardless of their content, leading creators to choose controversial political issues involving race, religion and culture to draw the most attention to their sites and social media accounts. While strategies for evading content moderators have evolved, this playbook of monetizing disinformation and conspiracies seems to have remained largely the same.
“This is what the threat of misinformation looked like even before we were talking about state actions,” Brooking said. “It’s interesting that this kind of older threat is now back in the spotlight.”