Misinformation is a Side Hustle
Summary
Misinformation has real-world consequences. We saw that on January 6, 2021. But what if I told you that there is a whole industry of laborers paid to create misinformation as a side hustle? What if I told you that misinformation is created halfway across the world in order to get people to take one side or another in a celebrity trial, or to stop the certification of an election?
Further Reading
- Ong, Jonathan Corpus, and Jason Vincent A. Cabañes. "Architects of Networked Disinformation: Behind the Scenes of Troll Accounts and Fake News Production in the Philippines." ScholarWorks@UMass Amherst 74 (2018): 1-67, https://doi.org/10.7275/2cq4-5396
- Ong, Jonathan Corpus, and Samuel Cabbuag. "Pseudonymous Influencers and Horny 'Alts' in the Philippines: Media Manipulation Beyond 'Fake News.'" The Journal of Socio-Technical Critique 132 (2022), https://scholarworks.umass.edu/communication_faculty_pubs/132
- Ong, Jonathan Corpus, and Jason Vincent Cabañes. "When Disinformation Studies Meets Production Studies: Social Identities and Moral Justifications in the Political Trolling Industry." International Journal of Communication 13 (2019), https://scholarworks.umass.edu/cgi/viewcontent.cgi?article=1110&context=communication_faculty_pubs
- Ressa, Maria. How to Stand Up to a Dictator: The Fight for Our Future. New York, NY: Harper, an imprint of HarperCollins Publishers, 2022.
- Marwick, Alice E., and Rebecca Lewis. "Media Manipulation and Disinformation Online." Data & Society (2017), https://www.posiel.com/wp-content/uploads/2016/08/Media-Manipulation-and-Disinformation-Online-1.pdf
- Select January 6th Committee Final Report and Supporting Materials Collection, https://www.govinfo.gov/collection/january-6th-committee-final-report
- Silverman, Craig, Craig Timberg, Jeff Kao, and Jeremy B. Merrill. "Facebook Hosted Surge of Misinformation and Insurrection Threats in Months Leading Up to Jan. 6 Attack, Records Show." ProPublica and The Washington Post, Jan. 4, 2022, https://www.propublica.org/article/facebook-hosted-surge-of-misinformation-and-insurrection-threats-in-months-leading-up-to-jan-6-attack-records-show
- Hao, Karen. "Troll Farms Reached 140 Million Americans a Month on Facebook Before 2020 Election, Internal Report Shows." MIT Technology Review, Sept. 16, 2021, https://www.technologyreview.com/2021/09/16/1035851/facebook-troll-farms-report-us-2020-election/
- Lanier, Jaron. Ten Arguments for Deleting Your Social Media Accounts Right Now. Random House, 2018.
- Fallorina, Rossine, Jose Mari Hall Lanuza, Juan Gabriel Felix, Ferdinand Sanchez II, Jonathan Corpus Ong, and Nicole Curato. "From Disinformation to Influence Operations: The Evolution of Disinformation in Three Electoral Cycles." Internews, 2023.
Transcript
Dr. Susannah Crockford: An all caps post to a public Facebook group screamed, "Looks like civil war is becoming inevitable!" Continuing, "We cannot allow fraudulent elections to stand! Silent no more majority must rise up now and demand battleground states not to certify fraudulent elections now!" Another post read, "We are Americans. We fought and died to start our country. We are going to fight, fight like hell. We will save her! [heart emoji]. Then we're going to shoot the traitors!" These posts came from before and after the November 2020 US presidential election, which Joe Biden won and Donald Trump lost. They are cited in a ProPublica and Washington Post article that analyzes how Facebook groups questioning Joe Biden's win posted around 650,000 messages attacking the election, averaging 10,000 posts a day between the election and Jan. 6, 2021, when rioters broke into the Capitol Building in Washington, DC, trying to stop the certification of the election.
Media Clip (Footage of 1/6/21 from The Telegraph): ...but this is now effectively a riot. 49 hours declaring it a riot. Five to fifty be advised they are trying to breach and get into the Capitol.
Dr. Crockford: Other posts featured graphics of gallows and nooses, and threats not only of civil war but of violence against specific political figures associated with the election. On Jan. 6, the crowds surrounding the Capitol carried a makeshift gallows and chanted, "Hang Mike Pence!"
Media Clip: [Chanting] Hang Mike Pence!
Dr. Crockford: These social media posts came to life, taking shape in reality, in urgent and violent ways. The social media posts and groups were organized around hashtags and slogans such as 'Stop the Steal' and 'Stop the Count.' First used by Trump advisor Roger Stone in 2016, 'Stop the Steal' was a recurrent slogan used in Trump's political campaigns. The implication being that any election that Trump or his favorite candidates didn't win was stolen. Stone and his associates brought the slogan back out in 2020 to mobilize a national protest movement against Joe Biden's win.
Media Clip (Speaker 1): An internet battle cry-- 'Stop the Steal'-- has swept across inboxes, Facebook pages, and Twitter like an out-of-control-virus. The claims that Democrats stole the 2020 presidential election from Donald Trump are all false, but the truth means little to people inundated with lies.
Media Clip (Speaker 2): I believe that they tried to steal the election.
Media Clip (Speaker 1): 'Stop the Steal' may appear as a grassroots uprising, but it started more than four years ago, the brainchild of a political dirty trick artist and convicted liar who has pushed disinformation schemes for years: Roger Stone.
Media Clip (Speaker 3): 'Stop the Steal' is posting much of this material: there is insurmountable, compelling, overwhelming evidence of fraud.
Media Clip (Speaker 4): 'Stop the Steal' is actually a coordinated effort that has been revived twice by Roger Stone and allied political operatives in an attempt to gaslight the entire integrity of our voting and election process.
Dr. Crockford: Posts using both hashtags-- 'Stop the Steal' and 'Stop the Count'-- increased around Election Day, with 'Stop the Steal' taking over in prominence when Trump failed to halt the vote counts in various states. As Trump and his minions filed claim after bogus claim in the courts to get the results thrown out, the rhetoric increased on social media, in groups, and via hashtags. There were two protests, one on November 14 and another on December 12, named 'Million MAGA Marches,' coordinated through 'Stop the Steal' Facebook groups. These online spaces enabled a rapid mobilization and coordination of political action against the election results, above and beyond the standard grousing and complaining of sore losers. Alongside this activity, Donald Trump tweeted, both prior to and after the election. On November 4th, when vote counting was still underway, he tweeted, "Last night I was leading, often solidly, in many key states, in almost all instances, Democrat run and controlled. Then, one by one, they started to magically disappear as surprise ballot dumps were counted, VERY STRANGE, and the 'pollsters' got it completely and historically wrong." Then on November 5th he simply tweeted, "STOP THE COUNT!" And then, "All of the Biden claimed states will be legally challenged by us for voter fraud and state election fraud, plenty of proof- just check out the media. WE WILL WIN! America first!" Now, as we all know, Trump lost, and all of his lawsuits challenging the result were thrown out, and he now faces indictments by the Department of Justice, among others, for inciting an insurrection. What these tweets did was provide encouragement and incitement to his followers. As he kept posting, so did they. It wasn't just social media spreading disinformation. It was the President of the United States using social media to spread disinformation that challenged his own election loss so that he could stay in power.
The ProPublica and Washington Post investigation calls its estimate of 650,000 messages, 10,000 per day attacking the election, 'almost certainly an undercount.' It didn't include private groups, comments, or posts on individual profiles; only Facebook has access to that sort of data. Their article suggests Facebook de-prioritized its Civic Integrity Task Force, which had responsibility for policing groups that may have shared content that was harmful, albeit not illegal. The task force was shut down after the election, with fewer groups removed by moderators after the election and in the run-up to Jan. 6th. Facebook denies that its priorities shifted. It blames Trump and those working for him for pushing the narrative that the election was stolen; its platform just happened to be one of the forums that was used. The Jan. 6th committee in Congress played audio from a Twitter (now X) employee who, along with others, feared violence on Jan. 6th because of what they were seeing posted on the platform. The employee stated:
Media Clip (Speaker 1): I believe I sent a Slack message to someone that said something along the lines of, 'when people are shooting each other tomorrow, I will try and rest in the knowledge that we tried.' And so I went to-- I don't know that I slept that night, to be honest with you. I was on pins and needles, because again, for months, I had been begging and anticipating and attempting to raise the reality that if nothing, if we made no intervention into what I saw occurring, people were going to die. And on January 5th, I realized no intervention was coming, no there-- and even as hard as I had tried to create one or implement one, there was nothing. And we were at the whims and the mercy of a violent crowd that was locked and loaded.
Media Clip (Speaker 2): And just for the record, this was content that was echoing statements by the former president, but also Proud Boys and other known violent extremist groups?
Media Clip (Speaker 1): Yeah.
Dr. Crockford: They realized that no intervention was coming from the social media platforms. Facebook has also faced criticism for only acknowledging and acting on problems when it's too late and harm and violence have already occurred. The messages and posts attacking the election were allowed to spread, and private messages and group chats were used to coordinate action leading up to the events of Jan. 6th, all with little preventive action or moderation from the social media platforms. Then Jan. 6th happened, and our feeds were filled with images of a violent and angry crowd breaking into the US Capitol Building, calling for the death of leaders such as Mike Pence and Nancy Pelosi, taking things from offices and hallways, and praying in the Senate chamber.
Media Clip: Let's all say a prayer. I thank you, Heavenly Father, for gracing us with this opportunity.
Dr. Crockford: The shock of these events still reverberates through the American political system as legal cases continue against the individual participants and now against Trump himself. Beyond the action of participants, serious questions have been leveled at the social media platforms. What role did this information on social media play in these violent events? Did election disinformation lead to an insurrection?
Dr. Crockford: Welcome to Miss Information, a limited podcast series created by me, Dr. Susannah Crockford, in conjunction with the Institute for Religion, Media and Civic Engagement and Axis Mundi Media. Miss Information was produced by Dr. Bradley Onishi and engineered by Scott Okamoto. Kari Onishi provided production assistance. Miss Information was made possible through generous funding from the Henry Luce Foundation. In Episode Six, we heard about 15-minute cities, an urban planning policy that sounds benign, keeping amenities within a 15-minute walk or bike ride of most neighborhoods, but that has been elaborated into a conspiracy theory alleging a 'climate lockdown,' in which our movements will be restricted under the hoax of climate change. Today, we'll continue the theme of conspiracy theories as a form of disinformation, asking: what is it that conspiracy theories and other lies on social media do, especially when they're used around elections? How significant are disinformation campaigns around elections? How frequent are they? Do people really believe them? Can social media really make people want to overthrow their government? In the 2020 US presidential election, the role of social media was an important enabling factor. It is hard to argue that the claims of election fraud would have spread as far and wide as they did without social media. Platforms also provided a means for coordination and, later, evidence for prosecutions. And it's clear that people working for these platforms knew this mobilization and coordination was going on. On Jan. 5th, the sense that something was going to happen on Jan. 6th was palpable, obvious even to casual observers. However, it's really important to note that it was not just anyone spreading these stories. Trump was. And at the time, he was the president.
He maintained a consistent level of support throughout his candidacy and presidency, and that famously immovable base stayed with him as he ratcheted up the lies and threats post-election. It was his words and sentiments filtering down through his network-- people like Roger Stone, Ali Alexander, and Alex Jones. They were all involved too, spreading posts on social media, and then they were all at the Capitol. So, who's to blame? Election disinformation is a live issue, and not just in the US. Brazil had its own angry populist authoritarian, Jair Bolsonaro, who rose to power on the back of WhatsApp and Facebook campaigns spreading misleading messages through group chats. But it was the Philippines that really pioneered networked disinformation. As in other countries, the 2016 Philippine election swung away from liberal democracy toward authoritarian strongman politics, embodied by Rodrigo Duterte. The parent company of Cambridge Analytica, Strategic Communication Laboratories, helped rebrand Duterte as a strong, combative leader who could be trusted to defeat enemies at home and abroad. Distrust of liberal elites and the media was a pillar of Duterte's social media campaign, establishing himself as the only trustworthy political figure while normalizing angry and vengeful attacks against opponents. Sound familiar? The way social media brought a series of populist authoritarians to power in 2016 raised a question: is it social media that made this happen? Would there be a Trump without Facebook and Twitter? But this, I think, is a simplistic question that invites easy answers. It blames the medium for the message, and while we can't overlook social media's enabling role in rising authoritarianism around the world, there's more going on. People were sold a message that at least some of them wanted to hear, even if it was based on manipulation and falsehoods.
Election disinformation on social media is not brainwashing; it's more like false advertising. You still have to want to buy into the con to go along with it. In 2016, Rodrigo Duterte was the mayor of a provincial city in the south of the Philippines, running for the presidency against the old political families that had run the country for generations. His campaign centered on ending crime and political corruption. He presented himself as a strongman and made crude jokes and offensive statements. A focus of his campaign was ending drug-related crime. On the last day of his campaign, he said, "Forget the laws on human rights, you drug pushers, hold-up men, and 'do-nothings.' You better get out, because I'd kill you." The explicit threat to end crime with more violence resonated with enough voters: Duterte won. And the first killing in the drug war occurred within hours, a short distance from his inauguration. Daily reports of killings increased, dead bodies dumped in the streets of poor neighborhoods, some with cardboard signs labeling them as drug dealers, as warnings to eyewitnesses.
Media Clip (Rodrigo Duterte): My campaign against drugs will not stop until the last pusher and the last drug lord are [sound of killing someone].
Media Clip (News Reporter): When the Philippines President made that promise, arguing his nation could become a narco state if drug use isn't stopped, masked men-- sometimes police, sometimes just thugs-- started killing with abandon. Murderers who suffocated their victims by wrapping their heads in tape, often writing 'pusher' or 'addict' on signs left with the bodies. When corrupt cops were exposed for some of these killings, the offensive was put on hold, but it's back. Official orders to police are to try to just arrest suspects, but the blood still flows.
Dr. Crockford: As the violence increased, so did the misleading content on social media. In September 2016, Duterte declared a nationwide state of lawlessness after a bombing, which he blamed on the illegal drug trade. A five-month-old article about a bombing went viral to justify the policy among Duterte supporters. The article did not refer to anything that had happened immediately prior to the declaration. Disinformation was being used to justify the drug war. This story is told by Nobel Peace Prize-winning journalist Maria Ressa in her book How to Stand Up to a Dictator. Ressa describes how she founded Rappler, an online news site, in the early 2010s. It was an article from her site that went viral, and she was able to see the analytics firsthand, showing that the five-month-old article spread through being taken out of context and reposted by a network of accounts supporting the president. Ressa describes how she was able to follow the emergence of disinformation in real time. Prior to the election, Ressa began to see fan accounts working together in a coordinated way to make their chosen TV shows or actors go viral. They worked out how to 'crack the code,' in Ressa's words, of social media. Accounts organized to artificially make hashtags trend higher, sometimes hijacking whatever else was trending. Groups of young fans realized all they had to do was organize together to tweet 7,000 times per minute to make a hashtag trend, something made easier by using automated accounts. This tactic was then adopted by corporate marketers to spread their ads, sometimes paying the same young people, who by now had large followings on social media. By the time the election campaigns came along, those corporate marketing firms were able to sell their digital services to political campaigns; the accounts could then switch to posting about politics and spread messages to millions across social media.
This was the emergence of what platforms call 'coordinated inauthentic behavior' on social media. Large accounts with thousands or hundreds of thousands of followers get paid to post sponsored content. But sometimes that content isn't labeled as an ad, even though influencers are supposed to tag their paid content. And the same accounts that post about products can also post about politicians, elections, or simply about grievances in the country, because they are paid to do so by a political campaign. Digital campaigns were pivotal in many of the 2016 elections.
Maria Ressa: The drug war in the Philippines began in July 2016 and at that point in time, there was an average of eight dead bodies a night. You knew there was something bad going on. My name is Maria Ressa. I've been a journalist for more than 30 years. 20 of those years with CNN in Southeast Asia. I ran the Manila Bureau, the Jakarta Bureau, and in 2012 began Rappler. This little startup that's like now the target-- earned the ire of President Duterte. Not coincidentally, that was when social media-- Facebook was weaponized. The first people who were attacked were journalists, anyone who questioned the drug war. I watched democracy crumble in about six months in the Philippines, our dystopian present is your dystopian future.
Dr. Crockford: Social media was especially powerful in the Philippines, a country where, according to Maria Ressa, Facebook is the internet. Because of the way telecommunications works in the Philippines, people are able to access the internet via Facebook for free. Filipinos spent 1.7 times more time on Facebook and Instagram than watching TV, and had 60% more Facebook friends than the global average. What political campaigns on Facebook were able to do was build a sense of what was happening in the country, which, according to Ressa, was largely based on lies. Lies, repeated over and over as truth, became facts in the online ecosystem. And Facebook did very little to stem the tide of misleading and false information. Comment threads got taken over by supporters of the President, attacking anyone who seemed to oppose his policies. An 'us versus them' narrative emerged.
Media Clip (Rodrigo Duterte): My campaign against drugs will not stop until the end of my term, that will be six years from now, until the last pusher and the last drug lord are [killing sound].
Dr. Crockford: People increasingly chose to be silent rather than to challenge and face online harassment. Meanwhile, an estimated 7,742 people were killed in anti-drug operations during Duterte's presidency. Killing drug users and dealers was largely accepted by Filipinos, according to Ressa, in part because of the information operation online to produce consensus for it. At the head was a political leader who was directing drug operations against his own people.
Media Clip (Speaker 1): You think you can fix these?
Media Clip (Rodrigo Duterte): Yeah, when I said I'll stop criminals, I'll stop criminals. And if I have to kill you, I will kill you, personally.
Dr. Crockford: Networked disinformation of the kind seen in the Philippines has been a problem for Facebook for nearly a decade now. In episode one we heard about the problems on Facebook that led to the genocide of the Rohingya in Myanmar. But it's not just Facebook. Social media platforms don't want to censor. They want to allow as much content to flow unimpeded as possible, because that's how their profits are generated. More engagement means more ads means more dollars for their shareholders. They are also limited by national laws, and this presents a problem. What if the law is determined by the person spreading the disinformation? To find out more, I spoke to Jonathan Corpus Ong, a disinformation researcher from University of Massachusetts Amherst.
Jonathan Corpus Ong: I'm Jonathan Corpus Ong. I'm Associate Professor of Communication at UMass Amherst. I was previously a Fellow at the Shorenstein Center at Harvard, doing misinformation research with Joan Donovan. And I'm a disinformation researcher and podcaster like you. The Philippines, we serve the world in terms of being the digital janitors of the internet. That's what Wired magazine calls us, at least. And so that was a starting point of my research. I had funding to try to interview precarious laborers in the Philippines who do content moderation, but I couldn't get a hold of them. Obviously, they have signed non-disclosure agreements, and it's hard to gain access. At the same time that this was happening, I was thinking, maybe I should just pivot the work to talk about the other side of the coin, which is the trolls, the online workers who are responsible for misinformation, for digital campaigns that deliberately push election propaganda and conspiracy theories to try and gain political currency and votes. And so that's my research, which started in the Philippines, with the 2016 Philippine elections. It analyzes disinformation by centering the actual workers behind the scenes: how they do the things that they do, and also why. So that's my entry point here, really understanding the political economy of disinformation.
Dr. Crockford: Yeah. So what really drew me to your work? Well, I think I read your networked disinformation report first of all, and I was really struck by how you describe, as you say, the work of creating disinformation. That's something that a lot of people may not realize. It's a hidden form of labor behind all of the constant lies and conspiracy theories that we read on the internet. And perhaps people don't realize that it is actually being intentionally produced. So can you just give an example of a campaign, and how the different people work within that campaign to create what we then read as content?
Jonathan: Absolutely. So in a way, my work is not centrally in mainstream political communication, and also not in mainstream journalism studies. A lot of the work that I had noticed in the field was quite tech-centered, about making critical evaluations of whether something is misleading or false-- outrightly, deliberately, intentionally false-- or mapping out the communication networks online, just on Twitter, right? But I needed to understand the people behind disinformation campaigns. What are their formal job titles? Did they aspire to do this kind of work? Is this something aspirational for them? Is this an add-on job for most of them? And, for me, to situate it in the Philippines: this is a country that has embraced the digital and creative industries as a sunshine industry-- an area of growth for our economy. So as a developing country, we're seeking investments, and, you know, digital industries are seen as an area for growth, right? And so you have a lot of upstart digital firms and entrepreneurs who sell various kinds of services to different clients-- corporate clients, but also political clients. And I find that story fascinating to tell: the origin story of many of the most popular forms of political disinformation campaigns. They actually start from digital marketing and advertising agencies. So when I started my research, I began with interviewing the campaign managers of politicians. And I started back in 2016, so some of the main campaign managers of politicians specialized in TV, radio, and grassroots organizing on the campaign trail, the physical campaign trail. And they would be like, 'Oh, the digital? Somebody else is doing that,' in a kind of low-status judgment of who those workers are. And then I ask, 'Oh, can you introduce me to them?' And then I start talking to the digital marketers.
And they describe a very wild west kind of industry, in which they're able to hire different kinds of digital support workers to run the social media pages of politicians, but also to grow community pages dedicated to politicians. They ask their workers to open up fake accounts. Each worker is incentivized to open at least six fake accounts during an election season, and each account takes on a specific identity. It's very gendered. There's a 'bikini troll' assigned to appeal to the men and friend them on Facebook. There's also the smart-sounding fake account, which takes on a male persona-- someone with glasses-- and that takes on a specific tone. And then there's also a 'bitchy' account, and I was surprised to find out that it's also very gendered-- this is a gay, queer-coded account that's supposed to attack and be snarky about the other political candidates. So 'Architects of Networked Disinformation,' my report based on the 2016 elections, mapped out the diverse kinds of workers: the top-level strategists, influencers, but also low-level fake account operators who do this job on the side from their day jobs, really.
Dr. Crockford: When looking at election disinformation, the Philippines matters because it shows us how the networks that create and spread disinformation emerged. Disinformation is not just ideologically driven; it is also a form of labor, an extension of corporate marketing campaigns. Trolls are also workers. They win votes for candidates through digital campaigns; large accounts can switch from corporate to political posting during election campaigns, and then, after the campaign, return to non-political posting. Some accounts are paid to post disinformation, whereas others post strategically: influencers who earn revenue from YouTube, Facebook, and X will jump on issues that drive engagement, chasing clicks and eyeballs through engagement hacking. When actors Amber Heard and Johnny Depp went to court in 2022 over a defamation claim related to domestic violence allegations, their legal battle was accompanied by an explosion of online content, a lot of it supporting Depp and denigrating Heard.
Media Clip: Welcome to WatchMojo. And today we're counting down our picks for the top 10 moments from the Johnny Depp and Amber Heard trial that broke the internet. [I frequently bring muffins to the office.] For this list, we're looking at highlights from the ongoing Depp v. Heard trial that have sent social media into a frenzy. Which moment from the trial made your Tik Tok account explode? Let us know in the comments.
Dr. Crockford: While some of this content was organic, some was posted by established YouTube accounts switching to posting about the case because they knew videos about it would go viral. Monetization of user content means that some influencers will post just to increase engagement. For some, it means they make part of their income from spreading disinformation and extremist content. There is often a focus on ideological reasons for disinformation, but it's also a matter of political economy. Posting isn't just for the lulz; it's also a form of labor, a way to make a living. The same logics operating behind PR campaigns work behind political campaigns, and there is little or no oversight. Political speech is protected speech. Commercials for products are regulated, but political speech is not, leaving it even less regulated. On Facebook, as long as it is from a political campaign, ads can contain untrue statements, because the platform doesn't want to censor political speech. And this isn't just a problem for the Philippines. It's happening in other places too. Similar digital campaigns took place in the Brexit referendum in the UK, the election of Narendra Modi in India, and the election of Jair Bolsonaro in Brazil, as well as the well-known case of Russian influence campaigns on US elections. So what did Duterte do differently with social media, and why was it so successful?
Jonathan: What did Duterte do that was different, especially on social media? So first and foremost, Duterte invested a lot in social media because it was cheaper to launch a digital campaign in comparison to a TV, radio, you know, mainstream media campaign. So it's easier to mobilize a lot of stuff online. But what I saw on social media was that his bluster, his angry rhetoric, was easily modeled by his fans online, and I think they took forward a lot of his very violent rhetoric online in ways that maybe he didn't even anticipate to begin with. We didn't think that social media conversations could be this toxic in the Philippines. If you have met Filipinos, we are known to be quite non-confrontational, quite indirect in our communication. Cultural values of reciprocity and politeness are clear cultural expectations. But here was a political figure with the status of Duterte, with a rabid fandom, really weaponizing an anti-establishment belief system among people who had felt unseen by the political establishment for so long. So he weaponized their resentment, their anxiety, in similar ways that Trump did in the United States. And yeah, it unleashed a wildly violent rhetoric on social media that left people shocked and surprised. There are elements which are manipulated, there are elements which are directed, but a lot of it is also very organic, in that his political fans were modeling his behavior too.
Dr. Crockford: What's driving the social media campaigns of authoritarian leaders is a weaponization of resentment. People model the behavior of political leaders. They are not being tricked or controlled into acting like their leaders. They see their leaders doing it, and they feel empowered to do the same. They want to behave in the same way.
Media Clip (Trump): When Mexico sends its people, they're not sending their best. They're bringing drugs. They're bringing crime. They're rapists, and some, I assume, are good people.
Dr. Crockford: People saw Trump doing and saying things that were shocking, but that reflected things they thought and felt but were afraid to express. That's why you could hear so many people saying that a good thing about Trump was that he says what he thinks. Why is that a good thing? If everything he thinks is offensive and vengeful, it's only a good thing if you agree with him. So how big a problem is election disinformation really? How much is manipulated, and how much organic? Posting content and boosting its reach is a bit like letting an angry bull loose in this wild west space. You can release it online, but you can't determine where it goes or what it does. Disinformation campaigns can be hit or miss, and once begun, they are difficult to control. The ones that get traction strike a chord with a certain audience. It may be easier to blame Russia or blame Trump or blame Facebook than to face the fact that people are willingly following these angry, divisive, strong-man politicians and mirroring their online behavior, then normalizing that behavior in real life, because that's what they want. That's what they think is right. What they think is righteous. It's how they will save America or the Philippines or India or Brazil from all of their enemies.
Media Clip (Trump): I will build a great, great wall on our southern border, and I will have Mexico pay for that wall. Mark my words.
Dr. Crockford: Enemies are scapegoats. In the Philippines, it's drug users. In the US, it's illegal migrants or the Democrats, or the deep state. Political opponents are blamed and claimed to be in cahoots with these groups that have been constructed as enemies. Then it becomes an us versus them battle, a fight for the soul of the country, an existential combat.
Media Clip (Trump): Because you'll never take back our country with weakness. You have to show strength and you have to be strong.
Dr. Crockford: Anti-establishment, anti-mainstream media, anti-science, elements of racism and xenophobia all feature in the narratives of populist strong-men. Those elements may be false or manipulated, but the themes resonate with some people. They agree.
Maria Ressa: Well, the key part is that all of us are vulnerable to it, right? And the first part is, actually, the social media companies. Our first contact with AI, which has designed the very connections between us to spread lies six times faster than facts. That's an MIT study, right? So, that's the first step. The incentive structure is turned upside down. But you know, listening to you introduce this and how fragile democracy is, it made me remember again. We called it in Rappler 'death by 1000 cuts,' because it's the way truth is eroded. It's the way democracy is eroded. If lies spread six times faster... those three sentences I've said over and over since 2016, when the weapon was turned against us: if you don't have facts, you can't have truth. Without truth, you can't have trust. Without these three, we have no shared reality, and that's where it begins, right? Like, if people don't believe you, if they don't believe what I say, how can you reach them?
Jonathan: We often get asked about the platform question and about our vision for platform accountability, especially when it comes to countries like the Philippines as part of the global south, right? Like, what's the responsibility of big tech companies? And you know, writing Architects of Networked Disinformation as a nuanced account of the more local industry dynamics actually being the root cause of the most common kinds of disinformation campaigns, in a way, brackets out a little bit the question of platform responsibility for a moment. And let me tell you why. So there's a lot of tech activism and platform activism right now that reframes a lot of the problems, especially in countries like the Philippines and other global south countries, as social media being at fault here for having, in some ways, opened up this wild landscape, really, making the public sphere more toxic over time. You see some tech activists even use the term brainwashing, which I'm very uncomfortable with: that somehow social media have brainwashed uneducated citizens to vote for populist presidents who are somehow so skilled on social media that they have unlocked something alchemical, magical on social media that other politicians have no access to. Or that Cambridge Analytica had helped campaigns in many countries around the world to unlock something weak in populations around the world. I find those explanations quite unsatisfying too. And that's why I felt that I needed to ground this research in the actual production dynamics of how campaigns are really organized and what the organizers think they achieve. So I think platforms need to be doing more: to be more transparent, to be more accountable, to allow researchers to audit the kinds of regulation and governance that they do in terms of what content should be included, and what policies are most helpful for identifying influencers who are always, you know, the usual culprits of disinformation.
Anyway, I think they need, you know, more collaborations with civil society and researchers to audit their mechanisms. I also push back on platform determinism, the explanatory frame that platforms have somehow been the key cause of all of these wrong things that we were seeing online and of trends of democratic backsliding or voting for populist politicians. I do think that they accentuate and accelerate certain trends, but they do not originate them.
Dr. Crockford: So, what should we do? Should we delete our social media accounts? Some, like technologist Jaron Lanier, say 'yes.'
Media Clip: [Interviewer: Is there a principal reason why I should delete my social media? And if so, what is it?] There are two. One of them is for your own good, and the other is for society's good. For your own good, it's because you're being subtly manipulated by algorithms that are watching everything you do constantly and then sending you changes in your media feed, in your diet that are calculated to adjust you slightly to the liking of some unseen advertiser. And so if you get off that, you can have a chance to experience a clearer view of yourself and your life. But then the reason for society might be even more important. Society has been gradually darkened by this scheme in which everyone is under surveillance all the time, and everyone is under this mild version of behavior modification all the time. It's made people jittery and cranky. It's made teens especially depressed, which can be quite severe, but it's made our politics kind of unreal and strange where we're not sure if elections are real anymore. We're not sure how much the Russians affected Brexit. We do know that it was a crankier affair than it might have been otherwise...
Dr. Crockford: While it may be a subtle form of behavior modification, as Lanier claims, it remains deeply problematic to say it's simply social media's fault. This form of platform determinism denies human agency. People can still choose what content to pay attention to and whether to use social media at all. It is possible to simply not do social media. How disinformation affects elections is difficult to establish with any certainty. While political ads and comments and posts on social media are an influence, there are many other influences on people's voting behavior. And let's not be confused: the type of behavior modification that technologists like Lanier claim social media achieves is based on an old psychological theory called behaviorism, which argues that behavior is conditioned by environmental stimuli. Think of Pavlov and the dogs that salivated when he rang the bell before dinner. Disinformation does leverage tactics that have been used in psychological warfare for a long time, but these psychological tactics are not as effective as often claimed. Repeating a lie doesn't make it the truth. What convinces people more often is power rather than evidence. The more power someone has, the more they are able to impose their version of truth on others, but power can still be subverted. Dictators can be fought. Social media can simply be turned off, or, better yet, regulated by more effective policies regarding inauthentic content, hate speech, and harmful but legal speech. Social media is not brainwashing. People are being appealed to the same way corporations use ads to make appeals and raise awareness so we buy their products. Donald Trump didn't brainwash millions of Americans with Twitter. There are millions of Americans who support Donald Trump. This may be difficult to accept on some level. Trump and other authoritarian strong-men like him are so different from preconceived ideas of liberal democracy.
For some, the easier argument is that something other than the voters themselves must be to blame. So blame social media and call us all victims of mass manipulation. But that doesn't mean social media is blameless. Tech companies pay people in the Philippines extremely low wages as moderators who scrub their platforms of gore and beheadings; moderators have to watch the very worst content with little support for their mental health. There are unjust labor practices around the world that tech companies, like many other multinational corporations, leverage to help their bottom line. They leave hate speech on platforms that could be removed, speech that often harms marginalized communities; if it's not a big enough issue to get the company bad press, they do little about it. Still, social media is not the sole culprit in a mass manipulation experiment. That argument can be disempowering and alienating. If you tell people they are uneducated and brainwashed to explain why they feel a certain way, how would they respond? How would you feel if someone said that to you? It can be counterproductive to tell people who are skeptical about mainstream media that they are brainwashed by digital media, implying that otherwise they would, of course, agree with mainstream media, academia, politicians and activists. So why is disinformation on social media so powerful?
Jonathan: Some of the most powerful examples of disinformation campaigns from my own research have been strategists who seed and insert an insidious claim on social media and see how other people take it up and, you know, riff on it and see it grow. And it's easier to seed this kind of information on social media than in mainstream media. So they see that social media is a very fertile ground for this kind of participatory dynamic, for people to riff off of these kinds of insidious conspiratorial claims, and see how far it could go, right? And then at some point, once it starts to gain traction, once it starts to gain fans and followers and some people have, like, real enthusiasm behind this kind of nugget of conspiratorial information, once it starts to trend, then mainstream media picks up on it. So for me, the most powerful examples of disinformation campaigns are those that are able to achieve an element of attention hacking, in the words of Alice Marwick and Rebecca Lewis, right? Like when something is seeded on social media, and later on, it becomes part of mainstream conversation as well, because mainstream media got hoodwinked too, and they picked up on that information that was viral. They have a predilection to report on viral content on social media, and they become complicit in spreading that conspiratorial belief. In the Philippine context, one of those examples insinuated that a former president was an absentee president. A hashtag was seeded on Twitter that we traced back to a meme account, a meme page specializing in inspirational quotes with a large number of followers. This account was actually the originator of a hashtag, which I will translate to English as #whereisthepresident? So this account seeded this as, like, hey, our president is always absent when there's a crisis, and then it trended.
Every time there was a big crisis and the President hadn't issued a statement, this hashtag trended again. And that stuck in the mind of the Filipino public: 'Oh, these elite leaders, liberal presidents, are unreliable during a national crisis.' But someone like Rodrigo Duterte, a populist, authoritarian strong-man, knows what to do during a crisis situation, because he's hard on crime, he's strong on security, he's able to galvanize the forces, whatever the cost, right? As opposed to someone who's wishy-washy, who doesn't get their hands dirty. So that's a concrete example for you, Susannah.
Dr. Crockford: From inspirational quote meme page to hashtag hacking the presidential campaign. The hashtag 'Where is the President' is like the hashtag 'Stop the Steal': using a hashtag to launch a political movement, seeing what took off, what got popular, what people responded to, what got shared, reframed, retweeted, remade into their own content, and, from that, went viral. That's how accounts do the work of spreading disinformation on social media. So election disinformation is a serious issue, and social media platforms should do more to regulate political speech. But it's not always as straightforward as simply changing the law or tightening regulations. Who is making the law? Who determines what is acceptable political speech? Allowing most forms of speech has led to a flooded zone, but circumscribing certain types of speech means setting up someone or some group of people as the arbiter of what is and is not acceptable for everyone else. Trust and authority are slippery concepts in this contemporary moment of intersecting political crises. We live in a complex information ecosystem that is hard to navigate and constantly changing. You can't just say you're an expert and expect people to listen to you and believe you. We need to rebuild trust with each other. We need to think carefully about how to make ordinary people part of the solution and make the public sphere more inclusive as well as healthier and more truthful. Social media didn't cause the Jan. 6th insurrection. Enough people wanted to reject the outcome of the election because it wasn't in favor of Donald Trump. Disinformation campaigns on social media lit that tinderbox, but they didn't pack it with explosives. That's important to remember as we move into yet another contentious and divisive election season in 2024.
Media Clip (Trump): I make you this promise as your president, and nobody else can say it, I will restore peace through strength. And yes, I am the only one that will prevent world war three, because we are very close to world war three.
Dr. Crockford: Whenever I hear Donald Trump speak, I find myself filled with fear and anxiety. As in this clip, he often says things that are purposefully designed to evoke strong, negative reactions. When I hear this, I think, 'Wow, Donald Trump sounds terrifying.' But when other people hear this, they think, 'Wow, Donald Trump is speaking the truth, and he's also offering a solution: himself.' He can 'save' us, and there are valid reasons to believe the things he says. Escalating attacks between Iran and Israel, the genocide in Gaza, the Russian invasion of Ukraine: these regional conflicts could very easily spark a wider conflagration. It really could be world war three. And then there are the systemic problems we've highlighted throughout this series. Wages are stagnant. Rent is too high, inflation is too high. We're not supposed to use our cars anymore because of climate change, but public transport is too expensive. The grinding problems of everyday life create a sense of ongoing malaise and decline, and authoritarian populists like Donald Trump give voice to this. They validate these feelings and provide hope. They alone can fix it. But this is a line, right? A hook to reel us in. It's the same maneuver salesmen and con men have been doing for ages: identify a person's problem, then sell them a solution. While many will see through this act, some people will fall for it, but not because they are weak. Vulnerability isn't a personal failing; it's a systemic position produced through power. People are made vulnerable through social, political and economic conditions. Too many explanations of disinformation narrowly focus on individual psychology: why are these people believing what seem like obvious lies? I think back to what Maria Ressa said in the clip earlier in this episode. Without truth, we cannot have trust. And without trust, we have no shared reality. But whose truth were we sharing to begin with? What is the basis of our shared reality?
Over the course of these seven episodes, we've discussed varied examples that show how difficult it is to pinpoint this sense of shared reality. What may seem like a sensible urban planning solution to some will be a punitive restriction of freedom to others. Some find wellness practices beneficial. Others see them as an expensive waste of time. One person's con-man is another person's strong-man. But I don't want to leave you with such an equivocal answer. Our shared reality has always been subjective. We experience reality separately, in our particular consciousnesses, but we live together, and living together means accepting that we live in a society. So we need shared norms, regulations and limits, and there needs to be accountability to maintain those guardrails. Accountability is difficult to implement without a basic acceptance of evidence: that the votes, once counted, determine which party won. And this is the problem of misinformation and disinformation. They are tools used to seize or maintain power. This is what the focus on individual psychology or the failings of social media platforms misses or minimizes. When we are talking about misinformation and disinformation, we are talking about using lies to take power, to subvert norms and limits, to avoid accountability. We cannot live with each other if that continues for long. But by understanding what misinformation is and why it matters, we can better understand how we think about the bodies we live in, the cities we inhabit, the communities we build, and the future we want. Hopefully, now you understand the mechanics of how misinformation works and spreads, so that you can better detect it when it comes across your timelines and we can all live together better. Thank you for listening. I've been Susannah Crockford. And remember: misinformation matters.