Welcome to the Disinfo Industry

Are you contributing to a disinformation campaign without even knowing it? This week, we talk to Kate Starbird about unwittingly contributing to the spread of mis- and disinformation, Jonathan Corpus Ong details the Filipino disinformation industry built on gig workers, and we hear from Young Mie Kim about Russia's intervention in Western politics.

Guests: 

Kate Starbird, Jonathan Corpus Ong, Young Mie Kim

Subscribe directly on Apple Podcasts or Spotify, or via our RSS feed. If you're enjoying the show, please be sure to leave a rating on your podcast provider of choice.

In this episode, Jonathan mentions his research "Architects of Networked Disinformation" and Young Mie Kim mentions "The Stealth Media? Groups and Targets Behind Divisive Issue Campaigns on Facebook." Jonathan also has his own podcast about the current elections in the Philippines.

Transcript

Jonathan Corpus Ong:

There's a lot of people who are complicit in disinformation production. It's not just a product of Duterte and his government. It's not just a product of three influencers who are queens of fake news. There's whole industries who are profiting off of this.

Introduction:

You’re listening to Viral Networks, a look at the current state of mis- and disinformation online, with the scholars studying it from the front lines. We’re your hosts, Emily Boardman Ndulue and Fernando Bermejo.

Emily Boardman Ndulue:

You just heard from Jonathan Corpus Ong, who will be joining us today to tell us all about the alliance forged by advertising firms and contract laborers to produce viral disinformation in the Philippines.

Fernando Bermejo:

Today we’re asking who produces disinformation, and why do they do it? We’ll be talking about the “A” in Camille Francois’s disinformation ABC: manipulative actors. And we won’t just look at how these actors manipulate information, but how they manipulate other people to share and even create this content themselves.

Emily Boardman Ndulue:

That’s exactly right. Kate Starbird will tell us all about how literally anybody could be swept up in a disinformation campaign. After that, we’ll hear from Jonathan Ong on how we might understand the Filipino disinformation industry as not just a political issue, but a labor issue. And finally, we’ll talk to Young Mie Kim about how political actors use online advertising to deliver coordinated disinformation campaigns.

Fernando Bermejo:

First up, Kate Starbird.

Kate Starbird:

I'm Kate Starbird. I'm an associate professor at the University of Washington in the Department of Human Centered Design and Engineering and I'm a co-founder of the UW's Center for an Informed Public.

Emily Boardman Ndulue:

Kate studies how disinformation often flows in the aftermath of crisis events, such as natural disasters or mass shootings. Understanding the tactics employed by manipulative actors is a key part of this work.

Emily Boardman Ndulue:

What do we know about the production of disinformation?

Kate Starbird:

That's such a broad question because there are so many... Just like all information has been democratized, right? So is disinformation. The tools of disinformation are now available to anybody who wants to use them. It doesn't take much to pick up some social media accounts, maybe make some friends or maybe create some bots, and then just start trying to spread information that could be used to deceive people for some purpose.

Kate Starbird:

I think there are many different kinds of production of disinformation. I've been really focused recently on participatory disinformation where people who don't understand that they're part of a disinformation campaign begin to both amplify existing disinformation and even produce disinformation themselves that aligns with some political goal or some other ideology of theirs. So we've been really thinking about that element, but there's also very coordinated disinformation where one organization will create some mechanism via maybe automated accounts or by hiring people and paying them to mass-produce information that's meant to deceive for either a political objective or a financial objective or something else.

Fernando Bermejo:

When we asked Kate to clarify if there are specific types of people who are more likely to unwittingly participate in spreading disinformation, she made it clear that it is not limited to less savvy Internet users or people with certain political beliefs.

Kate Starbird:

I would say that all of us are vulnerable to becoming part of a disinformation campaign. Certainly in the 2016 data that we had, we were looking at Black Lives Matter discourse in the United States. We were looking at pro-Black Lives Matter and anti-Black Lives Matter content. The Russian disinformation campaign from the Internet Research Agency infiltrated both sides, and they were highly retweeted by political elites on the right and they were highly retweeted by folks on the left. I do believe that I either followed — well, I know that I either followed or retweeted one of their accounts in that context. Jack at Twitter, the CEO, retweeted one of the IRA accounts. So we're all vulnerable to disinformation.

I think there are asymmetries in how much disinformation has infiltrated, especially the political right in the United States, but that's happening in other places as well. Certainly, I think some people are more vulnerable than others and more likely to encounter it. But I think it's important to understand we're all vulnerable, but also that right now, there's an asymmetric use of disinformation to support the rise of authoritarian and right-wing populist movements around the world.

Emily Boardman Ndulue:

The way Kate described participatory disinformation to us made a lot of sense. Everyone online is vulnerable to it because the people pushing disinformation are trying to cast a wide net. But we felt like this all raised the question: did these campaigns work because they were highly coordinated, or because the people leading them just did such a good job at creating content that could spread naturally?

Kate Starbird:

I think there's different kinds of disinformation or perhaps even different states of a disinformation campaign. There definitely is that coordinated stuff. We saw that with the Internet Research Agency in Russia. We've seen that in other countries, in other contexts.

In the US, we've seen right-wing organizations pay trolls to spread content as well. But this is something that happens. So we've got that orchestrated or coordinated content. But then there's also the stuff that's just cultivated where some political actors may be inserting some things or amplifying things that they like or helping those things rise to the top, but they're not necessarily — they're not creating all the content. In many cases, people who are unwitting agents are creating the content themselves. Then we get to these purely organic things where the disinformation agents are either gone or they're just kind of watching. This content can be organically created by folks who have begun to believe that the meta-narratives and these ways of seeing the world are the ways that they now see the world and they can generate that content.

Fernando Bermejo:

Kate made sure to point out, though, that not all people producing and promoting disinformation are motivated by politics. Many are motivated by profit too.

Kate Starbird:

So many of the folks spreading mis- and disinformation are motivated by financial goals. We've seen that across so many different contexts, but I remember my first study of disinformation, I had just realized that that's what we were studying. We were looking at these conspiracy theories during crisis events and just began to realize that they had this element of disinformation. So many of the websites were spreading these claims that these crisis events or these shooting events weren't happening, that they were all being staged by crisis actors. I'm sure you've heard these claims. But we had been seeing this for several years and so I went to look at all the websites that were pushing out those false claims. So many of them had advertisements for nutritional supplements. I can't tell you how many of them. I was doing hand-coding, qualitative coding. I had a whole category for “Is it selling us nutritional supplements?” Yes.

So there's a whole natural news, these kinds of alternative health sites that become mass propagators of conspiracy theories and false information in part to drive people to their websites so they can sell them things. Often that, when it intersects with some of these pseudoscience narratives, which the COVID misinformation often does, that leads to these alternative health products and that becomes a whole industry that drives these websites. So certainly many of the folks in this space, they can get low quality content that's often no one's going to care if they share it because it's got... someone created it for a political motive or something else. So they push out this low quality content, it's easy for them to generate or copy and paste, and then they pull people into their website to see their advertisements and to sell them products. So certainly financial motives are pushing a lot of that disinformation ecosystem.

Then others like the reputational stuff. We see folks, they want to be influencers. To become an influencer, you know what? Spreading sensational conspiracy theories is one way to get a lot of attention. Being first on the virtual scene of a crisis event and then creating the political frames for that event that your side finds valuable, that gives people a lot of attention and we can see that being a strategy of many influencers, many social media all-stars or these new social media influencers, that's a strategy they use to grow their audiences. At first, it does seem like just the reputation alone is motivation enough. “Oh, look, I'm getting more followers. I'm getting more attention.” Eventually those folks leverage that for political motives or financial motives or both.

Emily Boardman Ndulue:

At this point in the interview, Kate had given us a pretty good idea of who produces disinformation and why, but we were still curious about her concept of participatory disinformation. We wanted to know exactly how it works, and Kate offered a salient example from the 2020 election.

Kate Starbird:

So one of the examples we have is something we called “SharpieGate,” where people went to vote and they were given Sharpie pens to fill out their votes in-person on Election Day and the Sharpie pens bled through the ballots. The ballots were actually designed that way and it was not going to affect whether or not they were counted, but it made people concerned. After they handed in their ballot, they went home and they convinced themselves online that their ballots hadn't been counted. Then they went to this tool that gave them the status of their ballot and it said “canceled.” Except the tool said the status of their mail-in ballot, which was canceled when they voted in-person. But they didn't see that. You can see it if you look. It was right there in front of them. They didn't notice.

But they misinterpreted the evidence. They misinterpreted their own experiences through this frame. They were expecting voter fraud and then they became convinced that they had been disenfranchised and that they had been part of this voter fraud to disenfranchise specifically Trump voters because that's what they were expecting, that's who they were, and that's how they saw the world. So we can see that becomes... And it becomes promoted. It becomes part of the “Stop the Steal” narrative. It's drawn from SharpieGate. It's one of many narratives that fed into it. It ends up motivating the violence on January 6th. So we can see that participatory and almost organic nature of disinformation happening in that case. It's not purely organic because the elites amplified. They found that evidence, they amplified, and they organized around it. So it's this intersection between these participatory audiences, these unwitting crowds, and these political elites that understand the keys to their power is in working with these crowds to help perpetuate their narratives.

So if it's so easy to convince ourselves that we can't trust the results of a democratic election, I don't know what the future holds for democracy.

Emily Boardman Ndulue:

I've been thinking on the call about how you have such a unique long view of this problem, working deeply in this for over 10 years now. It does sound like even just in the example that you gave between the two elections that we're in a much worse place now with disinformation than in 2016. I'm just trying to understand how the problem has amplified over time and what you would say about that.

Kate Starbird:

I do think that we're in a little bit better position right now just because of the collective awareness, but I'm not convinced that's going to be enough because I think the challenge is really steep on how the ways that we inform ourselves about the world have been and are being profoundly changed. That is going to reconfigure how people think about each other, how we think about our governments, how we think about the collective challenges we face as a globe. All of these things are experiencing rapid change.

Emily Boardman Ndulue:

I really like Kate’s last point, that there are a lot of collective challenges we face around the globe — and disinformation is just one of them. As we’ll hear from our next guest, those questions about how we understand ourselves and each other, how we think about our governments — those are relevant to disinformation outside of the US context too.

Fernando Bermejo:

Jonathan Corpus Ong joins us next to tell us about his work studying disinformation in his home country, the Philippines. As digital work proliferates there, so does a cottage industry of advertising firms and contract workers manipulating conversations on social media.

Jonathan Corpus Ong:

I'm Jonathan Corpus Ong. I'm an associate professor of communication at UMass Amherst. And I'm also a research fellow at Harvard Kennedy School and the Shorenstein Center's TASC project.

The Philippines is a social media capital of the world. The Philippines has been ranked in global surveys as having the most active users on social media. I believe up to five hours per day is the average time the average Filipino citizen spends on Facebook and other social media.

So the intensity of activity in social media is very much there. And therefore our digital industries, our creative industries are very well-versed in campaigning and in the different kinds of digital operations, from search engine optimization to website development to data analytics, and they're able to offer their services to regional clients.

Fernando Bermejo:

Jonathan told us that his work on these topics began in 2015 or 2016, studying the condition of digital labor in his home country, the Philippines. He was motivated largely by the working conditions of the laborers that American social media companies hired to moderate content on their platforms. Often, these Filipino laborers were left to deal with the most disturbing content, getting paid significantly less than what these American companies would have to pay American workers.

Jonathan Corpus Ong:

And the story at the time was really more on the Philippines as a social media content moderation capital of the world. So along with India. So the Philippines is known for having these content moderators in these outsourced work arrangements, being the digital janitor, scrubbing the filth off of Facebook. And so they're exposed to violence, gore, dick pics, et cetera, and having to endure really difficult work conditions.

And at the same time, of course, as the story was developing, we had all the discussion about what social media might be doing to politics in the form of troll farms, paid troll workers for elections, Trump, Duterte, et cetera. And so I thought it was just a really interesting way of interconnecting two different research interests of mine, politics and also digital workers. So you have this story of on one hand, these guys have call center work arrangements of the content moderator scrubbing filth off of Facebook. While on the other hand they could be sitting right next to trolls, paid trolls, actually producing all of that filth for politicians in the Philippines and maybe even for overseas clients. So I found that important to talk about from a very human-centered perspective.

Fernando Bermejo:

As an ethnographer, Jonathan dove into this production of viral content in the Philippines. What he discovered surprised him — something that very closely resembles what Kate Starbird calls participatory disinformation.

Jonathan Corpus Ong:

My work on disinformation architects, "Architects of Networked Disinformation", our 2018 report, was really based on interviews with campaign strategists, influencers, and fake account operators, and also digital ethnography, seeing how what they told us in the interview actually played out in our own analysis of how certain campaigns ran, how certain hashtags were actually instantiated in social media.

So there was one hashtag called #NasaanAngPangulo, which translates to #WhereIsThePresident, which a campaigner in an interview told us was her idea. So she took credit for it in the context of an interview. "Oh, remember that campaign? I was the mastermind behind it." It's like, oh my God, we had no idea. We thought it was purely organic. It was a hashtag campaign that tried to seed doubt in the former president and in his ability to lead in a crisis, because he was nowhere to be seen. And it kind of set the stage for a strongman president to come in, like Duterte. He's not like the absentee president.

And so we looked at the origins, who were the most active posters, and which were the most active influencers advancing that hashtag. And we confirmed it: it was a set of parody accounts, meme pages that were really intentionally driving that conversation, but it was taken forward by ordinary people and also by mainstream media that covered that hashtag.

Emily Boardman Ndulue:

Are these jobs in the sense of ongoing engagements for pay? Is this someone's way to provide for their families that they're being paid to do this?

Jonathan Corpus Ong:

Disinfo for hire, right? I think in the Philippines one needs to understand the troll work within the broader context of our economy and how, first of all, creative industries and digital work – these have been also embraced by the state as sunshine industries, right? And so there's an official government discourse to embrace digital work, even in those business process outsourcing firms, and they're celebrated as good for the economy. And for you to participate in that, you're in a way kind of like a hero if you're a part of the digital workforce. It's something that can progress the country toward new heights.

We see in our work that nobody was really hired to become a full-time troll. This is a project-based arrangement, sometimes three months, sometimes a six-month short-term project. And you kind of juggle your consultancy for a politician, or to advance a particular political narrative, alongside other kinds of gigs that you have.

And so it is seen as something that, yeah, it can help sustain their family. But that's not the only kind of arrangement.

Emily Boardman Ndulue:

And is political work, advancing political candidates or pulling down others, the main way that these sorts of jobs are engaged? Or is it also, for example, health coming in there, commercial interests? What is the subject scope of this type of work?

Jonathan Corpus Ong:

What interested me the most and what I saw as driving most of the disinformation narratives is that it's from an advertising and public relations kind of model where it's a politician enlisting a brand executive to assemble teams that can improve their image, but also smear their rival candidates. So it's what I call the advertising and PR disinformation work model. And that's what's most predominant in terms of how the most dominant disinformation narratives and most insidious influence operations are strategized and coordinated.

And so, you ask whether it's mostly political. Actually, what we found in our work is that it was mostly corporate. Or at least that's how it started.

So before they even meet a politician, the advertising and PR executives would be very boastful and would be citing all of their experience in the corporate marketing world. So in my interviews with ad and PR executives, I asked them, "So how do you pitch your services to a political client? What does that first meeting sound like?" And they'll often say, "Oh, I cite my portfolio, how I had done this for soft drink brands, for shampoo brands. I was able to have this hashtag trend worldwide for a shampoo brand." And they claim credit for that. And of course it lends an aura of professional respectability.

For you to be associated with a Coca-Cola or a Johnson & Johnson, a politician will be enticed: "Oh, this person seems to know what they're doing." And also it doesn't feel like they're going into a very shady transaction, if you know what I mean.

Emily Boardman Ndulue:

Jonathan told us that while Filipinos are aware of these disinformation efforts thanks to robust independent journalism, news coverage doesn’t often reveal the full scale of this disinformation ecosystem. Most coverage focused on a few especially notable bad actors who spread disinformation, including one figure the Filipino press dubbed “the queen of fake news.” Jonathan thinks these notable figures are just one part of a bigger problem.

Jonathan Corpus Ong:

There's a lot of people who are complicit in disinformation production. It's not just a product of Duterte and his government. It's not just a product of three influencers who are queens of fake news. There's whole industries who are profiting off of this.

It's not as simple as deplatforming, erasing, canceling one person, one personality, and everything will be better. It's really to understand, "Oh, there's so many layers here of exploitation, self-exploitation, also aspiration." They're just doing it to get by, to provide for their family. But what is clear in my work is that I assign responsibility to the politician and to the mastermind of the campaign. And for those people at the very top, we need to have a much stronger discourse of accountability, to hold those people to account.

Fernando Bermejo:

So Kate Starbird helped us understand how disinformation is often produced and spread by people who don’t even realize that’s what they’re doing, and Jonathan just told us about how an actual industry has cropped up in the Philippines to manufacture such content. For the most part what they’re talking about is viral content — memes, videos, articles, hashtags. But that’s only part of the story.

Emily Boardman Ndulue:

That’s right, Fernando. So far we’ve avoided talking about paid advertising, which is a really significant vector for disinformation online. We invited Young Mie Kim from the University of Wisconsin Madison to tell us a bit about her research studying political ads online, and how she learned that many of them are the result of coordinated disinformation campaigns.

Young Mie Kim:

Hi, I’m Young Mie Kim, a professor at the School of Journalism and Mass Communications and faculty affiliate in the department of political science at the University of Wisconsin–Madison. I study political communications, focusing on the role that digital media plays in communications among political actors, like agents, political leaders, and advocacy groups.

Emily Boardman Ndulue:

To study paid ads leading up to the 2016 U.S. General Election, Young Mie and her research team at the University of Wisconsin–Madison set out to reverse engineer who the sponsors paying for the ads were and what kinds of individuals the ads targeted.

Fernando Bermejo:

The research team developed an app to track political ads that users saw around the web, including on social media. About 10,000 people volunteered to take part in the study, which let Young Mie and her team sample 50,000 ads focused on specific policy issues such as abortion, LGBTQ rights, gun ownership, immigration, race, and so on.

Young Mie Kim:

The big discovery of this project is that we found many unidentifiable, untrackable, unattributable groups, namely suspicious groups. We found that more than half of the sponsors of the Facebook ads turned out to be suspicious groups. And then later, one out of six of these suspicious groups turned out to be Russian groups, run by the Internet Research Agency, information operatives linked to Putin.

Fernando Bermejo:

Young Mie’s team came to believe that these ads were being funded by what many in our field call “astroturf campaigns.” These are efforts, funded by corporate or political players, that masquerade as grassroots organizations.

Young Mie Kim:

Participants believe that these are true grassroots, but some of the members of the group could just be planted by the organizers. It gives like a false sense of community but also a false sense of public opinion.

Emily Boardman Ndulue:

For her research purposes, it was always suspicious if an organization purchasing an ad on a social media platform wasn’t registered with the Federal Election Commission or the Internal Revenue Service. If that group was domestic, then they were certainly breaking the law. But there is also a good possibility that those groups were actually foreign actors who would not be subject to United States election and tax laws. In her eyes, either of these is a good indication of astroturf efforts.

Fernando Bermejo:

Those government agencies regulate political advertising during elections, so ads on social media are a logical focus for Young Mie’s research.

Young Mie Kim:

We believe that advertising analysis is a great starting point to detect coordination because advertising is an outcome of deliberate strategies, particularly the targeting and messaging strategies. So by analyzing the targeted messaging and their connections, we can come one step closer to understanding the underlying strategies.

For example, we tracked back all those suspicious actors’ ad code and any website domains’ registration information, things like that, so we can see whether there is an overlap in ownership.
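
To make that concrete, here is a minimal sketch of what such an ownership-overlap check might look like. This is our own illustration, not the team’s actual pipeline: the `lookup_registrant` helper and the sample domains are invented stand-ins for a real WHOIS or registration-database query.

```python
from collections import defaultdict

# Toy registration data: invented stand-ins for what a WHOIS or
# registration-database lookup would return for each ad domain.
SAMPLE_REGISTRATIONS = {
    "issue-ads-one.example": "Acme Holdings LLC",
    "issue-ads-two.example": "Acme Holdings LLC",
    "unrelated-pac.example": "Other Org Inc",
}

def lookup_registrant(domain):
    """Hypothetical helper; a real version would query WHOIS records."""
    return SAMPLE_REGISTRATIONS.get(domain)

def ownership_overlaps(domains):
    """Group domains by registrant; any registrant holding two or more
    domains is a candidate for shared ownership."""
    by_registrant = defaultdict(list)
    for domain in domains:
        registrant = lookup_registrant(domain)
        if registrant:
            by_registrant[registrant].append(domain)
    return {r: ds for r, ds in by_registrant.items() if len(ds) > 1}

print(ownership_overlaps(SAMPLE_REGISTRATIONS))
# {'Acme Holdings LLC': ['issue-ads-one.example', 'issue-ads-two.example']}
```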

Emily Boardman Ndulue:

Once Young Mie and her team have identified that suspicious code, they can begin to understand how those actors are leading their campaigns. Here, she’s looking at something we’ll talk about more in episode five: coordination.

Young Mie Kim:

We also analyzed synchronized messaging, whether they shared the same message and then linked to each other. So it’s not just that they’re sharing the messages. But if Group A refers to Group B, then we consider that there’s a direct link, or whether they have the exact same contact information. Based on this direct evidence, we identified that more than 30% of these suspicious groups are directly connected to each other.

So we analyzed nearly 500 groups, like suspicious groups and directly linked nonprofits or unregistered groups, to see whether there are some direct relationships, so that we can remove indirect or weak relationships and then only identify the groups that have a direct relationship.

So our conclusion is that the suspicious groups we found are not random groups. They are highly coordinated and probably coordinated by one campaign, and this campaign is also related to Russian groups, because some of these groups are directly linked to Russia.
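
Her direct-link criterion lends itself to a simple graph reading: groups are nodes, an edge exists when one group refers to another or two groups share the exact same contact information, and connected clusters are candidates for a single coordinated campaign. The sketch below is a hedged illustration of that idea; the group names and record fields are invented, not the study’s data or code.

```python
from collections import defaultdict
from itertools import combinations

# Invented group records; the fields mirror the two kinds of "direct
# evidence" described above: explicit references between groups and
# exact-match contact information.
GROUPS = {
    "group_a": {"refers_to": {"group_b"}, "contact": "box100@example.com"},
    "group_b": {"refers_to": set(), "contact": "box100@example.com"},
    "group_c": {"refers_to": set(), "contact": "box7@example.org"},
}

def direct_link_graph(groups):
    """Undirected edges: A refers to B, or A and B share contact info."""
    edges = defaultdict(set)
    for name, rec in groups.items():
        for target in rec["refers_to"]:
            if target in groups:
                edges[name].add(target)
                edges[target].add(name)
    for a, b in combinations(groups, 2):
        if groups[a]["contact"] == groups[b]["contact"]:
            edges[a].add(b)
            edges[b].add(a)
    return edges

def coordinated_clusters(groups, edges):
    """Connected components: clusters possibly run by one campaign."""
    seen, clusters = set(), []
    for start in groups:
        if start in seen:
            continue
        stack, cluster = [start], set()
        while stack:
            node = stack.pop()
            if node not in cluster:
                cluster.add(node)
                stack.extend(edges[node] - cluster)
        seen |= cluster
        clusters.append(cluster)
    return clusters

edges = direct_link_graph(GROUPS)
print(coordinated_clusters(GROUPS, edges))
# [{'group_a', 'group_b'}, {'group_c'}]
```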

Emily Boardman Ndulue:

At the time we’re recording this episode, Young Mie is at work applying the methods described here to research the 2020 US presidential election. But an initial analysis of 32 IRA-like Instagram accounts dating to September 2019 suggested behavior similar to what she found in her 2016 study.

Well that wraps up this episode of Viral Networks. In our next episode we’ll be talking more about the how of disinformation campaigns — coordination, inauthentic behavior, and other techniques actors use to spread bad information online.

Credits:

Viral Networks is a production of Media Ecosystems Analysis Group. We’re your hosts Emily Boardman Ndulue and Fernando Bermejo. All episodes are produced and edited by Mike Sugarman. Julia Hong joined us as a script writer and provided additional research. Music on this show was composed by Nil and our producer Mike. Funding to produce this series was provided by the Bill and Melinda Gates Foundation. And last but certainly not least, we want to give a big thank you to all of the experts who joined us for interviews on this show.