Michael Podolsky
CEO and Co-Founder of PissedConsumer.com

For the past decade, the digital world has been steadily merging with reality. Communities are increasingly involved in every crucial process online. People have grown used to expressing opinions, sharing thoughts, and checking the news on the internet. But can we trust everything we see and read on social media?

You don't believe something just because you heard it once from one source…

PissedConsumer interviewed Darren Linvill, an Associate Professor with Clemson University Media Forensics Hub, who studies social media disinformation. In this video, Prof. Linvill explains how propaganda, misinformation, and trolls work on social media, and why you should fact-check the information from the internet.


Introduction

Michael: We have Darren Linvill with us, Associate Professor at Clemson University Media Forensic Hub. Darren specializes in social media, disinformation, and misinformation. Introduce yourself, Darren, please, to all our readers.

Prof. Linvill: As you said, I'm an Associate Professor at Clemson. I'm in the Department of Communication. I'm also a lead researcher in Clemson University's Media Forensics Hub. I've been studying social media for close to 15 years now. For the past five years, I’ve been specifically exploring disinformation, misinformation, and the tactics and strategies of state actors in particular.

How Does Russia Spread Disinformation?

Michael: Let's talk about political disinformation and political propaganda. What strategies do companies and state actors use to spread disinformation?

Prof. Linvill: Sure. It can take a lot of forms, and it depends on the particular actor and their goals. Not all strategies work, depending on the context. Let me answer that question by juxtaposing two different actors in this space, the Russians and the Chinese, and what we often see from each of these groups.

Russia engages in disinformation in a very artisanal way. They will create accounts from the ground up, starting from zero followers on a particular social media platform. Then they will integrate that account into a particular online community, purporting to be a member of it, and gain followers slowly over time.

As they do that, they'll try to pull that community slowly in a particular direction. Historically, they've done this in English-language conversations here in the United States, famously around the 2016 election. Here at Clemson, we were identifying accounts communicating in English that we attributed very likely to the Russian Internet Research Agency as recently as 2020.

We know that the Russians mostly do this in Russian. They create accounts that purport to be Russian nationalist, pro-Putin voices and integrate themselves into those communities online. Then they pull those communities in a particular direction.

They’re pushing particular narratives and conversations that the community they're communicating to might already be inclined to believe.

This is what we normally think of when we think of a classic Russian troll. What classic Russian trolls don't do is they don't go out there looking for a fight. 

Mostly, they're communicating to people who agree with the things that they're saying because that's how persuasion works.

You don't go out there and persuade a Clinton voter to become a Trump voter or vice versa. You persuade someone who's already inclined to Trump to actually show up and vote for Trump or someone who's inclined to vote for Clinton to show up and vote for Clinton. Or you pull that community in a particular direction, around a particular issue that they might have already been thinking about, but you want to make them a little more extreme in that direction.

What Are Disinformation Strategies Used by China?

Prof. Linvill: I want to juxtapose what the Russians have often done in the past and continue to do with what the Chinese often do. The Chinese disinformation on social media oftentimes functions in a very different way. The Russians are very interested in what you think of your neighbor. They're very interested in pushing particular narratives.

The Chinese are much more interested in simply what you think of China. They're not necessarily engaging in American or EU politics. They care about things about China.

Conversations around Uyghur atrocities or conversations around the recent Olympics, those are the conversations that the Chinese push. The way they function is very different. They're not creating fake accounts that purport to be part of a particular community. Instead, they operate en masse, with thousands of accounts that don't even try very hard to look authentic.

What they do is make sure that certain conversations don't happen.

They'll do things like taking over a hashtag. Around the Olympics, for instance, we saw Chinese accounts using #GenocideGames on Twitter. Thousands and thousands of these Chinese accounts posted this hashtag over and over again, and that might seem counterintuitive. 

Why would the Chinese want to use #GenocideGames, a hashtag that had been used by a lot of people critical of China's treatment of the Uyghur Muslim minority in the Xinjiang region?

They were using that hashtag to connect the Olympics to Uyghur atrocities. It's a nicely packaged hashtag with alliteration, Genocide Games, and it certainly wasn't a hashtag that the Chinese wanted anybody using or a hashtag they wanted trending on Twitter or any other platform.

The reason they used it is they took that hashtag and they attached it to a bunch of unrelated tweets so that anyone looking to engage in the conversations using that hashtag would come across unrelated content. Content that was pro-China and was part of another conversation. That affected people's ability to use that hashtag to criticize China. 

You could see it in the data. People were starting to use that hashtag to criticize China, and the Chinese swarmed it. This is a tactic that has been used long before the Chinese by genuine activists as well, taking over, and brigading a hashtag like that.

The Russians want specific conversations to happen. There are certain tactics you would use to make conversations happen, whereas the Chinese want to make sure the conversations critical of China don't happen. There are other tactics you would use to make sure the conversations don't happen.

Can a Company Have Only Positive Reviews Online?

Michael: PissedConsumer is all about consumer reviews. The website is about 16 years old. That's around the time customer reviews took off online, and companies began to engage in disinformation by buying reviews. You've probably heard about such a practice. Is it possible for a company to have only positive reviews online?

Prof. Linvill: That's an excellent question. I think it depends on your sample size. If a company only has three reviews and they're all positive, sure. I would believe that. But if they have 100,000 reviews and they're all positive, that starts to get questionable. The question's a matter of numbers.
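Prof. Linvill's sample-size point can be made concrete with a little arithmetic. As a rough illustration (not from the interview), assume some baseline share of genuine customers, say 5%, would leave a negative review; the chance that a large pool of authentic reviews contains no negatives at all then shrinks exponentially with the review count:

```python
def prob_all_positive(n_reviews, negative_rate=0.05):
    """Probability that n_reviews independent, genuine reviews are all
    positive, given an assumed baseline negative-review rate."""
    return (1 - negative_rate) ** n_reviews

# Three all-positive reviews are unremarkable...
print(prob_all_positive(3))        # ~0.857
# ...but 100,000 all-positive reviews are essentially impossible
# without manipulation.
print(prob_all_positive(100_000))
```

The 5% rate is purely an assumption for illustration; the exact figure doesn't matter much, because for any realistic negative-review rate the probability collapses long before 100,000 reviews.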

Michael: Have you ever researched the amount of negative content versus positive content on a corporate level?

Prof. Linvill: With Clemson's marketing department, we're hoping to find somebody who can do that type of research, because we're very interested in it. I think that's an important space for disinformation that isn't talked about enough, especially in terms of spaces that affect people's everyday lives and spaces where there's a lot of money.

Fake reviews are big business in the same way that fake social media profiles are. There are huge centers in South Asia that specialize in that type of work. We've seen those centers spread social media disinformation, but of course they spread fake reviews as well. So no, we haven't done that work yet, but it's a direction I'm hoping to go in the future because I think it's important.

How to Fact-Check Information Online?

Michael: How do you fact-check information online? It would be good to understand it at the corporate level, company to consumer, B2C.

Prof. Linvill: I think that when looking for factual information, it's important to understand the concept of authenticity, the process through which information came to you. Did the person that's sharing this information have some kind of agenda? Who are they? Do you know them? Do you have a personal relationship with this individual?

In general, when people are wondering how to be safe on the internet in the digital age, in an internet environment, I usually tell them that they need to treat the digital world more like the real world. In the real world, people understand that when you go outside, most strangers don't want to hurt you. Most strangers are perfectly nice. In most circumstances, maybe you become friends, who knows? 

People still treat strangers like strangers. You don't walk outside and trust every single person you meet. You don't walk outside and hand over all the contact information in your phone to someone you're walking by. You don't invite somebody into your home simply because they're wearing a T-shirt that you like, but in the digital world for some reason, we do that every single day.

We share our followers with others. We invite people onto our platform and allow them to follow us. 

We engage in conversations with people we know absolutely nothing about, and we don't even know whether the information they're telling us is truthful, or whether it's even a real person at all.

I think it's fundamentally important to treat the digital world like the real world because… 

…most people in the digital world are real and they mean no harm, but sometimes they do.

Sometimes they're trying to steal your information, steal your money, whatever it may be. They're trying to persuade you of something for some agenda that they may have. They're trying to spread disinformation. 

You have to treat the digital world with just a little bit of skepticism and understand the space that you're in. I think a lot of younger folks and young adults have started to understand that intuitively, but for people who didn't grow up with the internet, it's a hard concept to get.

Michael: When we receive information on Twitter or Facebook, we're told, "Okay, don't share a lot of personal information online with people you don't know." On the other hand, if a tweet appears out of nowhere, you need to fact-check it.

It's a little bit difficult to go back to that person and start asking them questions because they may not be willing to share more information.

So how do you learn more about the person when you are not prepared to give out extra information about yourself?

Prof. Linvill: I think that it's important to look at the information they're sharing with you as well. When I'm engaging in the digital world, there are certain sources that I know I trust, and I try to stick to them. Still, there are a lot of perfectly reputable sources out there that I've never heard of. I'm not perfect. If I'm getting information from a link that I don't recognize or that I've never heard of, I'm still going to go look for that same information elsewhere. 

You don't believe something just because you heard it once from one source… 

…and you want to go double-check that, especially before you share it with others and you use your own credibility to give credibility to somebody else.

I think that's key because that's what a lot of bad actors are trying to get. They're trying to use your credibility to spread that information that they're trying to spread or to spread that link that they're trying to get other people to click on.

What’s the Disinformation Around COVID-19?

Michael: We just came out of a two-year lockdown for COVID. Many people were saying, "Hey, we need vaccines." Other people were vaccine-deniers who were saying, "My body is my body. I don't want to stick anything into it." 

They have the right not to take it. They have the right to speak. They were able to say online that they didn't like lockdowns and masks. Yet we know a lot of social media platforms shut down COVID-19 deniers.

What do you think happened there? Were there political actors behind it? What's your take on that full story?

Prof. Linvill: For the past couple of years, everyone has been talking about the COVID-19 pandemic. Anytime the whole world is having the same conversation, you have to assume that there's a lot going on and there are a lot of different actors with different motivations.

It's true that, for instance, Russia spread some disinformation about the pandemic, especially about some of the other vaccines.

They were trying to pump up their own vaccine and make it look more reputable. They weren't necessarily trying to get people not to take any vaccine; Russia was focused on its own vaccine.

Other actors were spreading disinformation in order to make a buck. Take Alex Jones, for instance. He was very famously spreading disinformation about COVID-19 because he was trying to sell products off of his website. He had some products that he claimed would have an effect on curing the virus, which was all hogwash, but he wanted to make a buck. There were a lot of other individuals just like Alex Jones, who were trying to take advantage of various conversations around the pandemic just to make money.

There have been shysters in the world since well before the internet. The internet just helps them reach a much wider audience. Honestly, the biggest problem we had with misinformation around COVID-19, and continue to have, is just that there are a lot of people having these conversations and they're scared and nervous for any number of reasons.

This is especially true early in conversations around COVID-19 when there wasn't a lot of information we could rely on because it hadn't been vetted yet. We didn't know much about the virus. People were making things up or previous false stories would emerge and then be applied to COVID-19.

In the first two months of the pandemic, I wanted to see who was the first person to connect 5G to COVID-19. I don't know if you remember, but there were stories circulating that 5G caused COVID-19. Previously, 4G had been blamed for all sorts of illnesses and diseases, from cancer to Ebola, at various times. We always distrust new technologies, and it's been true of 5G.

These stories about 5G causing COVID-19 were emerging, and I looked at who was the first person on Twitter to connect the idea of 5G and COVID-19. I found it was an individual in New Zealand. All their tweet said was, "And 5G causes COVID-19 in five, four, three, two."

It was a joke. They were saying, "This is a story that's about to happen. It's hogwash, but somebody's going to say it." Sure enough, the very next day on Twitter, somebody was saying 5G causes COVID-19, and these conversations were happening in Italy. So the Internet's a global place. All these conversations are interconnected.

Some of the false stories about COVID-19 you could predict would happen because it's the same sort of thing that's happened in the past and will happen in the future. We've always liked to accuse the elite and the powerful of being responsible for horrible sins. Of course, we're going to accuse them of being responsible for COVID-19. 

We've always wanted to accuse countries we don't trust of being responsible for things. 

Of course, we're going to accuse China of horrible things related to COVID-19. It's especially difficult that these stories emerge when there might be an element of truth to them. There is an element of truth to stories about China's responsibility for COVID-19, but then those little nuggets turn into snowballs, into something that is disinformation on its own.

Is Russian Propaganda Around Ukraine-Russia War Effective?

Michael: Let's talk about the Russia-Ukraine conflict. Russia has been running a hybrid war for the past eight years, since 2014. How effective are Russian propaganda and disinformation in Ukraine and the world overall? What's your opinion?

Prof. Linvill: I think that globally, there's an assumption that Ukraine is winning the global information war between Ukraine and Russia. To a large degree, that's probably true if you're looking at conversations in English or conversations in the West. But Putin's main audience has always been and continues to be the Russian people.

It's the Russian people that allow him to stay in power and their loyalty allows him to maintain his authority and power. 

It is true that the Russian people have always been his main target of disinformation. That disinformation has taken all kinds of forms in the past 20 years, including disinformation that comes through traditional media. There are various state media outlets in Russia, but also new media and social media, and various websites. 

In the Russian language, Russia does seem to be doing much better, if not winning the information war. We worked with ProPublica near the start of the invasion to identify accounts across a number of different social media platforms: Twitter, Instagram, TikTok, VK, which is Russian Facebook, and Telegram especially. We attributed those accounts as very likely coming from the Internet Research Agency in St. Petersburg, Russia, which is famously responsible for intervening in the 2016 US election.

Some of what they were doing with these accounts was very effective, especially on TikTok. They had hundreds of thousands of followers on TikTok, and millions of likes from these accounts. They seemed to have a real effect on some of these conversations happening in Russian. 

It's certainly true that Putin has still maintained high levels of support from the Russian people. There are inklings that maybe there are some chinks in his armor, but he has maintained that support.

Michael: Social media is global, right? I would agree with you that within the Russian Federation, there is overwhelming support for Putin, just because they don't see a lot of other information. 

They lost access to Instagram and Facebook. Their access to information on social media has been limited by the government. But there are other Russian-speaking people around the world. 

Between the person exposed to pure Russian Federation propaganda and the person exposed to English-language news channels, who is winning?

Prof. Linvill: I think that because of the particular types of tactics Putin uses, the ball is still in Putin's court, and that's because he doesn't necessarily have to win anything. He just has to sow enough doubt.

For decades, Russian disinformation has centered on this idea of doubt in mainstream sources and doubt that there is any real objective truth.

One tactic we found them using quite effectively early in the war was simply undermining Western sources of information by suggesting that those sources were lying. One clip we saw showed a German journalist standing in front of a field of body bags. The clip claimed this was a German journalist in Kyiv reporting on the conflict in Ukraine.

Then one of the body bags starts moving around and sits up and has a cigarette. The video clip wasn't in Kyiv at all, it was in Berlin in 2015, and this video was of a global climate change protest.

That video got a lot of traction, given a different context in conversations happening around Ukraine. What that video was designed to do was to spread doubt, to say, "Yeah, maybe the Russians lie, but so do the Ukrainians, and so does the West." 

We saw a number of fake fact-check videos. The Russians created videos purporting to be Ukrainian footage of destroyed Russian vehicles, then "fact-checked" their own fabrications, spreading additional doubt.

I know that's complex, but the real issue here is grounded in just spreading distrust because…

…even if you don't think Putin is great, it's good enough for Putin if you just think, "Fine, Putin's not great, but nobody else is any better."

As long as the grass isn't greener on the other side, Putin's still winning. Because if you don't believe in anything, then you're not going to fight for anything. All Putin needs at the end of the day is for the Russian people to be not willing to fight for anything that's not Putin.

Michael: How do you counteract it?

Prof. Linvill: It's very difficult in autocratic countries that maintain control of their information systems. In a lot of these countries, especially Russia, you see slow change happening in the younger generations, especially the younger urban generations that have access to Western media and to information that's not Russian state propaganda.

I think that's what's going to change it. It's going to be a generational shift. It's not something you can do overnight, like invading Russia and instituting regime change, which is probably not in our best interest as long as the Russians maintain a nuclear arsenal.

The same is true in other countries as well. China maintains incredible control over its information systems, but young Chinese still manage to jump the firewall and get information from the outside. These are comparatively small numbers, but it still happens. The same is true in most autocratic countries, Iran as well.

How to Avoid Media Disinformation?

Michael: What is the craziest fake news you've ever come across?

Prof. Linvill: There are so many bizarre things out there. Probably the craziest piece of fake news is one that is more commonly believed than many others, and that's the QAnon conspiracy theory. You have to look under the surface at what QAnon purports to be about. The QAnon conspiracy theory says that a cabal of pedophile cannibal socialists is in control of the world's governments and that the only individual who can stop these people is Donald Trump.

Supposedly, for the past generation or more, cannibal pedophiles have been gaining control of our government, which is simply not true. If you start looking at the shades of the various things that QAnon claims, some of them are true. But again, like with conspiracy theories related to China and COVID-19…

…these things become easy to believe when there is just a kernel of truth to them.

And you know what? There are some really dirty people in the government, and there are sex scandals related to both liberal and conservative elites. When you take just a nugget of that and blow it up into a much bigger story, it can become something that millions of people believe to be true.

Michael: Let’s give a final message to our readers in regards to social media, and consumer reviews. How would you summarize our conversation today?

Prof. Linvill: Find sources that you trust and go to them, but then go beyond that. Find some more sources and make sure you're always triangulating among them. Maybe even look at a few sources you wouldn't necessarily trust, something from the other side of the ideological aisle, and take those viewpoints into account as well.

The main thing is to have a healthy skepticism while remembering that it's still okay to trust. You can't disbelieve everything. We're still living in objective reality.

People spread disinformation about companies, COVID-19, or politics for different reasons. Russia and China use social media as a propaganda channel, and some business owners use review platforms to inflate their companies' reputations. Everyone needs to be able to recognize disinformation.

Michael: Thank you!

How do you fact-check information found online? Please share your tips in the comments below. Don’t forget to subscribe to our YouTube channel to follow updates on experts’ videos and consumer video reviews.

  • consumer reviews
  • disinformation tips
  • expert interview
  • fake news
  • false information
  • propaganda
  • social media
  • social media disinformation
  • trolls

Legal disclaimers:

  1. While every effort has been made to ensure the accuracy of this publication, it is not intended to provide any legal, medical, accounting, investment or any other professional advice as individual cases may vary and should be discussed with a corresponding expert and/or an attorney.
  2. All or some image copyright belongs to the original owner(s). No copyright infringement intended.
