
Fake News Becomes Roboticized, Researchers Warn


Hromadske’s Angelina Kariakina spoke to Monica Kaminska, a cyber security expert from the Oxford Internet Institute, on May 21, 2017 in Kyiv.

What You Need To Know:

✅ With the growing number of cyber threats and attacks, people are becoming increasingly concerned about cyber security;

✅ “We use junk news… [It] really encapsulates that hyper-partisan, very propagandist, overly emotional content that we are really researching,” – Monica Kaminska, a cyber security expert from the Oxford Internet Institute;

✅ “We’re interested in how public opinion is manipulated over social media,” – Monica Kaminska;

✅ Kaminska and her team also research botnets, networks of automated accounts ("robots") responsible for spamming and spreading disinformation. The team is mostly concerned with political bots, which operate over social media platforms to influence public opinion: “They might be interacting with people, they might be retweeting content.”

With the growing number of cyber threats and attacks, people are becoming increasingly concerned about cyber security. Monica Kaminska, a cyber security expert from the Oxford Internet Institute, spoke to Hromadske about her research findings and how to make the public more aware of misinformation.

According to Kaminska, "fake news" has become a catch-all term, so she avoids using it. “We use junk news… [It] really encapsulates that hyper-partisan, very propagandist, overly emotional content that we are really researching.” She adds that junk news is not always completely untrue, just very biased.

Propaganda is of particular interest to Kaminska and her colleagues: “We’re interested in how public opinion is manipulated over social media.” The expert says that this spread of misinformation can influence elections. Her research into the US, French and German elections reveals the following: “In the US state of Michigan, a real battleground state during the election, what we found was that junk news and real news, or legitimate news from professional news organizations, were shared at a ratio of one to one.” The share of junk news was considerably lower in the French elections, at 5-7%, and around 10% in Germany.

Kaminska and her team also research botnets, networks of automated accounts ("robots") responsible for spamming and spreading disinformation for a variety of motivations. The team is mostly concerned with political bots, which operate over social media platforms to influence public opinion: “They might be interacting with people, they might be retweeting content.” And the more something is retweeted, the more it’s liked, the more legitimate it seems, says Kaminska.

The cyber expert says the solution is to make people aware of this misinformation: “The more conscious people become of these things being shared and how to verify whether something is legitimate or not, the less of a problem it will be.”


So you are using this term, computational propaganda, what is it about? Is that already a science term?

Monica Kaminska: So the term computational propaganda, the way we use it is basically when algorithms and software meet social media. So what computational propaganda does is it distributes misinformation across social media, that’s how we understand it.

How do you distinguish between propaganda, fake news, junk news, what’s the difference there?

Monica Kaminska: So we don’t use the term fake news really, we use junk news. Fake news has become really a catch-all term for lots of different things. It’s even been used by politicians to discredit legitimate but critical journalism. Junk news I think really encapsulates that hyper-partisan, very propagandist, overly emotional content that we are really researching. So we go with junk news. Also because a lot of junk news we’ve found isn’t necessarily completely untrue, it’s just very, very biased.

Photo Credit: Angelina Kariakina and Monica Kaminska/Hromadske

Why exactly propaganda?

Monica Kaminska: So we’re interested in how public opinion is manipulated over social media. So how propaganda is distributed over social media to influence public opinion and then ultimately even perhaps influence elections.

So what are exactly your findings? What campaigns and what elections have you been researching and what did you get to know?

Monica Kaminska: The ones we have produced was one about the US election, we did three on the US election actually so that was quite a lot of reading there. Then we did two on the French election, round one and round two of the French election, and we also did the German Presidential elections.

In the US state of Michigan, a real battleground state during the election, what we found was that junk news and real news, or legitimate news from professional news organizations, were shared at a ratio of one to one. So people were basically sharing just as much junk news as they were news from professional news organizations.

You were also researching the French elections, what’s interesting there?

Monica Kaminska: We found that there was a lot more traffic about Macron than there was about Le Pen, which was an interesting finding. Actually the percentage of junk news during the French election was quite low. I believe it was at around 5-7%, which is low compared to Michigan, where almost 25% of the content shared was junk news; in Germany it was around 10%.

Photo Credit: Monica Kaminska/Hromadske

Is there a clear understanding in your group how actually these botnets work? What’s the algorithm, who creates and controls them?

Monica Kaminska: One of the things that our group does is they interview bot writers.

It seems to be a variety of individuals that tend to write these bots, not always for financial gain; there are different motivations for writing bots. But there’s also, and this again is just something that I’m aware of, when it comes to large botnets, effectively armies of bots, there are big underground marketplaces for these things.

So basically the way we define political bots is they operate over social media platforms and they seek to really influence public opinion. So they might be interacting with people, they might be retweeting content, and they’re the ones that we’re most concerned with.

What is really there that can manipulate people’s opinions? Do you think it’s just the mere fact that many users, Twitter users for example, don’t differentiate between junk news and real news?

Monica Kaminska: The more something is retweeted, the more it’s liked, the more legitimate it seems, the more mainstream it seems. So basically the way that this sort of computational propaganda works is it tries to make information look more grassroots, when in fact what they’re actually doing is megaphoning this information; it’s artificial. It’s not really people that are liking this post or sharing this post, it’s fake accounts or bot accounts; it’s all automated. So they’re trying to make very marginal views look more mainstream, which is problematic. The real solution I think, or one of the solutions, would be to make people more aware of this phenomenon of misinformation.

Photo Credit: Angelina Kariakina and Monica Kaminska/Hromadske

The more conscious people become of these things being shared and how to verify whether something is legitimate or not, the less of a problem it will be. But like you say, on the one hand yes, you have to talk about it and you have to share it, and mainstream media has to sort of contribute to this conversation and deconstruct it. Will that mean that this news becomes more popular and people believe it? I hope not. I hope people will say, oh yes, this is ridiculous.

Social media has become very toxic over the past two, maybe three, years. Of course there is a certain context to what is happening in the Ukrainian segment of Facebook: in Ukraine itself we have a war, we have a complicated political situation and there is a lot of resentment. This is why opinions differ, and there are a lot of fights and discussions. Is there something else that makes social media so toxic and aggressive?

Monica Kaminska: In general the pace of technological change and how people have adapted to it is a really interesting problem that needs to be explored further.

It’s interesting what you say about Facebook in Ukraine, because that to me is something quite new. Obviously everyone has a different Facebook feed, so your perspective might be completely different to someone else’s. And I think the recent elections really exposed that, because you think that public opinion is perhaps in favour of one thing, and then the election goes the other way and you’re thinking, why did that happen? Well, you realize that your feed was completely different and you didn’t have much of an idea of what people were actually talking about.