
An internal Facebook report found that the social media platform’s algorithms – the rules its computers follow to decide what content you see – enabled Eastern European-based disinformation campaigns to reach nearly half of all Americans in the run-up to the 2020 presidential election, according to a report in Technology Review.

The campaigns produced the most popular pages for Christian and Black American content, and reached 140 million U.S. users per month. Seventy-five percent of the people exposed to the content had not followed any of the pages. People saw the content because Facebook’s content recommendation system put it into their news feeds.

Social media platforms rely heavily on people’s behavior to decide the content that you see. In particular, they watch for content that people respond to or “engage with” by liking, commenting and sharing. Troll farms, organizations that spread provocative content, exploit this by copying high-engagement content and posting it as their own.

As a computer scientist who studies how large numbers of people interact using technology, I understand the logic of using the wisdom of crowds in these algorithms. I also see significant pitfalls in the way social media companies do it in practice.


From lions on the savanna to likes on Facebook

The wisdom of crowds assumes that using signals of others’ actions, opinions and preferences as a guide will lead to sound decisions. For example, collective predictions are normally more accurate than individual ones. Collective intelligence is used to predict financial markets, sports, elections and even epidemics.

Over millions of years of evolution, these principles have been encoded in the human brain in the form of cognitive biases that come with names like familiarity, mere exposure and the bandwagon effect. If everyone around you starts running, you should start running, too; maybe someone saw a lion coming, and running could save your life. You may not know why, but it’s wiser to ask questions later.

Your brain picks up clues from the environment – including your peers – and uses simple rules to quickly translate those signals into decisions: go with the winner, follow the majority, copy your neighbor. These rules work remarkably well in typical situations because they are based on sound assumptions. For example, they assume that people often act rationally, that it is unlikely many are wrong, that the past predicts the future, and so on.

Technology allows people to access signals from much larger numbers of other people, most of whom they do not know. Artificial intelligence applications make heavy use of these popularity or “engagement” signals, from selecting search engine results to recommending music and videos, and from suggesting friends to ranking posts on news feeds.

Not everything viral deserves to be

Our research shows that virtually all web technology platforms, such as social media and news recommendation systems, have a strong popularity bias. When applications are driven by cues like engagement rather than explicit search engine queries, popularity bias can lead to harmful unintended consequences.

Social media platforms like Facebook, Instagram, Twitter, YouTube and TikTok rely heavily on AI algorithms to rank and recommend content. These algorithms take as input what you like, comment on and share – in other words, the content you engage with. The goal of the algorithms is to maximize engagement by finding out what people like and ranking it at the top of their feeds.
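
To make that mechanism concrete, here is a minimal sketch of an engagement-driven ranker. It is not Facebook’s actual code; the post fields and weights are illustrative assumptions. The point is simply that the feed is ordered by reactions alone, and quality never enters the objective.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    likes: int
    comments: int
    shares: int

# Illustrative weights (an assumption, not any platform's real values):
# comments and shares count for more than likes.
WEIGHTS = {"likes": 1.0, "comments": 4.0, "shares": 8.0}

def engagement_score(post: Post) -> float:
    """Score a post purely by how much people have engaged with it."""
    return (WEIGHTS["likes"] * post.likes
            + WEIGHTS["comments"] * post.comments
            + WEIGHTS["shares"] * post.shares)

def rank_feed(posts: list[Post]) -> list[Post]:
    """Order the feed so the most engaged-with posts come first.
    Nothing here measures accuracy or quality."""
    return sorted(posts, key=engagement_score, reverse=True)

feed = [
    Post("careful-report", likes=120, comments=10, shares=5),
    Post("outrage-bait", likes=90, comments=300, shares=150),
]
for post in rank_feed(feed):
    print(post.post_id, engagement_score(post))
```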


Video: “How social media filter bubbles work” (YouTube).

On the surface, that seems reasonable. If people like credible news, expert opinions and fun videos, these algorithms should identify such high-quality content. But the wisdom of the crowds makes a key assumption here: that recommending what is popular will help high-quality content “bubble up.”

We tested this assumption by studying an algorithm that ranks items using a mix of quality and popularity. We found that, in general, popularity bias is more likely to lower the overall quality of content. The reason is that engagement is not a reliable indicator of quality when few people have been exposed to an item. In these cases, engagement generates a noisy signal, and the algorithm is likely to amplify this initial noise. Once the popularity of a low-quality item is large enough, it will keep getting amplified.
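
The dynamic is easy to reproduce in a toy simulation. This is a sketch under arbitrary assumptions, not our published model: items have a hidden quality the ranker never sees, users engage noisily with whatever sits at the top of the feed, and the ranker weights observed popularity heavily. In most runs the most popular item ends up being one that got lucky early rather than the best one available.

```python
import random

random.seed(42)

# Each item has a hidden quality in [0, 1] that the ranker never sees directly.
items = [{"quality": random.random(), "engagements": 0} for _ in range(50)]

def rank(items, popularity_weight=0.9):
    # Rank by a mixture dominated by observed popularity (engagement counts).
    def score(item):
        return (popularity_weight * item["engagements"]
                + (1 - popularity_weight) * item["quality"])
    return sorted(items, key=score, reverse=True)

for _ in range(5000):
    shown = random.choice(rank(items)[:5])   # users only see the top of the feed
    # Engagement is a noisy function of quality: low-quality items still get
    # some engagement, and that early noise is what the ranker amplifies.
    if random.random() < 0.25 + 0.5 * shown["quality"]:
        shown["engagements"] += 1

most_popular = max(items, key=lambda item: item["engagements"])
print("quality of the most popular item:", round(most_popular["quality"], 2))
print("best quality available:          ", round(max(i["quality"] for i in items), 2))
```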

Algorithms aren’t the only thing affected by engagement bias – they can affect people, too. Evidence shows that information is transmitted via “complex contagion,” meaning the more times people are exposed to an idea online, the more likely they are to adopt and reshare it. When social media tells people an item is going viral, their cognitive biases kick in and translate into the irresistible urge to pay attention to it and share it.
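
A minimal sketch of complex contagion, using a tiny made-up friendship network and an assumed adoption threshold (both are illustrative, not data), shows the key property: a single exposure from one contact usually isn’t enough, but repeated exposure from several contacts tips people into adopting and sharing.

```python
# A tiny made-up friendship network (adjacency list).
network = {
    "ana": ["bo", "cy", "dee"],
    "bo":  ["ana", "cy", "eli"],
    "cy":  ["ana", "bo", "dee"],
    "dee": ["ana", "cy", "eli"],
    "eli": ["bo", "dee"],
}

# Complex contagion: a person adopts an idea only after at least this
# fraction of their friends have adopted it (the threshold is an assumption).
THRESHOLD = 0.5

def spread(seeds):
    adopted = set(seeds)
    changed = True
    while changed:
        changed = False
        for person, friends in network.items():
            if person in adopted:
                continue
            exposure = sum(friend in adopted for friend in friends) / len(friends)
            if exposure >= THRESHOLD:
                adopted.add(person)
                changed = True
    return adopted

print(spread({"ana"}))        # a single seed stalls: one exposure is not enough
print(spread({"ana", "cy"}))  # two seeds reach everyone via repeated exposure
```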

Crowds not so wise

We recently ran an experiment using a news literacy app called Fakey. It is a game developed by our lab that simulates a news feed like those of Facebook and Twitter. Players see a mix of current articles from fake news, junk science, hyperpartisan and conspiratorial sources, as well as mainstream sources. They get points for sharing or liking news from reliable sources and for flagging low-credibility articles for fact-checking.

We found that players are more likely to like or share, and less likely to flag, articles from low-credibility sources when they can see that many other users have engaged with those articles. Exposure to the engagement metrics thus creates a vulnerability.

The wisdom of crowds fails because it is based on the false assumption that the crowd is made up of diverse and independent sources. There may be several reasons why this is not the case.

First, because of people’s tendency to associate with like-minded people, their online neighborhoods are not very diverse. The ease with which social media users can unfriend those with whom they disagree pushes people into homogeneous communities, often referred to as echo chambers.

Second, because many people’s friends are friends of each other, they influence each other. A famous experiment has shown that knowing what music your friends like affects your own stated preferences. Your social desire to conform skews your independent judgment.

Third, popularity signals can be gamed. Over the years, search engines have developed sophisticated techniques to counter “link farms” and other schemes to manipulate search algorithms. Social media platforms, on the other hand, are just beginning to learn about their own vulnerabilities.

People aiming to manipulate the information market have created fake accounts, like trolls and social bots, and organized fake networks. They have flooded the network to create the appearance that a conspiracy theory or a political candidate is popular, tricking both platform algorithms and people’s cognitive biases at once. They have even altered the structure of social networks to create illusions about majority opinions.

Reduce engagement

What to do? Technology platforms are currently on the defensive. They have become more aggressive during elections in taking down fake accounts and harmful misinformation. But these efforts can be akin to a game of whack-a-mole.

A different, preventive approach would be to add friction – in other words, to slow down the process of spreading information. High-frequency behaviors such as automated liking and sharing could be inhibited by CAPTCHA tests or fees. Not only would this decrease opportunities for manipulation, but with less information people would be able to pay more attention to what they see. It would leave less room for engagement bias to affect people’s decisions.
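
As a rough illustration of what such friction could look like, here is a minimal sketch of a per-user rate limit on likes and shares, where exceeding the limit would trigger a CAPTCHA or a delay. The class name, limits and challenge step are hypothetical, not any platform’s actual policy.

```python
import time
from collections import deque

class FrictionGate:
    """Allow at most max_actions likes/shares per user in a sliding time window;
    beyond that, the caller should require a human check (e.g., a CAPTCHA).
    The numbers are illustrative assumptions."""

    def __init__(self, max_actions: int = 10, window_seconds: float = 60.0):
        self.max_actions = max_actions
        self.window_seconds = window_seconds
        self.history: dict[str, deque] = {}

    def allow(self, user_id: str, now: float | None = None) -> bool:
        now = time.time() if now is None else now
        recent = self.history.setdefault(user_id, deque())
        while recent and now - recent[0] > self.window_seconds:
            recent.popleft()              # drop actions outside the window
        if len(recent) >= self.max_actions:
            return False                  # too fast: challenge or delay the user
        recent.append(now)
        return True

gate = FrictionGate(max_actions=3, window_seconds=60)
for second in range(5):
    print(second, gate.allow("suspiciously-fast-account", now=float(second)))
# Prints True for the first three actions, then False until the window clears.
```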

It would also help if social media companies adjusted their algorithms to rely less on engagement to determine what content they serve you. Perhaps the revelations of Facebook’s knowledge of troll farms exploiting engagement will provide the necessary boost.

This article is republished from The Conversation under a Creative Commons license. Click here to read the original article.


