
23 June 2020

YouTube's design contributes to the spread of racist misinformation

YouTube’s user interface design may be actively contributing to the spread of racist, antisemitic misinformation and bigotry, new research shows.

YouTube ranks comments on videos by how popular they are among people who have viewed the same videos
Image: Shutterstock

Research by Dr Daniel Allington, Senior Lecturer in Social and Cultural Artificial Intelligence at King’s College London, suggests that this design may prevent critical thinkers from challenging harmful misinformation on the site.

YouTube ranks comments on videos by how popular they are among people who have viewed the same videos. If comments that represent a particular point of view get more ‘likes’, they rise up the rankings, pushing other points of view out of sight.
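To illustrate the mechanism described above, here is a minimal Python sketch of popularity-based comment ranking. The comments, like counts and number of visible slots are invented for illustration, and this is not YouTube’s actual ranking algorithm; it simply shows how sorting purely by likes can push minority viewpoints out of sight.

```python
# A minimal sketch (not YouTube's actual algorithm) of like-count comment
# ranking: comments are sorted purely by how many likes they have received,
# so a viewpoint favoured by the viewing audience fills the visible slots.

from dataclasses import dataclass

@dataclass
class Comment:
    text: str
    stance: str   # 'agrees' or 'challenges' the video's claims (illustrative labels)
    likes: int    # hypothetical like counts

comments = [
    Comment("He's telling the truth!", "agrees", 950),
    Comment("This is a baseless conspiracy theory.", "challenges", 40),
    Comment("Finally someone says it.", "agrees", 720),
    Comment("There is no evidence for any of this.", "challenges", 55),
    Comment("Wake up, people.", "agrees", 610),
]

VISIBLE_SLOTS = 3  # hypothetical number of comments shown before scrolling

ranked = sorted(comments, key=lambda c: c.likes, reverse=True)

print("Top comments by likes:")
for c in ranked[:VISIBLE_SLOTS]:
    print(f"  [{c.likes:4d} likes, {c.stance}] {c.text}")

# When the viewing audience mostly likes agreeing comments, every visible
# slot is filled by the majority view; challenging comments sink out of sight.
```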

With co-author Tanvi Joshi, Dr Allington analysed over 1,000 comments on a YouTube video in which professional conspiracy theorist David Icke purported to expose members of a non-existent ‘Rothschild Zionist’ secret society.

Icke’s statements in the video were very similar to claims which antisemitic right-wing extremists have been making since the early 20th century. User comments that agreed with Icke’s point of view, or that expressed antisemitic opinions of their own, got many more ‘likes’ than user comments that disagreed with Icke. The result was that comments that challenged Icke’s worldview were buried beneath a mass of comments that accepted his fantasies as reality.

This research helps us to understand one of the reasons why YouTube has become such a breeding ground for conspiracy theories. Although most people can see straight through conspiracy theorists like David Icke, there are some who get taken in by their fantasies. By moving popular comments to the top, YouTube enables conspiracy believers to silence anyone who challenges their simplistic view of the world.

Dr Allington

Dr Allington argues that while social media companies such as YouTube should certainly be doing more to drive harmful misinformation off their platforms, that alone will not be enough. “Social media platforms also have to think about how their user design encourages people to interact with content, and with each other,” he said.

“Turning everything into a popularity contest does nothing to encourage critical thinking.”

You can read the full paper in the Journal of Contemporary Antisemitism.

In this story

Dr Daniel Allington

Reader in Social Analytics