Research shows the emerging video platform recommends conspiracy theories and other harmful content more often than not.

“I’m not really expecting things to ever be what they were,” says Sarah. “There’s no going back.” Sarah’s mother is a QAnon believer who first came across the conspiracy theory on YouTube. Now that YouTube has taken steps toward regulating misinformation and conspiracy theories, a new site, Rumble, has risen to take its place. Sarah feels the platform has taken her mother away from her.

Rumble is “just the worst possible things about YouTube amplified, like 100 percent,” says Sarah. (Her name has been changed to protect her identity.) Earlier this year, her mother asked for help accessing Rumble when her favorite conservative content creators (from Donald Trump Jr. to “Patriot Streetfighter”) flocked from YouTube to the site. Sarah soon became one of 150,000 members of the support group QAnon Casualties as her mother tumbled further down the dangerous conspiracy theory rabbit hole.

Between September 2020 and January 2021, monthly site visits to Rumble rose from 5 million to 135 million; as of April, they were sitting at just over 81 million. Sarah’s mother is one of these new Rumble users and, according to Sarah, is now refusing to get the Covid-19 vaccine. To explain her decision, Sarah says, her mother cites the dangerous anti-vax disinformation found in many videos on Rumble.

Rumble claims that it does not promote misinformation or conspiracy theories but simply has a free-speech approach to regulation. However, our research reveals that Rumble has not only allowed misinformation to thrive on its platform, it has also actively recommended it.

If you search “vaccine” on Rumble, you are three times more likely to be recommended videos containing misinformation about the coronavirus than accurate information. One video by user TommyBX featuring Carrie Madej—a popular voice in the anti-vax world—alleges, “This is not just a vaccine; we’re being connected to artificial intelligence.” Others baselessly state that the vaccine is deadly and has not been properly tested.

Even if you search for an unrelated term, “law,” our research shows you are just as likely to be recommended Covid-19 misinformation as accurate content: about half of the recommended videos are misleading. If you search for “election,” you are twice as likely to be recommended misinformation as factual content.

Chart courtesy of Ellie House, Isabelle Stanley, and Alice Wright; created with Datawrapper

The data behind these findings was gathered over five days in February 2021. Using an adaptation of code first developed by Guillaume Chaslot (an ex-Google employee who worked on YouTube’s algorithm), we collected information about which videos Rumble recommends for five neutral search terms: “democracy,” “election,” “law,” “coronavirus,” and “vaccine.” The code was run five times for each word, on different days and at different times, so that the data reflected Rumble’s consistent recommendation behavior rather than a one-off snapshot.
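The crawler used for this investigation isn’t reproduced here, but a minimal sketch of the collection step, written in Python, might look like the following. It assumes Rumble’s search results can be fetched over plain HTTP and parsed out of the page HTML; the URL pattern and the CSS selector are illustrative placeholders rather than Rumble’s documented interface, and the real adaptation of Chaslot’s code may work quite differently.

```python
# Illustrative sketch only. The study adapted code first written by
# Guillaume Chaslot for YouTube; the URL and selector below are
# hypothetical placeholders, not Rumble's actual markup or API.
import requests
from bs4 import BeautifulSoup

SEARCH_TERMS = ["democracy", "election", "law", "coronavirus", "vaccine"]
BASE = "https://rumble.com"

def collect_recommendations(term):
    """Fetch the results page for one search term and record each video surfaced."""
    resp = requests.get(f"{BASE}/search/video", params={"q": term}, timeout=30)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    videos = []
    for link in soup.select("a.video-item--a"):  # placeholder selector
        videos.append({
            "term": term,
            "title": link.get("title", "").strip(),
            "url": BASE + link.get("href", ""),
        })
    return videos

# The study repeated each query five times, on different days and at
# different times, before the resulting recommendations were coded by hand.
all_recommendations = []
for term in SEARCH_TERMS:
    all_recommendations.extend(collect_recommendations(term))
print(f"Collected {len(all_recommendations)} recommendations")
```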

Over 6,000 recommendations were manually analyzed. There can be disagreements about what can and cannot be classed as misinformation, so this investigation erred on the side of caution. For example, if a content creator said “I won’t take the vaccine because I think there might be a tracking chip in it,” the video was not categorized as misinformation; if a video flatly stated “there is a tracking device in the vaccine,” it was. Our conclusions are therefore conservative.
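The coding itself was done by hand, but the cautious rule described above can be restated, very roughly, in code. The keyword lists below are invented for illustration; they stand in for a human judgment about whether a video hedges a claim as personal opinion or asserts it as fact.

```python
def counts_as_misinformation(statement: str) -> bool:
    """Rough, illustrative restatement of the manual coding rule: hedged
    personal opinions are not counted as misinformation, while flat
    assertions of a false claim are. The real analysis relied on human
    reviewers, not keyword matching."""
    text = statement.lower()
    hedges = ("i think", "i believe", "might", "maybe", "in my opinion")
    if any(h in text for h in hedges):
        return False  # treated as opinion, so excluded under the cautious rule
    false_claims = ("there is a tracking device in the vaccine",)
    return any(claim in text for claim in false_claims)

# The two examples from the investigation:
print(counts_as_misinformation(
    "I won't take the vaccine because I think there might be a tracking chip in it"))  # False
print(counts_as_misinformation("There is a tracking device in the vaccine"))  # True
```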

For three of the five search terms used, “vaccine,” “election,” and “law,” Rumble is more likely than not to recommend videos containing misinformation. Even for the other two, “democracy” and “coronavirus,” the likelihood of Rumble recommending misleading videos remains high.

This data was tracked almost a year into the pandemic, after more than 3 million deaths worldwide had made it far more difficult to maintain that the virus is fake. It’s possible that searching for “coronavirus” on Rumble would have resulted in much more misinformation at the start of the pandemic.

Recommendation algorithms play a significant role in determining what users watch. According to Samuel Woolley, director of propaganda research at the University of Texas at Austin’s Center for Media Engagement, algorithms “tend to lead people disproportionately toward extremist content.” Video streaming sites often promote misinformation and conspiracy theories, because it’s profitable. “Revenue is directly linked to the time people spend online,” Chaslot says. It’s “like heroin.” And most of the growth of extreme or misleading accounts is down to promotion from these sites—“users and creators have zero control over the algorithm.” Sarah believes that YouTube’s recommendation system introduced her mother to the QAnon conspiracy theory, and now Rumble’s is keeping her hooked.

When asked for comment, a Rumble spokesperson wrote via email, “Rumble has strict moderation policies banning the incitement of violence, illegal content, racism, antisemitism, promoting terrorist groups (designated by US and Canadian governments), and violating copyright, as well as many other restrictions.”

Rumble claims it is bipartisan, but it chose the 2021 Conservative Political Action Conference to debut its new livestream tool, where keynote speaker Donald Trump reiterated false claims that he’d won the 2020 election. Rumble also has powerful backers with deeply conservative views and high-profile right-wing influencers who seek to use the site to disseminate factually inaccurate content. One of the company’s key investors is popular influencer Dan Bongino, whom The New York Times lists as a misinformation superspreader.

Chris Pavlovski, Rumble’s founder and CEO, makes his conservative leanings known on social media. Recent tweets directly engage with conservative thinkers and commentators, such as the controversial professor Jordan Peterson and the constitutional lawyer Alan Dershowitz. When former congressman Ron Paul was chastised by YouTube for spreading medical misinformation, Pavlovski used his Twitter account to ask him to join Rumble. The company’s official account often retweets Pavlovski’s personal tweets, suggesting a close relationship between his views and Rumble’s commercial direction.

Although Rumble’s company accounts are private, it appears to be profiting from the shift of conservative content creators away from YouTube. In November, Pavlovski told Fortune that the company was financially “self-sustaining.” He regularly tweets using the hashtag #MakeTheSwitch, encouraging users to move their content onto his site. In January, after the Capitol riots, Rumble filed a $2 billion antitrust lawsuit against YouTube’s parent company, Google, alleging that the search engine purposefully promotes YouTube’s videos above Rumble’s. (Google denies what it dubs “baseless claims” by Rumble.)

Whatever Rumble’s official policy, the site’s content and its active recommendation algorithm are having a negative impact on the lives of people like Sarah. Yet the debate over how to regulate information on the internet rages on.

Woolley says that sites like Rumble and Parler should not and cannot simply be taken off the internet, instead arguing that governments and organizations like the United Nations and NATO should regulate these spaces “in terms of specific issues.” If a site hosts extremist content or allows for electoral disinformation, Woolley says, “then those are things that they should regulate and litigate on.”

Imran Ahmed, CEO of the Center for Countering Digital Hate, supports a more hardline approach. “You can say whatever you want, but you can’t say it wherever you want,” he says, “and the freedom of association is just as fundamental as the freedom of speech. People have the right to say ‘I don’t want to do business with you.’” In the end, he adds, “regulation is coming, and all the social media companies have accepted that.”

Anti-regulation advocates of a “free speech” defense, says Woolley, take a “utopian view of how communication exists.” He says that “a lot of these digital platforms that claim to be all about ‘free speech’ are based upon a faulty premise to begin with, because there have always been limitations on speech.”

“The internet has been normalized and co-opted by powerful entities,” Woolley adds. “It’s not as if it’s just a grassroots organization.” For example, Parler has funding from the Mercer family, prominent US conservative lobbyists who were involved with Cambridge Analytica. “When you start to unpack some of the funding and power structures involved, you start to realize the internet is not the space these people are touting it to be,” Woolley says.

Chaslot agrees that “censoring or removing videos is counterproductive, because it pushes [those with extreme views] to other platforms.” To combat online misinformation, he argues, each platform, whether YouTube or a site like Rumble, “should be more transparent about its algorithm.” Chaslot’s AlgoTransparency organization continues to campaign for this, but in the meantime, he says, “we need regulation.”

Citing the misinformation that spread on Parler and Rumble and contributed to the insurrection at the US Capitol on January 6, Chaslot adds that making “a recommendation is not neutral; you have a stake in the game.”

