A warning: Conspiracy theories about covid are helping to disseminate antisemitic beliefs to a wider audience, warns a new report by anti-racist advocacy group Hope not Hate. The report says the pandemic has not only revived interest in the “New World Order” conspiracy theory, which alleges that a secret Jewish-run elite is seeking to control the world, but that far-right activists have also worked to convert people’s anti-lockdown and anti-vaccine beliefs into active antisemitism.
Worst offenders: The authors easily found antisemitism on all nine platforms they investigated, including TikTok, Instagram, Twitter, and YouTube. Some of it uses coded language to evade detection and algorithmic moderation, but much of it is overt and easily discoverable. Unsurprisingly, the authors found a close link between the amount of antisemitism on a platform and how lightly it is moderated: the laxer the moderation, the bigger the problem.
Some specifics: The report warns that messaging app Telegram has rapidly become one of the worst offenders, playing host to many channels that disseminate antisemitic content, some of them boasting tens of thousands of members. One channel that promotes the New World Order conspiracy theory has gained 90,000 followers since its inception in February 2021. However, it’s a problem on every platform. Jewish creators on TikTok have complained that they face a deluge of antisemitism on the platform, and are often targeted by groups who mass-report their accounts in order to get them temporarily banned.
A case study: The authors point to one man who was radicalized during the pandemic as a typical example of how people can end up being pushed toward ever more extreme views. At the start of 2020, Attila Hildmann was a successful vegan chef in Germany, but in the space of just a year he went from being ostensibly apolitical, to a social media influencer “just asking some questions,” to spewing hate and inciting violence on his own Telegram channel.
What can be done: Many of the platforms investigated have had well over a decade to get a handle on moderating hate speech, and some progress has been made. But while major platforms have become better at removing antisemitic organizations, they are still struggling to remove antisemitic content produced by individuals, the report warns.