
According to a new report, Twitter is a breeding ground for terrorism and hate speech

Regular Twitter users can attest that a short, hateful comment can escalate and cause mass hysteria – just ask the public figures who have had to issue apologies to their hordes of followers for poorly thought-out tweets. There are also those who are not sorry at all. This all supports what a recent report by the Simon Wiesenthal Center (SWC) – a human rights organization – says on the issue of online hate: Of the many social media platforms available on the Internet today, Twitter is the easiest place to propagate hate and terror.

The SWC annually prepares a Digital Terrorism and Hate report that tracks and reveals the actions of the many hate groups active across various social media outlets. This year the group issued “grades” to social networking sites based on online hate activity. To streamline efforts to thwart attempts by terrorists and hate groups to manipulate digital technologies for evil, the SWC made its report available through a password-protected app accessible to government agencies, policy makers, and law enforcement. The app amasses information from thousands of websites, social networking pages, forums, games, and apps, and updates registered users about emerging terrorist threats.

Anybody with overexposure to Facebook can find pages dedicated to hate and terrorism, but according to SWC research team member Rick Eaton, Facebook actually does a reasonably good job of either removing them or, in some cases, catching them as they appear. The main hurdle is actually finding the offensive content in the first place. “Once [Facebook is] notified of the existence (of hateful content) – by ourselves or other groups or users – they do get them off [the site],” Eaton told Digital Trends. “Some things such as ‘Holocaust Denial’ are considered ‘discussion’ and are usually left alone unless they get out of hand, which usually happens at some point. We give them an A-.”

What about YouTube? Eaton said the group rated the video-streaming site a C. “They have taken off some terrorism material such as Anwar Al Awlaki’s videos, but usually only after it has become a major issue (Awlaki was killed in September 2011 by an American drone in Yemen),” continued Eaton. Offensive videos also have a way of reappearing on the site, and as far as the SWC is concerned, YouTube has yet to find a way to deal with that problem directly. “There is an immense amount of how-to stuff on YouTube, not necessarily posted by Jihadists, but [they are] all the same. Cell phone detonators, how to make flamethrowers, napalm … it is all there. Also a lot of things in Arabic and other languages promoting terrorism, as well as an immense amount of hate related material.”

If you thought that was bad, it gets worse: According to the report, in the last six months Twitter has exploded (no pun intended) with so much content that qualifies as hate and terrorism that the SWC gave it an F rating. Terrorist groups, hate groups, and other extremists regularly use Twitter to post links to material including the English-language magazine Inspire and its offshoot, The Lone Mujahid Pocketbook (LMP). Both contain how-to guides, including instructions for constructing a pressure-cooker bomb similar to the one used in the Boston Marathon bombings. According to Eaton, LMP was released around two weeks before the Boston bombing, and a blog containing a link to the PDF was announced on Twitter. It was also around that time that Al Qaeda Airlines #4 – a magazine written in a mix of English and Arabic – was released; it runs over 600 pages and includes a detailed explanation of hydrogen cyanide (the same agent used in Zyklon B in Nazi gas chambers) and how it could be used to attack a building. Below is an image from the magazine, followed by a translation courtesy of the SWC:

Image courtesy of The Simon Wiesenthal Center

“In this issue of our magazine, we will talk about one of those scenarios which the West fears. What we mean here is a study regarding an individual attack which would be carried out by the mujaheddin (the terrorists) against a crusader target, which will cause a lot of damage; and understanding the mistakes which occur during the attack.

The scenarios are many and diverse. As long as there is a country which takes part in a war against Islam and Muslims, we will consider it a target and [a target for] a possible attack which will take place, in the near or far future.

To return then to the proposed scenario, luckily, it is one of the superior attacks. The attack which was chosen is [an attack using] cyanide hydrogen, since it is easy for a regular person to find the essential materials and [similarly easy] to turn it into an active poisonous gas, as was shown in the prior pages of this magazine. The target is a large office building which includes offices. This building may be an Embassy or a federal investigations office (FBI) or a weapon company which helps kill Muslims. The building itself is less important, as long as it is locked and as long as it is a closed building and not open. The reason is [to retain] the gas’s ability to be lethal and its lethal concentration. [The gas loses this ability] after no longer than ten minutes, since it rises quickly to the upper levels of the atmosphere.”

“We find literally dozens of new profiles each week … to my knowledge, Twitter has only removed a couple of feeds,” says Eaton, explaining Twitter’s F rating. “The one terrorist site removed was Al Shabaab Mujahideen, a Somali group that used Twitter to threaten the lives of hostages they hold. The feed was removed in December of last year and reappeared about two weeks later.”

The report also mentions other social media outlets that are minimally used for propagating global hate and terrorism, namely Tumblr, Instagram, and VK.com, a Russian website similar to Facebook.

So what can we regular social media users do to help make the Internet a better place (and restore humanity’s decency)? If you ever come across a site or page with even a smidgen of terrorist or hate potential, do not hesitate to hit the Report button. For the time being, that’s all you or I can do. Twitter, on the other hand, finds itself in a more difficult position: the network has tried to walk the fine line between censorship and propagating damaging material, and deciding what to do with its endless feed is only going to get tougher.
