The start of the new year is generally a cause for celebration, but that may not be the case for a number of social media companies in Germany. 2018 rang in a new era for the nation’s laws around hate speech: on January 1, the country began enforcing strict regulations that could result in fines of up to 50 million euros (roughly $60 million) if offending posts are not removed within 24 hours of being flagged. The new rules could affect a number of major players in the social media and media space, including Facebook, Twitter, and YouTube.
Any social network or media company with more than two million registered users in Germany is on the hook under the new provisions, which means platforms like Reddit, Tumblr, Vimeo, Flickr, and even the Russian social network VK will likely be affected.
While the Netzwerkdurchsetzungsgesetz (NetzDG) was actually passed last summer and went into effect in October 2017, Germany gave companies until the end of the year to properly equip themselves to handle hate speech reports. Now, three months on, the nation expects large social networks to have the tools they need to combat fake news, racist posts, and other bigoted messages on public platforms.
A number of social media sites have already attempted to cut down on the spread of false reports on their platforms. Facebook, for example, rolled out its fake news identification tools at the beginning of 2017, and claimed that its efforts were already having a moderating effect. Journalists, however, weren’t so sure about Facebook’s self-reported success rates.
Under NetzDG, however, the stakes will be much higher. And not everyone is thrilled about the stringent new laws. Some in Germany (and around the world) worry that the provisions could result in censorship or infringe upon free speech. But Germany is far from the only country to criticize social media platforms for their role in spreading false information and otherwise unsavory material — lawmakers in the U.K., for example, have said that these networks are “shamefully far” from adequately addressing hate speech and problematic content.
“We’re committed to being part of the solution to illegal hate speech and extremist content online — around the world, and in Germany, working within its new legal framework,” a YouTube spokesperson told CNET in an emailed statement. “We’ll continue to invest heavily in teams and technology to allow us to go further and faster in removing content that breaks our rules or German law, and by working with government, law enforcement, civil society groups, and other companies.”