Facebook has reinstated an iconic Vietnam War photo after repeatedly removing it for violating its restrictions on nudity. The company now claims that it is changing its review mechanism to ensure that the historic “napalm girl” photo can be shared freely in the future.
Facebook’s reversal could set a precedent for its blanket policy on nudity, which has been challenged in the past.
The photograph, depicting a naked girl in the aftermath of a napalm attack during the Vietnam War, was initially uploaded to the platform by Norwegian writer Tom Egeland. Facebook responded by quickly removing the image and suspending Egeland. When Norwegian newspaper Aftenposten picked up the story and shared it to Facebook with the corresponding image, it was met with the same response. “Any photographs of people displaying fully nude genitalia or buttocks, or fully nude female breast, will be removed,” read the notice from Facebook.
Outraged, the newspaper published a front-page open letter to the social network, in which it accused CEO Mark Zuckerberg of “abusing [his] power” as the “world’s most powerful editor.”
“I am worried that the world’s most important medium is limiting freedom instead of trying to extend it, and that this occasionally happens in an authoritarian way,” wrote Aftenposten’s editor-in-chief Espen Egil Hansen.
The backlash grew on Friday when Norwegian Prime Minister Erna Solberg claimed Facebook had also deleted the image from her own popular Facebook page. “What they do in removing such pictures, whatever their reasons, is to edit our common history,” remarked Solberg.
Earlier today, Facebook ended its public silence by responding to the controversy with a lengthy statement of its own. Breaking down its review policy, the company said: “An image of a naked child would normally be presumed to violate our Community Standards, and in some countries might even qualify as child pornography.”
“In this case, we recognize the history and global importance of this image in documenting a particular moment in time,” it added. “Because of its status as an iconic image of historical importance, the value of permitting sharing outweighs the value of protecting the community by removal, so we have decided to reinstate the image on Facebook.”
Policing its massive social network, which now has approximately 1.65 billion users, is proving a daunting task for Facebook — the platform has already faced allegations of bias this year. Unlike Twitter, the social network has never claimed to be a bastion of free speech and expression, instead relying on strict rules and regulations concerning what people can or cannot share, as stated within its Community Standards. In its own words, the company is simply trying to block pornography and abuse.
“We restrict the display of nudity because some audiences within our global community may be sensitive to this type of content — particularly because of their cultural background or age,” states Facebook in its guidelines on nudity. It adds, however, that nudity is permitted under the guise of “art,” such as “photographs of paintings, sculptures and other art that depicts nude figures.”
Increasingly, Facebook has shed its reliance on human editors to monitor what has become the world’s biggest social platform. Instead, it is gradually automating the process, using algorithms to rank its News Feed, tag photos, and curate its Trending news section.
In the past, Facebook has faced fierce opposition to its nudity policy. In 2008, the company was forced to lift its ban on photos of breast-feeding mothers. Instagram, which is owned by Facebook, previously claimed that its own unwavering ban on nudity directly reflects the conditions imposed by Apple’s App Store, which places age ratings on apps that permit adult material. Critics of the ban point out that a relatively unrestricted social platform such as Twitter continues to operate with a 4+ age rating on the App Store, despite containing images of hardcore pornography.