
Facebook: We were wrong to remove emotional father-son photo

Facebook found itself in a bit of a sticky situation over its decision to remove this image from the social network not once, but twice, despite the image not breaking any of the site’s community standards. The image in question shows a father in the shower comforting his young son, who was ill with salmonella poisoning. It was captured by photographer Heather Whitten, the child’s mother, and shared with her friends and family on Facebook.

Whitten took to Facebook to complain, posting the image for a third time along with a long description of the events depicted in the image and her frustration with it being removed, despite not breaking any rules. The image, again, received a ton of support and, as of this writing, sits at more than 147,000 likes, 33,000 shares, and 21,000 comments.

“My family may be different than yours. But, that doesn’t make your way right or my way wrong,” Whitten explained in the post. “You may never take images of your family like I do… you may never share images of your family like I do. But, that doesn’t give you the right to silence my voice. To take away my right to share our experiences in an uncensored way.”

The image has received an outpouring of support from the greater Facebook community, and hundreds of thousands of likes and comments, right from the very beginning. Yet a vocal minority has questioned the image and the appropriateness of displaying such content in public. They have a right to that opinion, but the mass reporting of the image caused it to be removed twice by Facebook’s enforcement team, without any notice or reason. The only conclusion is that it had broken one of the social network’s community standards, but as we noted already, it hadn’t.

It is just the latest example of Facebook’s enforcement team failing to take the context of an image into account after receiving reports about the content shown within it. As we mentioned above, none of Facebook’s very clearly spelled-out community standards were violated by the image. But Facebook, to its credit, is admitting its mistake, and, in the company’s defense, the photo could easily be misconstrued when viewed without context. “This photo was mistakenly removed by our team and does not violate our Community Standards. We are sorry for this mistake and have restored the photo to the page,” a Facebook spokesperson said in a statement to Digital Trends.

Facebook, in error, removed this image from the social network twice due to mass reporting. Heather Whitten

These sorts of things are bound to happen when a large group of people congregates in the same place. Points of view will differ and opinions will contend with each other. We don’t envy Facebook’s enforcement team, who have to go through the content that is reported on a daily basis and make decisions one way or another. It appears that in this instance, although the image was removed (twice), the outcome has been favorable, with the image being restored.

But in many situations, the result is less favorable. All we can hope for is that Facebook improves its training and education so that in the future, enforcement staff can make the correct call.

Anthony Thurston
Anthony is an internationally published photographer based in the beautiful Pacific Northwest. Specializing primarily in…