
Social Feed: Facebook asks for hate speech help; YouTuber in jail for fake news

Social media is a fluid technology — nearly every day, the major networks announce a big change, come under fire for the latest controversy, or move forward in smaller ways. Social Feed is a collection of those smaller changes you may have missed, alongside this week’s biggest news — like Facebook’s long list of F8 announcements, Twitter’s news update, Snapchat’s slowed growth, group video calls for WhatsApp and Instagram, and Instagram’s payments test. Find Social Feed every weekend for the latest social news tidbits.

Just how long did Facebook users watch that video? New metrics rolling out for creators

Creators on Facebook are gaining enhanced tools to see just how long their videos held viewers’ attention. On Thursday, May 3, Facebook announced additional metrics for the existing video retention data inside Pages. The enhanced tool now compares viewers who follow the Page with those who don’t, and adds gender demographics. The update also brings a zoom option for a more detailed look at the charts. Facebook also said it fixed a bug that caused inaccurate data for videos longer than two minutes.

YouTuber gets a month in prison for fake news

Malaysia has a new law against fake news — and now the first person prosecuted under that law is behind bars, according to The Guardian. YouTuber Salah Salem Saleh Sulaiman pleaded guilty after posting a video misrepresenting police response time to a death in Malaysia’s capital — claiming 50 minutes rather than the eight minutes police reported. The YouTuber apologized during the hearing. The judge ordered him to pay a fine of 10,000 ringgit (about $2,537), but The Guardian said he opted for a month in jail “because he could not pay.”

Facebook test asks users for help spotting hate speech

Facebook is continually looking for ways to spot posts that violate its community guidelines, and a test users spotted this week suggests the social media giant is expanding its tools for catching hate speech. Users tweeted screenshots of the question “Does this post contain hate speech?” appearing under every post, and Facebook has confirmed it is a test. Because it is only a test, it’s unclear whether Facebook is trying to use those responses to train an artificial intelligence system to spot hate speech, one of the tougher categories to teach an A.I., or whether the network is simply creating an easier way for users to report hate speech.

Facebook is getting audited for biases

After facing claims of political bias and lawsuits alleging discrimination, Facebook is addressing those allegations by welcoming third-party audits. The company will undergo both a civil rights audit and a review of political bias led by a former Republican senator. There is no word yet on when those audits will be completed. The reviews follow continued accusations from conservatives that the platform unfairly limits the reach of their posts. Facebook is also facing a lawsuit over housing ads that used audience metrics such as race.
