Amid scandals over ad discrimination and hate speech, Facebook is launching a series of changes. Civil rights leader Laura Murphy recently finished the company’s second civil rights audit, and referred to the changes as a “systematic, cross-functional framework to address these issues over time.” Critics, however, have already voiced concerns that the platform isn’t doing enough to tackle the issues.
The report, the second in the series after an initial report in December 2018, focuses on the social network's enforcement against hate speech, discrimination in ads, and handling of misinformation. A third and final report is expected in early 2020. As part of the audit, Murphy talked with more than 90 civil rights organizations, as well as Facebook leaders and policy teams. The report identifies both the changes Facebook is making and areas for further improvement.
The report recognizes and encourages Facebook's efforts to fight hateful content, including white supremacism. The platform expanded its ban on white nationalism this spring, but the audit urges the network to also tackle white nationalist ideology in posts that don't include easy-to-flag terms like "white nationalism" and "white separatism."
Facebook COO Sheryl Sandberg says the company is addressing those suggestions by working to identify hate slogans and other symbols tied to white nationalism and white separatism in order to remove more content falling under those categories. Facebook also recently banned events designed to intimidate or harass, she says.
Facebook is working on policies and efforts to find and remove such content. In the discussions with civil rights groups, the network also acknowledged that some content designed to fight hate and discrimination is incorrectly removed by its automated systems. The company is working to expand its human content moderation team with a group that focuses solely on hate speech, an effort that will start with a pilot program in the U.S.
The audit also focuses on discrimination inside the ads platform. Last year, Facebook removed several targeting options for all advertisers, including race, ethnicity, sexual orientation, and religion. Upcoming changes will limit targeting options even further for ads related to housing, jobs, and credit (an area that sparked a lawsuit), removing options like age, gender, and zip code.
A third set of changes treats the 2020 census like an election, recognizing that census interference could alter how federal funds are distributed. Facebook will launch a new policy on census information this fall. The company is building a team to craft that policy, while enforcement will be handled by artificial intelligence software. Elections will see further changes as well, including a new political ads policy in place and enforced before the 2019 gubernatorial elections.
Finally, Facebook is making permanent the civil rights task force created during the previous audit. The group, which includes several leaders from across the company, will continue to examine areas like content policy, privacy, and elections, Sandberg said. The company's leaders will also receive additional civil rights training.
“Since I initiated my audit of Facebook, I’ve observed a greater willingness for Facebook to listen, adapt, and intervene when it comes to civil rights,” Murphy wrote in the audit report. “I do believe the company is in a different place today than it was a year ago on these issues — I’ve seen employees asking the right questions, thinking about things through a civil rights lens, and poking holes in things on the front end of products and policies. While this audit report shows Facebook is making progress, there is still much more work to do—and Facebook acknowledges this. As the company’s work in this area continues, I want this effort to be even more robust and deeply embedded in the company’s DNA.”
While the report details several changes Facebook is making, critics argue those changes simply aren't enough. Leaders from the Center for American Progress say the report doesn't do enough to call out deficiencies in removing hateful content and argue that Facebook needs more transparency about the process.
After the shootings in Christchurch, New Zealand, were broadcast live on the network, the organization Muslim Advocates said Facebook isn't doing enough to tackle threats of violence. The group suggests the same technology that removes child pornography and ISIS rhetoric could be deployed against white nationalist content.
Organizations involved in the talks also pointed to a lack of diversity at Facebook itself, with several arguing the audit doesn't go far enough.