Facebook turns off ‘racist’ ad filter after backlash
The controversial filter was first discovered in late October by researchers at ProPublica, who successfully placed a housing ad that excluded African-Americans, Asians and Hispanics using the “ethnic affinity” feature built into Facebook’s advertising algorithms.
The social media giant initially argued the filters were “intended to be inclusive” and help advertisers reach people using “multicultural advertising,” according to spokespeople at Menlo Park.
On Friday, however, Facebook’s vice president of US public policy, Erin Egan, announced that the company would scrap the filter for categories where it might violate federal non-discrimination rules.
“We are going to turn off, actually prohibit, the use of ethnic affinity marketing for ads that we identify as offering housing, employment and credit,” Egan told USA Today.
Facebook will also require advertisers to confirm they will not place discriminatory ads, and offer educational materials to help advertisers understand their obligations under law, Egan said.
The decision came after meetings with New York Attorney General Eric Schneiderman, representatives of the Congressional Black Caucus and the Congressional Hispanic Caucus, and officials from the Department of Housing and Urban Development, which had communicated its “serious concerns” about the filter to Facebook.
"In light of the concerns that have been raised, we are taking this step," said Egan.
Last week, a group of Facebook users filed a lawsuit against Facebook and sought class-action status, arguing that the filter violated the 1964 Civil Rights Act and the 1968 Fair Housing Act.
"There is no mechanism to prevent ad buyers from purchasing ads related to employment/housing and then excluding based on these illegal characteristics," the plaintiffs wrote in the complaint, filed in the US District Court for the Northern District of California. Facebook said the lawsuit was without merit.
Opinion: Facebook's race-targeting ads could actually be necessary to prevent discrimination https://t.co/37ZTRAucGM — WIRED (@WIRED) November 10, 2016
Some statisticians, however, believe that ethnic filters may be a vital component of building fair algorithms. According to Seth Neel, a doctoral candidate at the University of Pennsylvania, “there are also good reasons why this type of targeting might not always be racist, and could even be necessary to prevent discrimination.”
To ensure fair outcomes, algorithms need to include information about ethnicity or race; otherwise, their determinations would be inherently biased towards the majority, Neel argued in an essay published Thursday by Wired.
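Neel's argument can be illustrated with a minimal sketch, using entirely hypothetical data: auditing an ad-delivery outcome for a common fairness criterion such as demographic parity requires knowing each user's group membership in the first place. The data, group labels, and function names below are invented for illustration, not drawn from Facebook's systems.

```python
# Hypothetical outcomes: 1 = user was shown the ad, 0 = user was not.
shown = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
# Hypothetical sensitive-attribute labels for the same ten users.
group = ["A", "B", "A", "A", "B", "A", "B", "B", "A", "A"]

def show_rate(g):
    """Fraction of users in group g who were shown the ad."""
    outcomes = [s for s, grp in zip(shown, group) if grp == g]
    return sum(outcomes) / len(outcomes)

# Demographic parity gap: the difference in show rates between groups.
# Without the group labels, this quantity cannot even be computed,
# let alone corrected for -- which is the crux of Neel's point.
gap = abs(show_rate("A") - show_rate("B"))
print(round(gap, 3))  # 0.333
```

A gap of zero would mean both groups see the ad at the same rate; deleting the sensitive attribute makes the system blind to the gap, not free of it.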