11 Nov, 2016 19:09

Facebook turns off ‘racist’ ad filter after backlash

Facebook is getting rid of the controversial “ethnic affinity” filter for advertisers after accusations, including in a pending lawsuit, that it enabled Jim Crow-like discrimination and violated federal law.

The controversial filter was first discovered in late October by researchers at ProPublica, who set out to place – and succeeded in placing – a housing ad excluding African-Americans, Asians and Hispanics using the “ethnic affinity” feature built into Facebook’s advertising algorithms.

The social media giant initially argued the filters were “intended to be inclusive” and help advertisers reach people using “multicultural advertising,” according to spokespeople at Menlo Park.

On Friday, however, Facebook’s vice president of US public policy, Erin Egan, announced that the company would scrap the filter for categories where it might violate federal non-discrimination rules.

“We are going to turn off, actually prohibit, the use of ethnic affinity marketing for ads that we identify as offering housing, employment and credit,” Egan told USA Today.

Facebook will also require advertisers to confirm that they will not place discriminatory ads and will offer educational materials to help them understand their obligations under the law, Egan said.

The decision came after meetings with New York Attorney General Eric Schneiderman, representatives of the Congressional Black Caucus and the Congressional Hispanic Caucus, and officials from the Department of Housing and Urban Development (HUD), which had communicated its “serious concerns” about the filter to Facebook.

"In light of the concerns that have been raised, we are taking this step," said Egan.

Last week, a group of Facebook users filed a lawsuit against Facebook and sought class-action status, arguing that the filter violated the 1964 Civil Rights Act and the 1968 Fair Housing Act.

"There is no mechanism to prevent ad buyers from purchasing ads related to employment/housing and then excluding based on these illegal characteristics," the plaintiffs wrote in the complaint, filed in the US District Court for the Northern District of California. Facebook said the lawsuit was without merit.

Some statisticians, however, believe that ethnic filters may be a necessary part of algorithm design if fairness is to be achieved. According to Seth Neel, a doctoral candidate at the University of Pennsylvania, “there are also good reasons why this type of targeting might not always be racist, and could even be necessary to prevent discrimination.”

To ensure fair outcomes, algorithms need to take information about ethnicity or race into account; otherwise their determinations would be inherently biased toward the majority, Neel argued in an essay published Thursday by Wired magazine.
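
As a rough illustration of Neel’s point (a hypothetical sketch, not taken from his essay or from Facebook’s systems): even the most basic disparate-impact check only works if the data records which group each person belongs to. Strip that information out and the gap becomes invisible to the algorithm and to anyone auditing it.

```python
# Hypothetical sketch: the data, group labels and selection_rates helper are
# invented for illustration; they are not from Facebook's tools or Neel's essay.

def selection_rates(decisions, groups):
    """Fraction of positive decisions (e.g. ad shown, loan approved) per group."""
    totals, positives = {}, {}
    for decision, group in zip(decisions, groups):
        totals[group] = totals.get(group, 0) + 1
        positives[group] = positives.get(group, 0) + decision
    return {g: positives[g] / totals[g] for g in totals}

# Toy outcomes: 1 = positive decision, 0 = negative.
decisions = [1, 1, 0, 1, 0, 0, 1, 0]
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]

print(selection_rates(decisions, groups))
# {'A': 0.75, 'B': 0.25} -- a 3:1 gap that cannot be measured, let alone
# corrected, if the group labels are never collected.
```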
