Meta agrees to change ad technology in settlement with US

SAN FRANCISCO — Meta on Tuesday agreed to change its ad technology and pay a $115,054 fine in a settlement with the Justice Department over claims that the company’s ad systems discriminated against Facebook users by restricting who could see housing ads on the platform based on their race, gender and zip code.

As part of the deal, Meta, the company formerly known as Facebook, said it would change its technology and adopt a new computerized method aimed at periodically verifying that those who are targeted and eligible to receive housing ads are actually seeing those ads. Dubbed the “variance reduction system,” the new method uses machine learning to ensure that advertisers are delivering housing-related ads to the specific protected groups of people they intend to reach.

“Meta will — for the first time — change its ad delivery system to address algorithmic discrimination,” Damian Williams, the US attorney for the Southern District of New York, said in a statement. “But unless Meta can demonstrate that it has modified its delivery system sufficiently to avoid algorithmic bias, this office will proceed with the litigation.”

Facebook, which became a business behemoth by collecting its users’ data and allowing advertisers to target ads based on an audience’s characteristics, has faced complaints for years that some of these practices are biased and discriminatory. The company’s ad systems have allowed marketers to choose who saw their ads using thousands of different characteristics, which also allowed those advertisers to exclude people who fell under a number of protected categories, such as race, gender and age.

The Justice Department filed its lawsuit against Meta, along with the settlement, on Tuesday. In its lawsuit, the agency said it had concluded that “Facebook could achieve its interests in maximizing its revenue and providing users with relevant ads through less discriminatory means.”

While the settlement pertains specifically to housing ads, Meta also plans to apply its new system to verify the targeting of ads related to employment and credit. The company has previously faced backlash for allowing bias against women in job ads and for excluding certain groups of people from seeing credit card ads.

The problem of biased ad targeting has been documented most extensively in housing ads. In 2016, Facebook’s potential for ad discrimination was revealed in a ProPublica investigation, which showed that the company’s technology made it easy for marketers to exclude certain ethnic groups from seeing its ads.

In 2018, Ben Carson, then the secretary of housing and urban development, announced a formal complaint against Facebook, accusing the company of having advertising systems that “unlawfully discriminate” based on categories such as race, religion and disability. In 2019, HUD sued Facebook for housing discrimination and violating the Fair Housing Act. The agency said Facebook’s systems didn’t serve ads to “a diverse audience,” even if an advertiser wanted the ad to be widely seen.

“Facebook discriminates against people based on who they are and where they live,” Mr. Carson said at the time. “Using a computer to restrict a person’s housing options can be as discriminatory as slamming a door in someone’s face.”

The Justice Department’s lawsuit and settlement are based in part on HUD’s 2019 investigation and discrimination allegations against Facebook.

In its own testing on the subject, the US Attorney’s Office for the Southern District of New York found that Meta’s advertising systems steered housing ads away from certain groups of people, even when advertisers didn’t intend it. The ads were “disproportionately targeted toward white users and away from Black users and vice versa,” according to the Justice Department’s complaint.

Housing ads that ran in neighborhoods where most residents were white were also targeted primarily to white users, while housing ads in areas populated mostly by Black residents were shown primarily to Black users, the complaint added. As a result, according to the complaint, Facebook’s algorithms would “actually and predictably reinforce or perpetuate racial segregation.”

In recent years, civil rights groups have also pushed back against the vast and complicated advertising systems that underpin some of the largest internet platforms. The groups have argued that those systems contain inherent biases, and that tech companies like Meta, Google and others should do more to root out those biases.

The area of study known as “algorithmic fairness” is an important topic for computer scientists working in artificial intelligence. Leading researchers, including former Google scientists like Timnit Gebru and Margaret Mitchell, have been sounding the alarm about such biases for years.

In the years since, Facebook has narrowed the types of categories marketers could choose from when buying housing ads, reducing the number to hundreds and eliminating options for targeting by race, age and zip code.

Chancela Al-Mansour, executive director of the Housing Rights Center in Los Angeles, said it is “essential” that “fair housing laws are aggressively enforced.”

“Housing ads had become tools for unlawful conduct, including segregation and discrimination in housing, employment and credit,” she said. “Most users had no idea that they were either being targeted or denied housing ads based on their race and other characteristics.”

Meta’s new ad technology, which is still in development, will occasionally check who is being shown ads for housing, employment and credit, and make sure those audiences match the people marketers want to target. If the ads being served start to skew heavily toward, say, white men in their 20s, the new system should in theory recognize this and redistribute the ads more equitably across a broader and more diverse audience.

“We will occasionally take a snapshot of marketers’ audiences, see who they’re targeting, and remove as much variance from that audience as possible,” Roy L. Austin, Meta’s vice president of civil rights and associate general counsel, said in an interview. He called it “a significant technological advance for using machine learning to deliver personalized ads.”
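Meta has described the mechanism only at this level of detail and has not published the system itself. As a rough illustration of what such a periodic check could look like, the hypothetical Python sketch below compares the demographic makeup of the audience an ad actually reached against the eligible audience, then derives weights to rebalance future delivery. Every name here, and the reweighting rule, is an assumption for illustration, not Meta’s implementation.

```python
# Hypothetical sketch of a "variance reduction" check, based only on the
# public description above. All function names and the reweighting rule
# are assumptions, not Meta's actual system.
from collections import Counter

def demographic_shares(audience):
    """Fraction of an audience belonging to each demographic group."""
    counts = Counter(audience)
    total = len(audience)
    return {group: n / total for group, n in counts.items()}

def delivery_gaps(eligible, delivered):
    """Per-group gap between the eligible audience and who actually saw the ad."""
    eligible_shares = demographic_shares(eligible)
    delivered_shares = demographic_shares(delivered)
    return {group: delivered_shares.get(group, 0.0) - share
            for group, share in eligible_shares.items()}

def adjustment_weights(gaps, step=0.5):
    """Boost under-served groups, damp over-served ones (an assumed rule)."""
    return {group: 1.0 - step * gap for group, gap in gaps.items()}

# Example: the eligible audience is evenly split, but delivery skewed 70/30.
eligible = ["group_a"] * 500 + ["group_b"] * 500
delivered = ["group_a"] * 70 + ["group_b"] * 30

gaps = delivery_gaps(eligible, delivered)
print(gaps)                      # group_a over-served by ~0.2, group_b under-served
print(adjustment_weights(gaps))  # group_b weighted up (~1.1), group_a down (~0.9)
```

In this toy version, the “snapshot” is simply the list of users who saw the ad, and variance is reduced by nudging delivery weights toward the eligible audience’s makeup on each pass; a production system would presumably do this continuously and at far larger scale.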

Meta said it will work with HUD in the coming months to integrate the technology into its ad-targeting systems, and it has agreed to have a third party test the new system’s effectiveness.

The company also said it would no longer use a feature called “Special Ad Audiences,” a tool it had developed to help advertisers expand the groups of people their ads would reach. The Justice Department said the tool was also implicated in discriminatory practices. Meta said the tool was an early attempt to combat bias and that its new methods would be more effective.

The $115,054 penalty that Meta agreed to pay in settlement is the maximum available under the Fair Housing Act, the Justice Department said.

“The public should know that the recent abuse by Facebook was worth the same amount of money that Meta makes in about 20 seconds,” said Jason Kint, the chief executive of Digital Content Next, an association for premium publishers.

As part of the settlement, Meta admitted no wrongdoing.
