Facebook agrees to halt discrimination based on 3 key factors

Just imagine, up until now, blacks and Asians, Muslims and Southern Baptists, homosexuals or “menstruating people” were being targeted for discrimination by the woke Facebook corporation.

That’s evident because the Department of Justice has just announced a “groundbreaking settlement agreement” with Facebook’s parent company, Meta Platforms, to “resolve allegations of discriminatory advertising.”

The dispute involved the DOJ’s first lawsuit challenging “algorithmic discrimination” under the Fair Housing Act.

Under the agreement, Facebook will change its ad delivery procedures.

The agreement so far is only proposed; it still must be approved by the U.S. District Court for the Southern District of New York.

The case alleged “Meta’s housing advertising system discriminates against Facebook users based on their race, color, religion, sex, disability, familial status and national origin.”

The complaint charged that Meta uses algorithms in determining “which Facebook users receive housing ads, and that those algorithms rely, in part, on characteristics protected under the FHA.”

Facebook is agreeing to stop using its housing-ad targeting tool, the “Special Ad Audience,” and to develop a new system to address racial and other disparities, one that the DOJ will approve and monitor.

“As technology rapidly evolves, companies like Meta have a responsibility to ensure their algorithmic tools are not used in a discriminatory manner,” said Assistant Attorney General Kristen Clarke of the Justice Department’s Civil Rights Division.

“This settlement is historic, marking the first time that Meta has agreed to terminate one of its algorithmic targeting tools and modify its delivery algorithms for housing ads in response to a civil rights lawsuit. The Justice Department is committed to holding Meta and other technology companies accountable when they abuse algorithms in ways that unlawfully harm marginalized communities.”

Damian Williams, U.S. attorney for the Southern District of New York, pointed out that new tech, including algorithms, can discriminate based on “protected characteristics” just like more traditional ad methods.

“It is not just housing providers who have a duty to abide by fair housing laws,” said Demetria McCain, an official at the Department of Housing and Urban Development. “Parties who discriminate in the housing market, including those engaging in algorithmic bias, must be held accountable. This type of behavior hurts us all. HUD appreciates its continued partnership with the Department of Justice as they seek to uphold our country’s civil rights laws.”

The problem was, according to the DOJ, that Meta “enabled and encouraged” advertisers to target their housing ads based on race, color, religion and more.

It had set up a tool known as the “Lookalike Audience” that used machine learning to find Facebook users who share similarities with a given audience.

Under the agreement, Meta must halt any use of the “Lookalike Audience” tool for housing ads, develop a new advertising system that does not rely on race, ethnicity and other protected characteristics, and take other steps.

Meta also must pay what amounts to pocket change for the corporation, $115,054, for its actions.



This article was originally published by the WND News Center.
