Meta’s new ad policy further protects teen privacy and tackles discrimination


Meta is updating the way it delivers ads to users to promote a more positive experience. The changes can be divided into two parts: one restricts how companies target teen users, while the other aims to make things more “just” and less discriminatory.

It seems that Meta is forcing companies to generalize more with their advertising rather than targeting a specific group. From February, advertisers will no longer be able to target teens based on their gender on Facebook or Instagram. Advertisers can only use a user’s age and location as metrics. This tightening of the rules follows a similar update from 2021 that restricted advertisers from targeting underage users based on their interests and their activity in other apps.

And in March, teens will get more tools in Ad Topic Controls to “manage the types of ads they see on Facebook and Instagram.” It doesn’t look like they’ll be able to stop seeing ads altogether. Like it or not, teenagers will continue to encounter them. But at the very least, they can go to Ad Preferences in both apps and choose Show less to cut down on how many ads they see.

(Image credit: Future)

Fighting discrimination

The second update will focus on Meta’s new Variance Reduction System (VRS) to create a more “fair distribution of ads” across its platforms; namely those related to housing, employment and credit in the US.

VRS comes after the company settled a lawsuit with the United States Department of Justice (DOJ) over allegations that it “engaged in discriminatory advertising that violated the Fair Housing Act (FHA).” Apparently, Meta allowed advertisers to exclude certain groups of people from seeing ads “based on their race, color, religion, and gender,” among other metrics.

The technology behind VRS is said to use a new form of machine learning to show ads that “better resonate with the eligible audience”. According to Meta, the system works by sending housing ads to a wide variety of people first. From there, it will measure the “aggregate age, gender, and approximate race/ethnicity distribution” of the people who came across the ad.

VRS will then compare its findings to the measurements of people who are “more generally eligible to see the ad”. If it detects discrepancies, the system will adjust itself to be fairer so people aren’t left out.
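Meta hasn’t published how VRS works under the hood, but the comparison step it describes can be sketched in a few lines. Everything below, from the function name to the group labels and percentages, is a hypothetical illustration of the general idea, not Meta’s actual code.

```python
# Hypothetical sketch of the variance-reduction idea: compare who actually
# saw an ad against who was eligible to see it, and compute per-group
# boost factors the delivery system could use to correct the imbalance.

def adjustment_factors(observed, eligible):
    """For each demographic group, return eligible share / observed share.
    A factor above 1 means the group is under-served and should be
    favored in future deliveries; below 1 means it is over-served."""
    return {group: eligible[group] / observed[group] for group in eligible}

# Illustrative shares of ad impressions vs. the eligible audience.
observed = {"group_a": 0.60, "group_b": 0.25, "group_c": 0.15}
eligible = {"group_a": 0.50, "group_b": 0.30, "group_c": 0.20}

factors = adjustment_factors(observed, eligible)
# group_b and group_c were under-represented among viewers,
# so their factors come out above 1.
```

In this toy version the “adjustment” is just a ratio of shares; the real system presumably folds something like it into its delivery machine learning rather than applying a raw multiplier.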

Privacy in mind

Privacy is of great importance to VRS. The measurements made by the system include “differential privacy noise”, which prevents it from learning or retaining demographic information about specific individuals. It also has no access to people’s actual age, gender, or race, as the data are all estimates.
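“Differential privacy noise” has a standard meaning: before a count is reported, a small amount of random noise is added so the aggregate stays accurate while no single person’s presence can be inferred. Here’s a minimal sketch of the classic Laplace mechanism for a count; the function name and the epsilon value are illustrative assumptions, not anything Meta has disclosed about VRS.

```python
import math
import random

def dp_noisy_count(true_count, epsilon=1.0):
    """Return true_count plus Laplace(0, 1/epsilon) noise, the textbook
    differential-privacy mechanism for counting queries. Smaller epsilon
    means more noise and stronger privacy."""
    u = random.random() - 0.5
    # Inverse-CDF sampling from a Laplace distribution with scale 1/epsilon.
    noise = -(1.0 / epsilon) * math.copysign(math.log(1 - 2 * abs(u)), u)
    return true_count + noise

# Each individual report is perturbed, but averaged over many
# measurements the noise cancels out and the aggregate stays usable.
reports = [dp_noisy_count(100, epsilon=1.0) for _ in range(1000)]
```

The point of the design is exactly what the article describes: the system can still compare aggregate distributions, but the noisy counts reveal essentially nothing about any one viewer.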

The DOJ seems pretty happy with these changes. U.S. Attorney Damian Williams said the DOJ appreciates Meta working with the government and “taking the first steps to address algorithmic bias.”

Currently, VRS only works with housing ads in the United States, but there are plans to expand to both employment and credit ads later this year. We’ve asked the company if all of these changes are US-exclusive or if they’re rolling out globally. This story will be updated if we hear anything.

While it’s a good thing Meta is improving its advertising policies, some of us would prefer not to see ads at all. If you’re one of those people, you should definitely check out TechRadar’s list of the best ad blockers for 2023.
