Is the advertising algorithm racist? Facebook is in trouble again
If you are a woman, you're more likely to receive job ads for supermarket cashier positions.
If you are Black, you're more likely to receive job ads for taxi driver positions.
Author: Samantha Cole
Translation and original text: MarTechCareer
Recently, Facebook got into trouble with the U.S. Department of Housing and Urban Development (HUD), which accused it of racial discrimination in its ad-targeting system. For years, advertisers have used Facebook to target (or avoid targeting) protected groups, such as minorities and people of a specific gender. But in a new paper, a group of researchers claims that Facebook's ad-delivery algorithms themselves are biased.
The paper was jointly published by researchers from Northeastern University, the University of Southern California, and the nonprofit Upturn. It claims that Facebook unfairly delivers home-purchase and job ads to specific groups of people. For example, job ads for certain occupations are shown more often to white men, while job ads for other occupations are shown more often to Black women. This happens even when the advertiser has made no such choice: even when advertisers choose to deliver to a "broad audience," the skew still occurs, and it occurs inside Facebook's ad-delivery algorithm.
"The skew in Facebook's ad delivery is actually driven by Facebook's own internal predictions about how relevant ads are to people," the paper says. "Especially for housing and job ads, Facebook's delivery algorithm still shows serious gender and racial skew even when advertisers choose neutral targeting parameters."
Late last month, HUD accused Facebook of violating the Fair Housing Act by enabling ad targeting based on "race, color, religion, gender, family status, national origin, and disability."
So how exactly does the study demonstrate this bias?
If advertisers don't intentionally choose to target a particular group of people, will Facebook still make targeted choices for them? To test this, the researchers conducted dozens of paid advertising campaigns on Facebook, including hundreds of ads that reached millions of people and cost $8,500.
They found that several different factors, including the ad budget, the image in the ad, and the title of the ad, affected the type of people the ad was pushed to by Facebook's algorithm.
In the experiment, a group of ads with no images or titles served as "baseline" ads, run alongside otherwise identical ads carrying various images or links. By controlling variables this way, the researchers could isolate which element of an ad affects the kind of audience the ad is ultimately delivered to.
For example, when an ad merely linked to a page like bodybuilding.com (whose users are mostly male) or elle.com (whose users are mostly female), delivery remained roughly gender-neutral (close to half and half). So the link itself did not skew the algorithm.
But when they added images to the ads, such as a man lifting weights for the bodybuilding.com ad, or a makeup brush for the elle.com ad, the picture changed: more than 75% of the people who received the bodybuilding ad were men, and women made up more than 90% of the audience that received the elle.com ad.
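The comparison the researchers ran, a baseline ad's audience versus an image ad's audience, can be sketched as a simple two-proportion test. The sketch below is illustrative only: the audience counts are hypothetical numbers chosen to echo the reported percentages, not data from the paper, and the function names are my own.

```python
from math import sqrt

def delivery_skew(male, female):
    """Fraction of an ad's delivered audience that is male."""
    return male / (male + female)

def two_proportion_z(m1, f1, m2, f2):
    """Two-proportion z-test: is the male share of ad 1's audience
    significantly different from ad 2's?"""
    n1, n2 = m1 + f1, m2 + f2
    p1, p2 = m1 / n1, m2 / n2
    pooled = (m1 + m2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical delivery counts (male, female):
baseline = (510, 490)       # baseline ad: roughly gender-neutral
bodybuilding = (780, 220)   # image ad: heavily male, like the ~75% figure

print(delivery_skew(*bodybuilding))             # 0.78
z = two_proportion_z(*baseline, *bodybuilding)
print(abs(z) > 1.96)  # True: the skew is very unlikely to be chance
```

With samples of this size, a z-statistic far beyond 1.96 means the difference between the two audiences cannot plausibly be random noise, which is the kind of evidence the paper uses to attribute the skew to the delivery algorithm rather than to chance.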
Beyond gender, the same skew appears along other demographic dimensions, and across different occupations in job ads.
The researchers designed five job ads for lumber-industry positions and set them to be delivered to a "broad audience." The ads were nonetheless delivered to an audience that was more than 90% male and more than 70% white. Janitor job ads, by contrast, were delivered to an audience that was 65% female and 75% Black. Similarly, 85% of the people who received "supermarket cashier" job ads were women, and Black users made up as much as 75% of the audience for "taxi driver" job ads.
The study also found that among the people who received home-sale ads, white users made up 75% of the audience, while the audience for rental ads was comparatively balanced.
The findings support some of HUD's allegations against Facebook. HUD's court filing reads:
"Facebook not only allows advertisers to target people based on race, gender, and other protected characteristics; its delivery algorithm itself is skewed. Even when advertisers choose a broad audience while setting up their ads, the people who actually receive them are not a diverse cross-section but are systematically skewed. Even when advertisers want to avoid such skew, Facebook's delivery algorithm still makes predictions from its internal models and pushes ads to the people it thinks are most likely to interact with them."
Facebook spokesman Joe Osborne said in a statement:
"We oppose any form of discrimination. We've announced important changes to our ad-targeting tools, and we know this is only a first step. We've been reviewing our ad-delivery system and have assembled research groups on this topic, including industry experts, academics, and civil-rights experts, to explore better approaches."
The researchers' intent is for the study to push policymakers and platforms to scrutinize how advertising platforms optimize their algorithms: not only must advertisers' targeting choices avoid bias, but the platforms' own algorithms must also avoid bias and discrimination in digital advertising.
The report above does not represent our view. Here is MarTechCareer's commentary on the incident:
In Facebook's ad-campaign settings, advertisers choose an optimization goal. If a campaign's goal is ad interaction and user engagement, Facebook will deliver the ads in whatever way best achieves that goal, even when the advertiser has expressed no preference about the target audience. The algorithm itself is not biased; it simply delivers ads to the people most likely to fulfill the campaign goal, and the goal is something advertisers choose themselves.
To say the algorithm is biased is really to make the technology take the blame for the strategy.