Critics of ‘Sexist Algorithms’ Mistake Symptoms for Illness
This article originally appeared in Real Clear Technology.
A recent study by a team of researchers at Carnegie Mellon University (CMU) created a wave of sensational headlines decrying “sexist algorithms” after it found that an ad for high-paying executive coaching services was served to men nearly six times as often as to women. Among others, The Washington Post reported, “Google’s algorithm shows prestigious job ads to men, but not to women,” while a local CBS affiliate in the San Francisco Bay Area asked, “Sexist Google Algorithm? Women Shown Fewer Ads for High-Paying Jobs, Study Finds.” But lost in all the media vilification was any sense of perspective on the real problem.
The CMU researchers investigated how inferences made about users’ demographic characteristics and behaviors affect the ads they see. They created a number of simulated user profiles typical of jobseekers, with the only difference being gender. Once the simulated profiles were established, they visited a third-party news website, The Times of India, to evaluate the ads served. Using a custom-made tool for extracting data about the online ads, the researchers found that men received the ad for high-paying executive coaching services 1,852 times, while women received the same ad only 318 times.
But targeted advertising is a feature, not a bug. Platforms like Google give advertisers the flexibility to serve ads based on the inferred interests and demographics of users in order to optimize ad performance. This gives advertisers greater control over who receives which message, and for good reason: to improve conversion rates. Feminine hygiene products are likely not of great interest to men, in the same way that Viagra is less successfully marketed to women. Targeted marketing allows advertisers to maximize the impact of their ad dollars while reducing the number of annoying and irrelevant ads consumers see.
There are many possible explanations for the skewed results, none of which depend on the underlying ad-serving algorithm being sexist. First, if competing ads specifically target female traffic, they may reduce the number of female users exposed to otherwise gender-neutral ads. Second, women are more likely to see a wider variety of ads if more advertisers want to reach a female audience. Indeed, this potential explanation is supported by the frequency of the top five ads served on The Times of India: women saw the same five ads 269 times, compared to 2,180 times for men, suggesting that female-targeted ads are distributed among more advertisers. Finally, consumers collectively decide which ads they see. If an ad is of more interest to men, then men will click that ad more often. The algorithm learns this behavior and serves the ad more often to a male audience, because its purpose is to optimize ad performance.
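The feedback loop described above is easy to demonstrate in miniature. The sketch below (a simple epsilon-greedy click-rate optimizer, not Google's actual system) serves whichever ad has the higher observed click-through rate for each audience; the audiences, ad names, and click probabilities are hypothetical. Gender never enters the ranking rule, yet the serving counts end up skewed purely because one audience clicks one ad more:

```python
import random

# Minimal sketch of a click-optimizing ad server: track per-audience
# click-through rates and usually serve the best-performing ad,
# exploring a random ad 10% of the time.
class CtrAdServer:
    def __init__(self, ads, epsilon=0.1, seed=42):
        self.rng = random.Random(seed)
        self.epsilon = epsilon
        self.ads = ads
        self.stats = {}  # (audience, ad) -> [clicks, impressions]

    def _ctr(self, audience, ad):
        clicks, imps = self.stats.get((audience, ad), [0, 0])
        return clicks / imps if imps else 0.5  # optimistic prior

    def serve(self, audience):
        if self.rng.random() < self.epsilon:
            ad = self.rng.choice(self.ads)  # explore
        else:
            ad = max(self.ads, key=lambda a: self._ctr(audience, a))
        self.stats.setdefault((audience, ad), [0, 0])[1] += 1
        return ad

    def record_click(self, audience, ad):
        self.stats.setdefault((audience, ad), [0, 0])[0] += 1

# Hypothetical user behavior: the "exec" ad is clicked more often by
# the male audience. The algorithm never sees why; it only sees clicks.
true_ctr = {("M", "exec"): 0.10, ("F", "exec"): 0.02,
            ("M", "retail"): 0.02, ("F", "retail"): 0.10}

server = CtrAdServer(["exec", "retail"])
served = {("M", "exec"): 0, ("F", "exec"): 0}
for _ in range(5000):
    for audience in ("M", "F"):
        ad = server.serve(audience)
        if (audience, ad) in served:
            served[(audience, ad)] += 1
        if server.rng.random() < true_ctr[(audience, ad)]:
            server.record_click(audience, ad)

print(served)  # the "exec" ad skews heavily toward the male audience
```

Running the simulation shows the male audience receiving the executive ad far more often than the female audience, even though the code contains no gender-specific rule: the disparity emerges entirely from the click behavior it observes.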
Ultimately, the algorithm simply reflects real-world behaviors that highlight gender disparities in society. Blaming algorithms distracts from the real issue: discrimination continues to pervade the real world. It is not the algorithm that needs to be fixed; it is society. That does not mean advertisers should be absolved of responsibility. Advertisers should take steps to ensure they do not discriminate, especially when it comes to important issues like employment, housing, and financial services. When it comes to jobs, employers should work to recruit and retain a diverse workforce, such as by encouraging members of underrepresented groups to apply and by not perpetuating stereotypes in job advertisements.
Ironically, an effort to increase job opportunities for women may have contributed to the skewed results in this study. One of the top five ads for women was from Goodwill. In June 2013, Goodwill announced a partnership with Walmart’s Global Women’s Economic Empowerment Initiative to establish a 30-month-long job placement program, called Beyond Jobs, which specifically targets unemployed and underemployed women. It appears that Goodwill specifically targeted women in its recruitment strategy while the study was under way, showing that so-called “sexist” gender-based targeted advertising can actually be positive.
Finally, not only are algorithms not the culprit, they can actually help reduce human biases in the employment process. In hiring, evaluating applicants against predetermined criteria means they can be judged on skill rather than demographics. Furthermore, data-driven insights can help police undesirable behavior, such as automatically identifying job advertisements that use gender-specific terminology, like “waitress” instead of “wait staff,” or stereotypical images, such as a female nurse. Reformers can leverage data to raise awareness about gender inequality in society and to find effective means of addressing it. Collecting data about women’s representation and mobility in the workforce is crucial for employers and policymakers to isolate causes and take effective action to achieve gender parity in the workplace. In short, the media needs to refocus attention on the underlying causes of gender inequality in society and embrace algorithms as part of the solution.
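The kind of automated screening described above can be as simple as a lexicon check. The sketch below is illustrative only: the term list is a small hypothetical sample, not a vetted lexicon, and a production tool would need a much richer dictionary and context-aware matching:

```python
# Illustrative screen for gender-specific job-ad terminology.
# The term list is a hypothetical sample, not a vetted lexicon.
NEUTRAL_ALTERNATIVES = {
    "waitress": "wait staff",
    "waiter": "wait staff",
    "salesman": "salesperson",
    "chairman": "chairperson",
    "stewardess": "flight attendant",
}

def flag_gendered_terms(ad_text):
    """Return (term, suggested replacement) pairs found in the ad text."""
    words = ad_text.lower().replace(",", " ").replace(".", " ").split()
    return [(w, NEUTRAL_ALTERNATIVES[w]) for w in words
            if w in NEUTRAL_ALTERNATIVES]

flags = flag_gendered_terms("Busy diner seeks an experienced waitress.")
print(flags)  # [('waitress', 'wait staff')]
```

A tool like this could scan ad copy before a campaign launches and prompt the advertiser to substitute neutral language, which is the sort of data-driven policing of job advertisements the study's critics might more productively advocate.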
Image: flickr user methodshop.com.