In Depth Airbnb

Published on September 20th, 2016 | by Andrew Kitchel

Airbnb Shows How Private Sector Can Use Data to Fight Discrimination

In December 2015, researchers at Harvard University discovered a disturbing trend on the popular home-sharing service Airbnb: hosts were routinely discriminating against would-be guests on the basis of race. In response, the company brought in experts, including former Attorney General Eric Holder, to conduct a comprehensive review of its platform to identify opportunities to prevent discrimination. Airbnb recently released a report summarizing the results of this review and detailing a number of its commitments to use data to combat bias in the booking process. In doing so, Airbnb has become the poster child for how companies can use data as a powerful force to combat human bias and discrimination.

Despite common claims that algorithms and big data will cause discrimination, the Harvard report found that it was human decisions, not data-driven algorithmic ones, that were at fault for the bias on Airbnb. In fact, it was the availability of data that allowed this discrimination to be identified in the first place. The study found that Airbnb users with “distinctively African-American names” have approximately a 16 percent lower acceptance rate for bookings than users with “distinctively White names.” The gap persists regardless of the host’s race, whether the host and guest would share the property, and the listing price. The researchers conclude that specific features of Airbnb, such as allowing hosts to screen guests and providing them with the names of potential guests, facilitate discrimination.
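The core of the researchers’ finding is a simple comparison of acceptance rates between two groups of booking requests. A minimal sketch of that calculation, using made-up toy data rather than the study’s actual dataset, might look like this:

```python
# Illustrative sketch with hypothetical data, showing how an acceptance-rate
# gap like the one the Harvard researchers reported could be measured.
# The records and numbers below are assumptions, not Airbnb's actual data.

requests = [
    # (guest_name_group, request_accepted)
    ("white", True), ("white", True), ("white", False), ("white", True),
    ("african_american", True), ("african_american", False),
    ("african_american", False), ("african_american", True),
]

def acceptance_rate(group):
    """Share of booking requests from this group that hosts accepted."""
    outcomes = [accepted for g, accepted in requests if g == group]
    return sum(outcomes) / len(outcomes)

gap = acceptance_rate("white") - acceptance_rate("african_american")
print(f"Acceptance-rate gap: {gap:.0%}")  # 75% vs. 50% in this toy data
```

The actual study went further, holding host race, shared-property status, and listing price constant, but the measured quantity is the same kind of rate difference.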

Airbnb has acknowledged the bias present on its platform, noting that “minorities struggle more than others to book a listing,” and has created a plan to tackle discrimination. Airbnb will update its terms of service to include new anti-discrimination language and adopt a stronger anti-discrimination policy. It will also make training materials about combating unconscious bias available to hosts and highlight hosts who have completed this training. Furthermore, Airbnb will block hosts from accepting a different guest for dates they claimed were unavailable when rejecting an earlier request. Finally, it will reduce opportunities for subconscious or overt bias to influence booking decisions. For example, it will reduce the prominence of users’ profile pictures to prevent hosts from discriminating against others based on their appearance, and it will promote the use of instant bookings, which automatically accept guests who meet preset preferences and qualifications, diminishing the effect of unconscious bias.

Moreover, Airbnb is planning to invest heavily in using data to combat discrimination. The company announced that it will hire a “permanent, full-time team of engineers, data scientists, researchers, and designers whose sole purpose is to advance belonging and inclusion and to root out bias.” The team will “perform tests,” “examine algorithms,” and “make ongoing adjustments to the technical underpinnings of Airbnb’s platform.” The goal is to apply the same data-driven methods the company uses to maximize revenue to the task of minimizing bias.

Airbnb is not the first company to use data to identify and combat bias. As the Federal Trade Commission (FTC) noted in a recent report, “many companies… are actively using big data to advance the interests of minorities and fight discrimination.” The Center for Data Innovation has also argued that data-driven methods, such as disparate impact analysis, are a valuable tool to combat bias and should be expanded to better protect consumers. In 2013, the Consumer Financial Protection Bureau (CFPB) used an algorithm that analyzed borrowing data from auto dealerships to determine that interest rates were set disproportionately high for more than 235,000 minority auto-loan borrowers. The CFPB inquiry and its use of data led to the largest settlement from an auto loan investigation in U.S. history.
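Disparate impact analysis of the kind mentioned above can be illustrated with a simple calculation. One common benchmark, used for example in U.S. employment-discrimination guidelines, is the “four-fifths rule”: if a protected group’s rate of favorable outcomes falls below 80 percent of the reference group’s rate, the disparity warrants scrutiny. The rates below are hypothetical illustrations, not figures from the CFPB case:

```python
# Hypothetical sketch of a disparate impact check using the
# "four-fifths rule"; the rates below are made-up illustrations.

def disparate_impact_ratio(rate_protected, rate_reference):
    """Ratio of the protected group's favorable-outcome rate
    to the reference group's rate."""
    return rate_protected / rate_reference

# e.g. share of each group's applicants offered the lowest advertised rate
ratio = disparate_impact_ratio(rate_protected=0.30, rate_reference=0.50)
flagged = ratio < 0.8  # below four-fifths => potential disparate impact
print(f"ratio = {ratio:.2f}, flagged = {flagged}")
```

The appeal of this kind of test is that it requires only outcome data, not proof of intent, which is why regulators like the CFPB can apply it at the scale of hundreds of thousands of loans.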

Individuals, companies, and government entities that rely on data should carefully consider how human bias can enter the equation and produce discriminatory outcomes. However, those who fear that a more data-driven world will necessarily be a more biased one should see Airbnb’s response as a reason for optimism. Rather than blindly leading society toward a future of increased discrimination, data, especially when coupled with smart policies and protections, is a powerful tool for identifying and eliminating the biases that permeate everyday life. As Airbnb implements its data-driven approach to combating discrimination, other companies and organizations should attempt to replicate Airbnb’s efforts and identify other ways they can use data to reduce the influence of bias in their own operations.

Image: Open Grid Scheduler


About the Author

Andrew Kitchel is a graduate policy fellow at the Center for Data Innovation. He is also a full-time Master of Public Policy student at the McCourt School of Public Policy at Georgetown University. Andrew is passionate about energy, science, and technology policy and the use of big data to drive evidence-based initiatives in these policy areas. His background and studies include data analysis, survey methodology, and political campaign work. Prior to coming to the Center for Data Innovation, he worked as a survey statistician at the U.S. Census Bureau. Andrew holds a B.S. in biology from the University of Puget Sound and a B.S. in political science from the University of Oregon.



  • Ben Edelman

    Much could be said about the effectiveness and likely effectiveness of Airbnb’s planned changes. I’m prepared to argue that you’re quite generous in the credit you give them, largely because I don’t think the planned changes will be as effective as you suggest.

    But I want to particularly alert readers that you appear to be mistaken when you write “For example, it will reduce the prominence of users’ profile pictures to prevent hosts from discriminating against others based on their appearance.” Do you have a citation for this claim? The actual commitment in Airbnb’s report is to “experiment with reducing the prominence of guest photos in the booking process.” Note the commitment only to “experiment,” i.e. to present smaller photos to some subset of hosts, for some period of time, basically an A-B test. There is no commitment to conduct an experiment of any particular scope or duration, to analyze the data in any particular way, or to make any particular change (or indeed any change at all) at the end of the experiment. So “it will reduce” significantly overstates what Airbnb actually said it will do to the pictures. Maybe Airbnb will in fact reduce the picture prominence for every user, permanently. Maybe it will increase the prominence! The report doesn’t say.

    My response to the report is here http://www.benedelman.org/news/091916-1.html . I discuss this specific issue in the “Airbnb’s bottom line” section, point 2.
