Airbnb Shows How Private Sector Can Use Data to Fight Discrimination
In December 2015, researchers at Harvard University discovered a disturbing trend on the popular home-sharing service Airbnb: hosts were routinely discriminating against would-be guests on the basis of race. In response, the company brought in experts, including former Attorney General Eric Holder, to conduct a comprehensive review of its platform to identify opportunities to prevent discrimination. Airbnb recently released a report summarizing the results of this review and detailing a number of its commitments to use data to combat bias in the booking process. In doing so, Airbnb has become the poster child for how companies can use data as a powerful force to combat human bias and discrimination.
Despite common claims that algorithms and big data will cause discrimination, the Harvard study found that human decisions, rather than data-driven algorithmic ones, were at fault for the bias on Airbnb. In fact, it was the availability of data that allowed this discrimination to be identified in the first place. The study found that Airbnb users with “distinctively African-American names” have approximately a 16 percent lower acceptance rate for bookings than users with “distinctively White names.” The gap persists regardless of the host’s race, whether the host and guest would share the property, and the listing’s price. The researchers conclude that specific features of Airbnb, such as allowing hosts to screen guests and showing hosts the names of potential guests, facilitate this discrimination.
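The core measurement behind findings like this is straightforward: group booking outcomes by guest demographic and compare acceptance rates. The sketch below illustrates the idea with a handful of hypothetical records (the group labels and data are invented for illustration; the actual study used thousands of field-experiment requests and controlled for host and listing characteristics).

```python
# Hypothetical booking records: (guest_group, request_accepted)
bookings = [
    ("A", True), ("A", True), ("A", False), ("A", True),
    ("B", True), ("B", False), ("B", False), ("B", True),
]

def acceptance_rate(records, group):
    """Share of booking requests from `group` that were accepted."""
    outcomes = [accepted for g, accepted in records if g == group]
    return sum(outcomes) / len(outcomes)

rate_a = acceptance_rate(bookings, "A")  # 3 of 4 accepted -> 0.75
rate_b = acceptance_rate(bookings, "B")  # 2 of 4 accepted -> 0.50
gap = rate_a - rate_b                    # 0.25, a 25-point gap
```

A real analysis would add statistical tests and regression controls, but even this simple rate comparison shows why platforms that log booking outcomes can detect bias that would otherwise stay invisible.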
Airbnb has acknowledged the bias present on its platform, noting that “minorities struggle more than others to book a listing,” and has created a plan to tackle the problem. Airbnb will update its terms of service to include new anti-discrimination language and adopt a stronger anti-discrimination policy. It will also make training materials about combating unconscious bias available to hosts and highlight hosts who have completed this training. Furthermore, if a host rejects a guest by claiming that certain dates are unavailable, Airbnb will automatically block those dates on the listing’s calendar so the host cannot accept a different guest for the same stay. Finally, it will reduce opportunities for subconscious or overt bias to influence booking decisions. For example, it will reduce the prominence of users’ profile pictures to prevent hosts from discriminating against guests based on their appearance. And it will promote the use of Instant Book, which automatically accepts guests who meet a host’s preset qualifications, diminishing the opportunity for unconscious bias to affect individual booking decisions.
Moreover, Airbnb is planning to invest heavily in using data to combat discrimination. The company announced that it will hire a “permanent, full-time team of engineers, data scientists, researchers, and designers whose sole purpose is to advance belonging and inclusion and to root out bias.” The team will “perform tests,” “examine algorithms,” and “make ongoing adjustments to the technical underpinnings of Airbnb’s platform.” The goal is to apply the same data-driven methods the company uses to maximize revenue to the task of minimizing bias.
Airbnb is not the first company to use data to identify and combat bias. As the Federal Trade Commission (FTC) noted in a recent report, “many companies… are actively using big data to advance the interests of minorities and fight discrimination.” The Center for Data Innovation has also argued that data-driven methods, such as disparate impact analysis, are a valuable tool to combat bias and should be expanded to better protect consumers. In 2013, the Consumer Financial Protection Bureau (CFPB) used an algorithm to analyze lending data from auto dealerships and determined that more than 235,000 minority borrowers had been charged disproportionately high interest rates on auto loans. The CFPB inquiry and its use of data led to the largest settlement in an auto loan discrimination case in U.S. history.
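Disparate impact analysis typically boils down to comparing selection rates across groups against a threshold. A common benchmark, used in U.S. employment law (the EEOC’s “four-fifths rule”), flags a practice when one group’s selection rate falls below 80 percent of another’s; the sketch below applies that rule to invented numbers purely for illustration, and is not a description of the specific methodology the CFPB used.

```python
def disparate_impact_ratio(selected_a, total_a, selected_b, total_b):
    """Ratio of group A's selection rate to group B's selection rate."""
    rate_a = selected_a / total_a
    rate_b = selected_b / total_b
    return rate_a / rate_b

# Hypothetical figures: 30 of 100 group-A applicants approved,
# versus 60 of 100 group-B applicants.
ratio = disparate_impact_ratio(30, 100, 60, 100)  # 0.30 / 0.60 = 0.5
flagged = ratio < 0.8  # below the four-fifths threshold -> True
```

The appeal of this kind of test is that it requires only outcome data, not evidence of intent, which is exactly what makes large-scale booking and lending records so useful for detecting bias.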
Individuals, companies, and government entities that rely on data should carefully consider how human bias can enter the equation and produce discriminatory outcomes. However, those who fear that a more data-driven world will necessarily be a more biased one should see Airbnb’s response as a reason for optimism. Rather than blindly leading society toward a future of increased discrimination, data, especially when coupled with smart policies and protections, is a powerful tool for identifying and eliminating the biases that permeate everyday life. As Airbnb implements its data-driven approach to combating discrimination, other companies and organizations should attempt to replicate Airbnb’s efforts and identify other ways they can use data to reduce the influence of bias in their own operations.
Image: Open Grid Scheduler.