It came as no surprise that a Federal Trade Commissioner recently stressed the need for big data companies to make sure they avoid harming consumers. In her keynote at a September 2015 event in Washington, D.C., sponsored by Google and focused on how data can drive social benefits, Federal Trade Commissioner Terrell McSweeny cautioned companies to evaluate the potential for their increased reliance on data to produce unintended consequences, such as algorithms that propagate discrimination or unfair biases. But while Commissioner McSweeny is right to call on data users to carefully consider the potential discriminatory or inequitable outcomes of data use, she neglected to mention that these new technologies also require new thinking from regulators. McSweeny recognized that data can offer both enormous social benefits and unintended harms; just as companies should work to avoid those harms, regulators should avoid regulatory approaches that could restrict beneficial applications of data.
Given that the potential social benefits of data are so important, McSweeny stressed the need to ensure that these benefits are enjoyed equitably. Minorities, the poor, the elderly, and new immigrants are routinely undercounted in government data, and as a result, decisions based on this faulty data can exclude these data-poor communities from the benefits data has to offer. McSweeny illustrated this problem with Street Bump, an app used by the city of Boston that relies on residents’ smartphone sensors to record when they drive over potholes or other road hazards. While this can be an efficient way for the city to identify and repair damaged roads, the data is skewed to over-represent wealthier neighborhoods, where people are more likely to own cars and smartphones; poorer neighborhoods have fewer smartphone owners, and their residents are more likely to take public transportation.
Because of this risk, McSweeny urged companies and governments not to view data technologies such as algorithms as inherently neutral, but as tools that could perpetuate existing biases and exacerbate discrimination. Ultimately, McSweeny called for the private sector to adopt an approach of “responsibility by design”: an approach “that values nondiscrimination and fairness throughout the lifecycle of products” and takes “meaningful steps to assess and mitigate harmful or discriminatory consequences if they are revealed.”
The private sector should act responsibly by assessing the potential harmful consequences of data use, but by the same token, regulators should carefully consider the potentially harmful consequences of well-intentioned rules designed to protect consumers. One example McSweeny gave of how algorithms could facilitate social harms was a Northeastern University study revealing that consumers sometimes see different prices online based on their device or other consumer data. However, this is an example of how big data can enable more granular differential pricing, a practice that a White House report found can actually benefit both consumers and businesses. Should regulators ban such practices as discriminatory, they would prevent companies from offering lower prices to populations traditionally unable to access certain markets. Similarly, a Federal Trade Commission report on the Internet of Things earlier this year expressed support for data minimization—the practice of limiting data collection and retention—out of consideration for consumer privacy. However, such restrictions on how companies can use data from the Internet of Things necessarily limit their ability to develop new services or derive insights that could create important social benefits.
It is nonetheless encouraging to see a Federal Trade Commissioner so enthusiastic about the social benefits big data has to offer. Hopefully, regulators will translate this enthusiasm into more forward-looking regulatory approaches that maximize the benefits of big data, just as the private sector develops strategies to avoid unintended discriminatory applications of data.