When it comes to protecting people’s privacy, there is such a thing as being too careful. The latest case in point comes in a complaint that Norway’s consumer watchdog has leveled against makers of the popular fitness trackers Fitbit, Jawbone, Garmin, and Mio. The complaint alleges that the privacy policies for these devices are too open-ended and that there should be tighter limitations on what firms can do with personal data. However, this is both unnecessary and unwise, because it would preclude innovative applications that could benefit people in ways not yet imagined.
Consider how Jawbone was able to provide a public service in the aftermath of a sizeable earthquake two summers ago in Northern California. Seismographs said the tremor, which occurred at 3:20 a.m. near the city of Napa, was a 6.0 on the Richter scale, but Jawbone was able to quantify its intensity in more human terms by showing that it interrupted people's sleep as far away as Sacramento and San Jose. This is useful information for seismologists and government officials. Yet it is just the sort of thing that will not be allowed in the EU (or Norway, which is subject to EU law as a member of the European Economic Area) when the new General Data Protection Regulation (GDPR) comes into full effect in 2018, because the GDPR requires permissible uses of data to be defined in advance, and who would have anticipated such a use?
To outlaw all uses of data other than those for which it was first collected assumes, falsely, that any reuse would have to be, ipso facto, a breach of privacy. Yet the most valuable aspect of the Internet of Things, of which fitness trackers are a part, is not the personal benefits that devices deliver to their individual owners, but the social benefits of re-purposing the vast amounts of accumulated data—data that can easily be anonymized, as Jawbone did in the above example. The data from one fitness tracker may help its owner to stay healthy, but the aggregated data from all fitness trackers reveals a great deal about how people eat, sleep, and exercise in general, which could inform public health policy and help make everyone healthier, even if they have never heard of the Internet of Things.
The complaint by the Norwegian Consumer Council (NCC) about fitness trackers is not only misguided in this instance; it is part of a campaign to prowl for even the most trivial violations of data protection rules. In March 2016, the NCC complained that Tinder's privacy agreement made it too easy for the firm to delete customer accounts. Given the prevalence of misogyny and abuse that occurs on dating apps, it seems perfectly reasonable that Tinder should be able to ban people who misuse the service, and odd that a consumer watchdog would take steps to limit this practice. In May 2014, the NCC complained that the terms of service for Apple's iCloud did not promise to give sufficient notice of changes. None of these complaints identified any actual harm done to consumers.
The problem with pursuing these trivial violations is that they encourage companies to focus more on protecting themselves from regulators than complying with the spirit of the regulations. Companies that are forced to spend more on lawyers to ensure they do not run afoul of minor legal technicalities have less to spend on innovations, like product enhancements and security features, that would actually benefit consumers. So rather than helping consumers, watchdog groups like the NCC just create more problems.
Moreover, groups like the NCC demonize rational responses to bad regulations. Since European law requires privacy policies to state up front everything a company might possibly do with customer data, firms make their policies as permissive as possible, lest they inadvertently put themselves in breach of their own terms. Policies thus tend to permit things that companies have no real intention of doing. The end result is turgid and complex privacy policies that are more about ensuring legal cover for companies than providing transparency to customers. Privacy policies are supposed to make clear how companies are actually using personal data, but the regulations undermine this goal.
Under Europe's data protection rules, privacy policies will continue to get even longer and more complex—thereby raising costs for companies and becoming even more opaque to consumers—as the volume and variety of data collected by companies grows. In the short term, regulators should show restraint in their enforcement. In the long term, Europe needs to change its privacy laws to focus on uses of data that actually cause harm, instead of relying on prescriptive rules that outlaw even anonymized reuse outright.
Norway should serve as a warning to other countries that unnecessarily rigid enforcement of the rules does a disservice to consumers. Rather than nitpicking details that merely reflect the perverse incentives overregulation creates, Europe's regulators should focus on punishing activities that actually harm consumers. Doing so will help highlight the distinction between real and hypothetical threats to privacy, and could supply practical examples to inform more nuanced regulations in the future.
Image credit: Jawbone