Over the past two decades, the eCommerce Directive has set the rules of the road for digital platforms and services. However, concerns about issues such as hate speech and disinformation have recently spurred calls for the European Commission to rethink these policies. The new Commission’s policy agenda includes drafting the Digital Services Act (DSA), which aims to update the safety and liability rules for digital platforms. As EU policymakers consider new policies that set the basic rules for Internet firms, it is important to understand the repercussions such changes could have on the EU economy.
The Center for Data Innovation held an event at Brussels’ Press Club on November 20th, 2019 to discuss how the EU should modernize the eCommerce Directive so as to protect consumers without unnecessarily hindering innovation.
Prabhat Agarwal, Deputy Head of Unit of Online Platforms and eCommerce at DG Connect for the European Commission, described how, as a new initiative, the DSA will aim to address whether the eCommerce Directive has achieved its initial objectives. These objectives included creating and fostering an emerging single market, and providing a legal framework that fuels the fight against illegal content online while protecting freedom of expression. The Commission’s preparatory work will also assess whether those objectives are still valid today, and which new challenges should be considered.
Bruno Basalisco, Managing Economist at Copenhagen Economics, underlined that the DSA should play a critical role in supporting the completion of the digital single market, which is intended to strengthen the European economy, increase productivity, and better connect consumers and firms. Basalisco stressed the importance of streamlining digital market regulations on a global scale to achieve transparency and accessibility for consumers, and to enable smaller companies to grow without having to navigate through a thicket of rules.
Digital platforms prioritize user safety by removing violent content and hate speech, abuses that frequently occur on social media. Ania Helseth, Public Policy Manager EU Affairs for Facebook, stressed that content must be regulated, which Facebook does through a comprehensive set of strict community standards designed to protect users globally. Large platforms such as Facebook establish and maintain these core principles to ensure user protection as the digital landscape evolves. Helseth underlined the importance of recognizing that platforms do want to act responsibly. This is an ongoing and unfinished journey, as bad actors will continue to try to circumvent these policies and tools. Because platforms must draw an internal distinction between community standards and legal rules, Facebook works with law enforcement and other partners to prevent illegal content from appearing as proactively and efficiently as possible. In addition, platforms must take into account the broad diversity of European content policies, as content may be illegal in some jurisdictions but not in others. Facebook has also created proactive tools, such as a notification channel that allows users to flag infringements of its community standards.
Agarwal confirmed that the Commission’s impact assessment will examine disinformation, one of today’s emerging challenges to freedom of expression that could not have been anticipated when the eCommerce Directive was discussed. However, it is too early to conclude whether the solution lies in the DSA or in another instrument. Helseth recalled that Facebook and other platforms are working with the Commission on disinformation. The Commission is assessing the Code of Practice on Disinformation, and decisions on next steps will follow. She noted that Facebook is already implementing various measures that go beyond the eCommerce Directive regime, and that platforms should not become arbiters of truth.
Eline Chivot, Senior Policy Analyst at the Center for Data Innovation, recalled that during his hearing, Commissioner Breton declared that the limited liability clause of the eCommerce Directive would not be negatively impacted by the DSA, and that the prohibition of general monitoring obligations set in Article 15 would be maintained. As many Internet companies are hosting but not editing or altering content, maintaining various principles of the eCommerce Directive will be key.
Smaller businesses, including startups, also play an important role in the discussion of the DSA, as they too must monitor online content. Benedikt Blomeyer-Bartenstein, Manager EU Affairs at Allied For Startups, discussed the importance of policymakers considering the startup entrepreneur in this process, referring to Allied For Startups’ position, which includes a set of six principles. The European entrepreneur should benefit from any updates introduced by the DSA, but currently navigates a fragmented landscape. Implementing proactive measures to moderate content should not expose platforms to greater liability; there is therefore a need to clarify elements such as the distinction between active and passive hosting. According to Agarwal, the “good Samaritan” principle is a term that does not capture well the objective of ensuring that no legal barrier stops companies from doing the right thing.
Agarwal mentioned the need for a flexible framework that fosters experimentation. To this end, the Commission is considering the use of regulatory sandboxing. Blomeyer added that an environment in which startups can test new business models while remaining protected and compliant would help them innovate. To reduce fragmentation, Blomeyer argued that sectoral approaches, rather than horizontal laws, would be more desirable.
Chivot raised the issue of using automated filtering technologies as a tool to promote and ensure transparency and accountability. Platforms are currently using filters voluntarily, but although these tools are evolving, they remain imperfect.
Agarwal agreed with this paradox: if platforms are only held liable once they have knowledge and exert control, but are at the same time subject to an obligation to check and monitor content, then the existing protections would not work, as platforms would automatically acquire knowledge. He mentioned the varying liability exemptions companies rely on, and noted that for any type of platform, big or small, moderating all third-party online content is not viable for digital business models: the sheer scale of this content today poses many regulatory challenges; hence the need for a legal regime that clarifies the boundaries of platforms’ responsibilities while still preserving fundamental rights. It is essential to identify, through a broader societal discussion, the right division between public and private responsibility for deciding on that flow of information, the kinds of checks and balances required, and who should be accountable for these decisions.
Agarwal emphasized that there can be consensus on the need to use filters in some cases, such as after the Halle or Christchurch shootings. But in such cases decisions have to be made quickly and across various platforms, which presents a significant challenge. Blomeyer recalled the need to maintain the prohibition of general monitoring obligations: when it comes to filters, technical feasibility and the associated costs are an issue for companies such as startups, which may not have the time and resources to bear them.
In Agarwal’s view, the economic rationale of the eCommerce Directive cannot be distinguished from its societal function. Indeed, the rules involve regulating goods and services, but also speech and information, which are key to the functioning of society. Therefore, this will not be a purely economic intervention. Basalisco echoed this by mentioning the wider socio-economic significance of balancing costs and risks. A typical case is that of “type one errors,” i.e., overenforcement and undue takedown and filtering decisions to prevent content from being distributed, thereby—in economic terms—preventing supply from meeting demand. Another type of error is insufficient intervention.
According to Blomeyer, the biggest risk for the policy discussions about the DSA would be for them to cover too many topics. Convoluted and messy policy debates, as seen with hate speech or disinformation, could lead to “Christmas tree, wish-list” legislation that stands in the way of a healthy focus on opportunities. One difficulty in this process is that there are many definitions of what constitutes controversial, harmful, and illegal content, including speech. In addition, while policymakers may be targeting the larger, foreign tech giants, they risk disproportionately affecting smaller players. To avoid such unintended consequences, Allied For Startups encourages any discussion about the DSA to be grounded and evidence-based, supported by a strong consultation process.
Basalisco added that the main benefit of the process leading to the DSA will be the completion of the single market. It will be helpful to distinguish between existing and new principles and approaches, so as to avoid further fragmentation and to enable scale. For Helseth, the DSA represents a significant opportunity in that it will create rules that promote responsible behavior by platforms. One challenge is that stakeholders may insert into this discussion various other issues that should be addressed through other types of regulation. Agarwal noted the risk of non-intervention and of maintaining the status quo while there are opportunities to act. Companies will have to adjust their processes, e.g., for flagging content, responding to notices, and cooperating with regulators, and they will be held accountable in a more structured, formalized way. But the benefits will include a much more harmonized, updated, and clearer set of rules, providing greater clarity, protection, and empowerment to businesses and consumers.