Wikipedia:Wikipedia Signpost/2022-05-29/News from the WMF
The EU Digital Services Act: What’s the Deal with the Deal?
Policymakers in the European Union (EU) have finally concluded their negotiations over the Digital Services Act (DSA), a regulation that aims to address the spread of illegal content online, and have largely agreed on the rules that will govern online content moderation. Some technicalities still have to be ironed out, but the cornerstones of the regulation are known.
The Wikimedia Foundation has been tracking the development of the DSA since the consultation phase, before the European Commission even introduced its draft proposal. We have always supported the core aim of the DSA: to make content moderation more accountable and transparent. At the same time, we have cautioned that designing regulatory structures to fit only the operating models of big, for-profit websites could have devastating consequences for not-for-profit websites like Wikipedia. The legislation will fundamentally shape how online platforms operate in Europe, and it will also affect the internet far beyond the EU. It is also an opportunity to protect the community-governed, public interest internet, as we asked policymakers to do through four essential measures:
- Rules that address the algorithmic systems and business models that drive the harms caused by illegal content.
- Requirements for transparent and equitable terms of service, without overly prescriptive rules on how they are created and enforced.
- Rules on the processes for identifying and removing “illegal content” that allow user communities to participate.
- Rules that do not force platforms to replace the work of people with algorithms when it comes to moderating content.
While the DSA, to a certain degree, distinguishes between centralized platforms and those that are community-governed, some concerns remain. Here is how the final outcome stacks up against our requests.
1. Does the DSA address the business models and algorithmic systems that drive the harms caused by illegal content?
The DSA framework was largely designed to address the rapid and rampant spread of illegal content online by defining some of the processes through which such content is removed. It is our position that regulations need to target the causes, not the symptoms, of the spread of illegal content: i.e., the targeted advertising business model and the algorithms that drive profit from ad placement. Focusing on these aspects both gets at the root of the problem and avoids regulating not-for-profit websites like Wikipedia as if they operated according to the same model.
The outcomes of the trilogue negotiations fall short of what we desired, but they are still worth celebrating. The text bans targeted advertising that is based on sensitive information such as political, sexual, or religious preferences. The EU is also banning “dark patterns”: i.e., deceptive design tactics that trick users into accepting, rather than refusing, tracking options. The DSA mandates that the options to reject and to accept tracking must be equally easy to select.
2. Does the DSA leave room for community-developed policies for content?
The information on Wikipedia and other Wikimedia projects is produced, maintained, and debated according to volunteer-developed rules, which are fully transparent. This volunteer-led model of content moderation has its imperfections, but it has also helped make Wikipedia a global source of neutral and verifiable information. To protect this community-governed internet, laws should not require service providers to enforce the rules set by their user communities, nor oblige them to engage in top-down content moderation.
We are pleased to see that the DSA will focus only on the rules set by the service providers and their moderation obligations, leaving room for Wikimedia communities to develop, refine, and enforce their own policies for content and conduct. The DSA will not prevent volunteer editors from taking care of our public interest information ecosystem.
3. To what extent does the DSA recognize user communities as part of the process for identifying and removing “illegal content”?
It is not enough for regulations like the DSA merely to permit community-based content moderation: the law should explicitly encourage people, as members of our information society, to play a more significant role in creating digital public spaces. While we applaud EU policymakers for recognizing that the rules of the DSA must not be written with only one type of platform in mind, we would have hoped for rules on the removal of illegal content that explicitly support community-governed content moderation systems. Even so, the regulation’s new notice-and-action regime is a vast improvement over the original proposal, which could have led to Wikimedia being constantly “trolled” by bad-faith actors. First, the service provider can determine whether or not to take action after a notice has been submitted. This is particularly important because many of the notices the Wikimedia Foundation receives concern content that is not actually illegal. Second, service providers retain the right to ignore notices from individuals or organizations that consistently provide false or misleading information.
We are, however, concerned about the “crisis mechanism”, because it allows the European Commission to mandate that Very Large Online Platforms (VLOPs) tackle certain content that contributes to a threat to public health or safety. Through this mechanism, in the event that Wikimedia projects such as Wikipedia are designated as VLOPs, the DSA essentially gives the Commission the executive power to override content moderation decisions made by the Wikimedia communities. The safeguards, added after civil society organizations voiced grave concerns, limit the potential for abuse to a certain degree, for instance through a sunset clause and a high standard of transparency about the Commission’s demands to platforms.
4. Does the DSA enable human oversight of content moderation?
Wikimedia’s open, volunteer editing model is grounded in the belief that people, not machines or automated systems, should decide what information is included in the projects and how it is presented. Although the latest version of the DSA does not explicitly rule out automated tools, we find it encouraging that their use is neither explicitly mandated nor made necessary in practice through very short removal deadlines. The explicit prohibition of general monitoring obligations further alleviates a persistent concern of ours: i.e., that short removal timeframes and the threat of being held liable for user-uploaded information would compel service providers to deploy algorithmic tools in order to swiftly identify and remove any and all allegedly illegal content.
What comes next for the DSA?
We are looking forward to seeing the complete text, in which the outstanding details will have been clarified. The Parliament will vote on the consolidated version in July 2022, and once the regulation is published in the Official Journal of the European Union, the official EU gazette of record, it will come into force 20 days later. Online platforms and websites will then have 15 months from that date to prepare before the rules start to apply.
Once it becomes law, the DSA will shape communication and life online for everyone in Europe, and most likely for everyone around the world as well. The Wikimedia communities have always emphasized transparency about how their projects are built and developed. Now the DSA will make content moderation processes on many other platforms more transparent and predictable as well, which will also benefit Wikimedia editors and readers. Free knowledge as a whole and the Wikimedia projects in particular are an important part of people’s online experience. For that reason, we will continue to advocate public policy that protects and promotes them, and that empowers everyone to participate in the kind of decentralized decision making and content moderation that makes Wikipedia so successful and popular in Europe and the rest of the world.