Claire Pershan is the Mozilla Foundation’s EU Advocacy Lead, based in Brussels, Belgium. Caroline Sinders is a critical designer and artist examining the intersections of artificial intelligence, abuse, and politics in digital conversational spaces.
Europe’s new rulebook to ensure safe, ethical and transparent online spaces, the Digital Services Act (DSA), finally entered into force on November 16th. Against the backdrop of Elon Musk’s acquisition of Twitter and Meta’s gutting of its transparency tool CrowdTangle, the DSA is the EU’s promise to reset a tech ecosystem that seems to have gone off the rails.
Unlike the EU’s privacy regulation, which left enforcement to member states, the DSA will see the Commission become a regulatory authority. It will be tasked with overseeing the compliance of “very large online platforms” and “very large online search engines”, defined as those with over 45 million monthly active users in the EU.
Now the European Commission is looking to staff its new Platforms Directorate and its European Centre for Algorithmic Transparency with legal officers, policy officers, economists, communications and community management professionals, and cybersecurity experts. Given that they will need to look ‘under the hood’ of the companies and assess compliance with obligations related to content moderation and algorithmic recommendation, they’re calling for researchers and PhDs from a variety of fields, with expertise in computer science, data science, artificial intelligence, social sciences, and engineering.
Will this be enough? The timing is poignant; the Commission is building up its enforcement team just as the companies it will oversee are letting go of hundreds of employees charged with trust and safety, privacy, and other compliance duties. Some are suggesting that the Commission hire individuals with relevant skills who were recently let go from those companies’ ranks. With postings in Seville, Spain and Ispra, Italy, the Commission just might be able to entice a few European techies to join the public sector. But the Commission should also have its sights on design researchers with expertise in user experience and user interface design (UX/UI, in Silicon Valley speak). These skills are needed to spot deceptive design practices, which can be used to circumvent regulatory obligations.
Designers and product managers are technologists as well as socio-technical experts. Many employees who hold such roles in big tech companies and startups don’t have PhDs, but what they lack in credentials, they make up for in real-world, on-the-ground experience, especially with AI, algorithms, and machine learning, and with how users interact with these systems. This is something one of us has seen up close working as a design researcher at IBM Watson on AI products and with Google’s PAIR (the People and Artificial Intelligence Research group). Design is a creative process as well as a translation process. Taking AI systems or algorithms and creating products around them requires understanding how people use software and how that software is actually embedded within processes that, ideally, feel seamless and intuitive.
Legislators like to say that ‘the devil is in the details’. As design experts know, the devil is in the user experience. Tech policy is not just about rules; it must involve design considerations to render regulation or transparency requirements useful and actionable. Nowhere is this clearer than with deceptive design patterns: manipulative online architectures that steer people toward potentially harmful decisions and can be leveraged as a tactic for regulatory circumvention. The numbingly irritating cookie pop-ups that followed the implementation of Europe’s General Data Protection Regulation (GDPR) frequently deploy deceptive design techniques to permit companies to continue business as usual under the veneer of compliance.
Much of the power of the DSA is contingent on its ability to influence platform design. It’s one thing to mandate that companies create an alternate recommendation system¹ or content appeals mechanism, for example, but it’s another thing to ensure they present these options to users in a fair, honest, or “user-friendly” way. (Indeed, the qualifier “user-friendly” appears in the DSA nine times.)
We’ve seen deceptive design patterns sabotage regulation before, with the bumpy rollout of GDPR consent banners and the buried reporting tools required by the German Network Enforcement Act (NetzDG).²
Design is integral in the process of making both code and policy understandable to all people; it can radically impact the translation of tech policy into software and systems. As it prepares to enforce what may well be the most ambitious regulatory regime for online platforms anywhere in the world, the European regulator needs design expertise in house. Without it, the entire approach may be for naught.
– – –
¹ In fact, Facebook already offers a chronological timeline option; it’s just not prominently shown.
² In 2018, Germany passed the Network Enforcement Act, or NetzDG, which allows users to report illegal content and specifically online hate speech. However, the interface for reporting hate speech, which was supposed to be “user-friendly”, was in some cases hidden and confusing for users. In addition to being initially hidden, the complaint form, once presented, warned users that false statements could be punishable by law, an exaggerated statement that may have had chilling effects. In 2019, Facebook was fined by Germany’s Federal Office of Justice for NetzDG violations, including some related to its reporting form.