In Brazil, Platform Regulation Takes Center Stage
Beatriz Kira / Apr 24, 2023
Beatriz Kira is a Postdoctoral Research Fellow in Law & Regulation at the Department of Political Science - School of Public Policy of University College London (UCL), where she is part of the Online Speech Project.
Recently, I participated in a panel reflecting on Brazilian President Luiz Inácio Lula da Silva's first 100 days in office. This symbolic milestone provided an opportunity to assess the new government's performance, and to discuss the challenges it faces as it seeks to implement its agenda, which includes platform regulation.
In Brazil, platform liability and content moderation became pressing concerns after the storming of the Brazilian capital by far-right supporters of former President Jair Bolsonaro on January 8, 2023. In the wake of that violence, the Brazilian government has actively pursued measures to enhance the responsibility of intermediaries in moderating online content. The new administration had already signaled that digital governance was likely to be a priority: the appointments of a new Special Advisor for Digital Rights (Assessoria Especial de Direitos Digitais) within the Ministry of Justice and of a Secretary for Digital Policy (Secretaria de Serviços e Direitos Digitais) within the Ministry of Communications were early evidence of that.
But in the aftermath of the January 8 attacks on Brazil’s democratic institutions – which were immediately compared to the events of January 6, 2021 in the US – Lula’s government became convinced that the violence was fueled by the circulation of online content, allegedly produced and disseminated by extremist groups in the days leading up to the attack. There were also allegations that platforms did little to prevent the dissemination of this content, adopting only minimal measures to deal with the systemic risks generated by harmful posts.
But it wasn’t just the attacks on democratic institutions that highlighted the need for platform regulation in the eyes of the government. Brazil has recently experienced tragic, violent attacks on schools that left children and teachers among the victims – something new and shocking in the Brazilian context. These attacks also raised concerns about how online radicalization can fuel real-world violence, and about the role of internet intermediaries in moderating harmful content. Such content allegedly continued to circulate for days after the attacks, with shocking images of the perpetrators and victims being shared on social media.
As a result, the early days of the Lula government saw a growing consensus among officials that online intermediaries play a significant role in shaping public discourse, and that immediate action is necessary to enhance platforms’ responsibility for online content. In this context, members of the Lula administration have been quick to demand that platforms take a more active role in moderating harmful content, and to call for urgent policy and regulatory measures that would legally require them to do so.
Platform regulation and intermediary liability in the early days of Lula’s government
Efforts to enhance the responsibility of intermediaries in moderating harmful online content are advancing on multiple fronts. The Federal Supreme Court held a public hearing on two cases concerning the constitutionality of Article 19 of the Marco Civil da Internet (the Brazilian Internet Civil Rights Framework), which establishes that platforms can only be held liable for third-party content after receiving a court order instructing them to remove it (with the exception of copyrighted material and non-consensual intimate images, which are subject to a notice-and-takedown mechanism). The public hearing included representatives from social media platforms, civil society organizations, sectoral regulators, and government officials, and the contributions from government representatives shed light on the Lula administration’s stance on platform regulation. The Special Advisor for Digital Rights argued that the liability safe harbor was created under an assumption of platform neutrality that no longer holds, because platforms now act as mediators that shape online interactions. Meanwhile, the Secretary for Digital Policy claimed that the current model “authorizes omission by digital platforms” and that platforms should not be entirely exempt from the duty to prevent the spread of illegal content.
Within the executive branch, the Ministry of Justice and Public Safety issued an administrative ordinance on April 12, 2023 to combat illegal, harmful, or damaging content on social media platforms, grounded in existing consumer protection laws. Social media platforms are now required to take specific measures, such as the immediate removal of content upon request from competent authorities, systematic risk assessments, and active content moderation.
Senacon, the National Consumer Secretariat, is responsible for enforcing these measures and holding social media platforms accountable for any breaches of their duty of care and security obligations. Platforms must also inform Senacon about the parameters guiding their algorithmic recommendation systems. Senacon has already sent notifications to social media platforms asking them to take action against content related to violence in educational institutions, giving companies 72 hours to report on the measures they will take to monitor, limit, and restrict content that incites violence against schools and students.
Finally, the government has proposed amendments to the so-called “Fake News Bill,” a bill currently under examination by the National Congress that aims to increase transparency and accountability for internet platforms. The proposed amendments reportedly include provisions that require platforms to take preventive action against “potentially illegal content” generated by third parties, such as content promoting “violent overthrow of the democratic rule of law”.
However, negotiations are being conducted behind closed doors with little transparency, and the lack of meaningful engagement and participation from multiple sectors of society has raised concerns about the decision-making process shaping the government's approach to the future of the internet in Brazil. While previous Workers’ Party governments were praised for the participatory nature of their digital agenda, the current government's commitment to participation appears to have dwindled. Judging by public declarations, government officials have looked to foreign experiences in constructing a new regulatory model, particularly the Digital Services Act (DSA) in the European Union and the Online Safety Bill (OSB) in the UK. One of the core proposals is to create a duty of care for digital platforms that mirrors the obligations set forth in the OSB. However, the specific contours of this duty of care in Brazilian law remain unclear.
The future of platform regulation in Brazil
There is little doubt that well-defined parameters and requirements are necessary to improve the supervision and governance structures of digital platforms. But there is no simple formula for achieving this, and any platform regulation in Brazil must be tailored to the country’s unique reality, rather than simply copied from other jurisdictions.
While new procedural rules should be established democratically and through a broad participatory process, regulations based on predetermined and rigid definitions of what should be removed from platforms are unlikely to be effective; they can result in a frustrating game of ‘whack-a-mole’.
Instead, platform regulation should focus on developing internal mechanisms for appropriate governance to handle various types of problematic content. Regulations should incentivize platforms to adopt a range of responses beyond simply removing or leaving content online, including measures to reduce the reach of harmful messages. Vague and overly broad terms describing the types of content that should or should not be online can lead to over-removal of content, with serious implications for freedom of speech. Additionally, it is important to remember that once the bill becomes law, it will outlast the current government. The same tools meant to protect the rule of law could be misused by less democratic governments in the future, especially with content related to politics and the electoral process.
It is also important to remember that the legislature, the judiciary, and the federal executive are not the only players in Brasília attempting to tame digital platforms. The digitalization of the economy has implications across multiple, interconnected policy domains, and the business models of digital platforms raise concerns in areas including data protection, competition policy, and law enforcement. Digital regulation therefore permeates the activities of several government institutions in Brazil, underscoring the need for effective inter-agency coordination, including between the Data Protection Authority, the Competition Authority, sectoral regulators, and other government bodies.
As with any complex policy domain, such coordination will not emerge on its own: the need for greater coordination and communication among the bodies and agencies that make up the public administration has become particularly evident in the digital agenda.
In my research, I have examined the challenges of inter-agency coordination and possible ways forward. I recently published a policy brief prompted by the Provisional Measure that turned the Brazilian Data Protection Authority into an autonomous agency last year. In it, I discuss the challenges the Data Protection Authority will face in interacting with the ecosystem of regulators and public bodies implicated in digital regulation in Brazil. While recent efforts around platform regulation have been a baptism by fire for these bodies, there is still a long way to go in building resilient and effective channels for collaboration and cooperation among agencies that do not depend on the personal ties of individual regulators and officials.
More broadly, the discussion around the regulation of digital platforms is also an opportunity to address the concentration of economic power in some of these companies by reshaping the structures of digital markets. There is mounting evidence that control over infrastructure and assets by gatekeeper platforms can create an uneven playing field that harms end-users and smaller platforms. Future platform regulation should therefore not only require platforms to assess their resources and structures for the risk of harm to individuals, but also to take appropriate measures to mitigate the systemic and market risks arising from their size and power.
In short, the future of digital platform regulation in Brazil lies in the democratic construction of an innovative, systemic, and structural regulatory framework that takes into account the cross-cutting policy areas and institutions implicated in the Brazilian context. If the Lula administration focuses too narrowly on changes to the liability model of Article 19 of the Marco Civil da Internet and on which types of content should or should not be allowed online, it will miss a valuable opportunity to debate governance models capable of increasing the accountability and transparency of digital platforms in Brazil. This dialogue should not be limited to harmful content, but should also consider the promotion of more competitive and innovation-friendly digital ecosystems.