An Overview of Canada’s Online Harms Act

Mandy Lau / Mar 12, 2024

Bill C-63 had its first reading in Canada's House of Commons on February 26, 2024.

The Government of Canada recently tabled Bill C-63, the Online Harms Act, which introduces a new legislative and regulatory framework to reduce harmful content on social media platforms. A new Digital Safety Commission would enforce the framework, and a Digital Safety Ombudsperson would support users and victims; both would be assisted by a new Digital Safety Office of Canada. The Bill also proposes amendments to existing statutes, including the Criminal Code, the Canadian Human Rights Act, and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service.

Types of Harms

The Bill targets seven kinds of harmful content, three of which are specific to children:

  • content that sexually victimizes a child or revictimizes a survivor,
  • content used to bully a child, and
  • content that induces a child to harm themselves.

The four other harms are:

  • intimate content communicated without consent,
  • content that foments hatred,
  • content that incites violent extremism or terrorism, and
  • content that incites violence.

Who is regulated under this Act?

Under the Act, a social media service is defined as “a website or application that is accessible in Canada, the primary purpose of which is to facilitate interprovincial or international online communication among users of the website or application by enabling them to access and share content” (Section 2), a definition that extends to adult content and live streaming services. Should the Bill pass, the Governor in Council may specify narrower criteria in regulations, such as types of social media services, numbers of users, or services designated as posing a “significant risk that harmful content is accessible” (Section 3(3)).

Obligations of social media services

Bill C-63 regulates social media operators according to four duties:

  • Duty to act responsibly: Regulated services must implement measures to mitigate the risk of harm to users; develop guidelines, tools, and processes to block users, identify and flag harmful content, and label automated communication; appoint a resource person to respond to users’ concerns and needs; and set up procedures to preserve removed content that incites violence and content that incites violent extremism or terrorism. Social media services must also submit a digital safety plan to the Commission, including information about compliance, resource allocation, and relevant internal datasets and research findings. Some of these datasets are to be provided to accredited persons for research, education, advocacy, and awareness purposes.
  • Duty to protect children: This duty focuses on age-appropriate design features, such as defaults for parental controls, warning labels and safe search for children. It may also include other features that limit access to explicit adult content, cyberbullying content, or content that incites self-harm.
  • Duty to make certain content inaccessible: This duty applies to two kinds of harm: content that sexually victimizes a child or revictimizes a survivor and intimate content communicated without consent. It specifies procedures for taking down this content within 24 hours of identification.
  • Duty to keep records: Services must keep records of how they are complying with the duties above.

Failure to comply may result in a maximum administrative monetary penalty of 6% of the violating entity’s gross global revenue or $10 million, whichever is greater. Operators convicted of an offense face a maximum penalty of 8% of gross global revenue or $25 million, whichever is greater.
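For illustration, here is a minimal sketch of how those penalty ceilings would be computed; the function names are hypothetical, and only the percentages and dollar floors come from the Bill:

```python
def max_administrative_penalty(gross_global_revenue: float) -> float:
    """Ceiling on an administrative monetary penalty under the Act:
    the greater of 6% of gross global revenue or $10 million."""
    return max(0.06 * gross_global_revenue, 10_000_000)


def max_conviction_penalty(gross_global_revenue: float) -> float:
    """Ceiling on a penalty for an operator convicted of an offense:
    the greater of 8% of gross global revenue or $25 million."""
    return max(0.08 * gross_global_revenue, 25_000_000)


# For a hypothetical operator with $500 million in gross global revenue:
print(max_administrative_penalty(500_000_000))  # 30000000.0 (6% exceeds the $10M floor)
print(max_conviction_penalty(500_000_000))      # 40000000.0 (8% exceeds the $25M floor)
```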

The establishment of the Digital Safety Commission and Ombudsperson

The new Digital Safety Commission would be empowered to audit social media services, issue compliance orders, and impose penalties. It could also order the removal of content that sexually victimizes a child or revictimizes a survivor, as well as intimate content communicated without consent. Further, it could set online safety standards, conduct research, and develop resources for the public.

The Ombudsperson would support users and victims through ongoing consultations, direct users to appropriate resources, and advocate on their behalf to the Commission, the Government, or social media platforms.

Key amendments to existing statutes

Criminal Code:

Proposed changes to the Criminal Code include a new definition of hatred and a new stand-alone hate crime offense, carrying a maximum penalty of life imprisonment, for crimes motivated by hatred on the basis of race, national or ethnic origin, language, color, religion, sex, age, mental or physical disability, sexual orientation, or gender identity or expression. Penalties are also increased for existing hate propaganda offenses (to a maximum of five years’ imprisonment) and for advocating or promoting genocide (to a maximum of life imprisonment). Further, a peace bond may be imposed on an individual if there are grounds to believe that they will commit a hate crime or hate propaganda offense.

Canadian Human Rights Act:

Amendments to the Canadian Human Rights Act would make communicating hate speech online a new discriminatory practice. Hate speech is defined as “the content of a communication that expresses detestation or vilification of an individual or group of individuals on the basis of prohibited grounds of discrimination.” The amendments further clarify that content does not amount to detestation or vilification solely because it “expresses disdain or dislike or it discredits, humiliates, hurts or offends.” The prohibited grounds are race, national or ethnic origin, color, religion, age, sex, sexual orientation, gender identity or expression, marital status, family status, genetic characteristics, disability, or conviction for an offense for which a pardon has been granted or a record suspension has been ordered. These amendments target public communications by social media users and will not apply to private communication, social media service providers, broadcasters, intermediaries, or other telecommunications infrastructure.

An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service:

The proposed changes to this Act would expand the types of internet services regulated to include online platforms and apps, extend the evidentiary preservation period from 21 days to 12 months, require transmission data to be gathered, and extend the limitation period from two to five years.

A short history of Bill C-63

  • The first legislative attempt to address online harms was Bill C-36. It was introduced in 2021 along with a discussion guide and technical paper, just as the Canadian Liberal federal government prepared to call an election seeking a third consecutive term and a majority government.
  • Public consultations on this regulatory approach ran from July to September 2021; partway through, Parliament was dissolved and an election was officially called, ending Bill C-36.
  • A new Bill was promised as part of the Liberal Party’s election campaign, one that would be prioritized within the first 100 days of the new government.
  • However, once elected (again as a minority government), it took until February 2022, about 150 days into the new mandate, merely to release a summary of the original consultations.
  • A series of further consultations soon followed: an expert advisory panel convened between March and June 2022 to provide recommendations for a new legislative and regulatory framework.
  • The June 2022 Citizens’ Assembly on Democratic Expression reviewed these recommendations and additional inputs to provide further guidance.
  • Between July and November 2022, nineteen public roundtables were held to listen to the perspectives of victims of online harms, marginalized communities, and tech platforms.
  • Finally, an Indigenous sharing circle and one-on-one interviews were conducted between November 2022 and January 2023. These consultations form the basis of the new Online Harms Act (Bill C-63), tabled in parliament on February 26, 2024.

Preliminary analysis: Some key observations

Overall, Bill C-63 has been largely welcomed as a significant improvement over the first iteration by leaders of various Internet advocacy groups (e.g., OpenMedia), anti-hate groups (e.g., the Canadian Race Relations Foundation), child-safety advocates (e.g., the Canadian Centre for Child Protection, Carol Todd), and other engaged researchers and advocates (e.g., Facebook whistleblower Frances Haugen and a group of academics organized by the Centre for Media, Technology and Democracy). Some of the most contentious aspects of the original regulatory approach were dropped, such as the requirement to remove all types of harmful content within 24 hours and report it to law enforcement agencies, as well as website blocking and proactive monitoring. The new approach also aims to respond to harms posed by emerging technologies, such as generative AI.

Political reception will be crucial. Because it leads a minority government, the centrist Liberal Party headed by Prime Minister Justin Trudeau will need to secure the support of another political party to pass this legislation. The left-leaning New Democratic Party (with which the Liberal Party has a “supply and confidence” agreement) conveyed early support, clarifying that it will seek to “enhance algorithmic transparency.” By contrast, Canada’s official opposition, the right-leaning Conservative Party, has vehemently opposed the Bill throughout the process. Even before seeing the new Bill, Conservative leader Pierre Poilievre preemptively characterized it as “an attack on freedom of expression.” He also leveraged the language of the right-wing anti-Trudeau movement by referring to “Justin Trudeau’s woke authoritarian agenda.”

Focus on children

Compared with the prior legislative attempt, the proposed Online Harms Act has a much stronger focus on protecting children. Out of the eight explicit purposes of the Bill, two are directed towards protecting children: “to protect children’s physical and mental health” and “make content that sexually victimizes a child...inaccessible online.” A focus on reducing harm to children is also evident in the list of targeted harms (three of the seven harms implicate children), the duties of social media platforms (two of four duties are related to children), and the major statutory amendments (expanding mandatory reporting of Internet child pornography). A crude word count reveals 85 instances of the word “child” or “children” throughout the Bill.
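Such a count could be reproduced with a few lines of Python, assuming a local plain-text copy of the Bill (the file name here is hypothetical, and exact totals will vary with the version of the text used):

```python
import re

# Assumes a plain-text copy of Bill C-63 saved locally; the path is hypothetical.
with open("bill_c63.txt", encoding="utf-8") as f:
    text = f.read()

# Whole-word, case-insensitive matches of "child" or "children".
matches = re.findall(r"\bchild(?:ren)?\b", text, flags=re.IGNORECASE)
print(len(matches))
```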

The government’s public communications are also centered on child protection. The announcement of the Bill to the press prominently featured a mother whose young daughter was a victim of child sexual abuse amplified online. The first press release likewise foregrounds the harm social media poses to children, with strong appeals to parents. This strategy is likely informed by the bipartisan support for the Kids Online Safety Act and the high interest in the recent Senate hearing in the neighboring United States. It may also be vital to building constructive relationships with social media companies. Meta, for example, has already signaled a willingness to work with Canadian lawmakers on the Online Harms Act, despite failed negotiations over Bill C-18 (the Online News Act), a law requiring Google and Meta to compensate Canadian news outlets for news content shared on their platforms; its passage led Meta to end access to Canadian news on its platforms.

Emphasis on freedom of expression

Notably, freedom of expression features more explicitly in this Bill. It is referenced in the third and fourth stated purposes: to “mitigate the risk that persons in Canada will be exposed to harmful content online while respecting their freedom of expression” and to “enable persons in Canada to participate fully in public discourse and exercise their freedom of expression online without being hindered by harmful content” (emphasis added). Public communications also stated early on that:

“Everyone in Canada should be able to access an online environment where they can express themselves freely, without fearing for their safety or their life. The Government of Canada will always respect Canadians’ constitutional right to freedom of expression, which is essential in a healthy democracy. However, there is also an urgent need for better safeguards for social media users, particularly children.”

The emphasis on freedom of expression may reflect a Liberal Party strategy to counter claims by the Conservative Party. For example, Conservative leader Poilievre portrayed the Bill as the government “banning opinions that contradict the Prime Minister’s radical ideology.” This move echoes how Conservatives responded to Bill C-18 (the Online News Act), painting it as a censorship law by Prime Minister Trudeau to “control the news Canadians see.” Centering freedom of expression as a stated purpose could be one way to control the narrative in public discourse. It could also temper concerns about legislative overreach, such as those raised by the Canadian Civil Liberties Association.

Interestingly, the term “moderation” and references to content moderation systems and processes are absent from Bill C-63 and its accompanying public communications, even though content moderation was a key feature of the previous Bill C-36. Perhaps this was intended to address concerns around proactive monitoring, or the perception that content moderation amounts to the arbitrary removal or de-amplification of speech.

Re-defining hate speech

The Canadian Human Rights Act amendments will likely provoke heated debate, as they reinstate a revised version of section 13, which was repealed under a prior Conservative government. Already, Conservative leader Poilievre is shaping the narrative: "What does Justin Trudeau mean when he says the words 'hate speech'? He means the speech he hates. You can assume he will ban all of that." The Liberal government addresses this claim in its backgrounder, noting that the definition of hate speech “targets only an extreme and specific type of expression, leaving almost the entirety of political and other discourse untouched.” The definitions of hate and hate speech in the amendments will likely continue to be refined and clarified, as seen in a recent media briefing. Further, it remains to be seen how dominant framings will shape the boundaries of what constitutes hate and hate speech. These definitions will have particular ramifications for marginalized communities, such as Palestinian activists and supporters, or communities who have faced structural state violence, such as Black and Indigenous communities.

The amendments also address potential misuse or abuse of this section by specifying a process for the Canadian Human Rights Commission to dismiss complaints that do not meet the legal threshold. They also set out processes to protect the confidentiality of victims and witnesses, acknowledging that justice systems are at times weaponized to intimidate and harass victims or witnesses of hate speech.

Finally, as with the proposed Criminal Code amendment, the hate speech amendment to the Canadian Human Rights Act is not directly related to regulating social media platforms; indeed, exemptions are specifically carved out for them. These exemptions contradict the central claim in the Government’s public communications that the Online Harms Act is a “law to make online platforms responsible for addressing harmful content and for creating a safer online space that protects all people in Canada, especially children.” Critics from the expert advisory committee have referred to the two amendments as “poison pills” that should be removed from the Bill.

Looking ahead

The Online Harms Act has just passed its first reading in the House of Commons. It will be debated and studied further during the second and third readings, where amendments may be proposed and voted on. If adopted, it will go to the Senate and follow a similar process. If the Bill passes both houses, it must receive royal assent from the Governor General before it becomes law.

Authors

Mandy Lau
Mandy Lau is a PhD candidate in Linguistics and Applied Linguistics at York University in Toronto, Canada. She is broadly interested in language policy and language ideology within digital culture. Her dissertation explores the regulation of harmful speech on social media. As a former public-school ...
