On February 26, 2024, the federal government introduced Bill C-63, entitled An Act to enact the Online Harms Act, to amend the Criminal Code, the Canadian Human Rights Act and an Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other Acts (the Online Harms Act). The new legislation aims to promote online safety and address rising concerns about the spread of hateful, violent, sexually intimate, and other harmful content on social media. While Bill C-63 has only received its first reading in the House of Commons and will likely undergo several changes, the first draft establishes a broad set of duties and powers for the regulation of harmful content on social media and highlights the federal government's intention to address this sensitive issue.

OBJECTIVES, DUTIES & POWERS IN THE ONLINE HARMS ACT

Bill C-63's introductory materials reinforce that the purpose of the new legislation is to promote the online safety of persons in Canada, reduce harm caused to persons in Canada as a result of harmful content online, and ensure that the operators of social media platforms are transparent and accountable with respect to their duties under the Online Harms Act.

The bulk of the new legislation can be divided into two categories: the imposition of new duties on the operators of social media services to protect their users from being exposed to harmful content; and the creation of new administrative entities and accompanying enforcement powers to administer and enforce those duties.

Duties on Social Media Operators

The substantive requirements introduced in the Online Harms Act are directed at the operators of social media services, which the legislation defines as websites or applications that are accessible in Canada and have the primary purpose of facilitating online communications among users by enabling them to access and share content. Notably, the Online Harms Act expressly clarifies that the definition of “social media services” includes both live streaming services and adult content services that offer access to pornographic content.

Specifically, among other requirements, the Online Harms Act imposes four key duties on operators of social media services:

  1. A duty to act responsibly in operating the social media service, including by implementing measures that are adequate to mitigate the risk that users will be exposed to harmful content on the service;
  2. A duty to protect children by integrating into the social media service certain required design features, which are to be set out in more detail in regulations that will accompany the Online Harms Act;
  3. A duty to make content that sexually victimizes a child or revictimizes a survivor, as well as intimate content communicated without consent, inaccessible to Canadians on the social media service in certain circumstances; and
  4. A duty to keep all necessary records to assess whether the operator of the social media service is complying with its duties under the Online Harms Act.

In the current draft of the Online Harms Act, the definition of “harmful content” is wide in scope and includes:

  • Intimate content communicated without consent;
  • Content that sexually victimizes a child or revictimizes a survivor;
  • Content that induces a child to harm themselves;
  • Content used to bully a child;
  • Content that foments hatred;
  • Content that incites violence; and
  • Content that incites violent extremism or terrorism.

In furtherance of the general duties that the Online Harms Act imposes on social media service operators, the legislation also includes specific requirements that operators must follow when operating their social media platforms in Canada. For example, the platform must have a mechanism that allows users to block other users; a system whereby users can report harmful content and receive notifications from the platform on the status of their report and any action taken by the platform in response; and a publicly available set of user guidelines that specifically address how harmful content is handled on the platform. Social media service operators are also required to submit a “digital safety plan” that contains certain information outlining how the operator is complying with its duties under the Online Harms Act. The list of specific requirements for social media service operators to follow may grow, as the draft legislation contemplates the creation of accompanying regulations that may set out additional protective design features applicable to social media platforms.

Administration and Enforcement

The other key aspect of the Online Harms Act is its establishment of several new administrative entities and the accompanying powers given to those entities to administer and enforce the legislation.

If enacted, the Online Harms Act will create the Digital Safety Commission of Canada (the Commission) with a mandate to administer and enforce the Online Harms Act, monitor the transparency and accountability of social media service operators in their compliance with the Online Harms Act, and develop online safety standards. As part of this mandate, social media users will be able to submit complaints to the Commission regarding content on a social media service that sexually victimizes a child or revictimizes a survivor, as well as intimate content communicated without consent, and the Commission will be able to make orders requiring social media service operators to render such content inaccessible on their platforms.

The Online Harms Act contemplates broad Commission authority, including the power to appoint inspectors, hold hearings, issue compliance orders to social media service operators, and impose heavy administrative monetary penalties on operators who fail to comply with their duties under the Online Harms Act or an order of the Commission. The Commission will also be authorized to provide accreditation to researchers and other entities that engage in education, advocacy and awareness activities related to the purposes of the Online Harms Act, which would allow them to gain access to electronic data included in social media service operators' digital safety plans for the purpose of furthering the objectives of the Online Harms Act.

To supplement the Commission, the Online Harms Act, if enacted, will also establish a Digital Safety Ombudsperson of Canada (the Ombudsperson) to support social media users and advocate for online safety in the public interest, as well as a Digital Safety Office of Canada that will generally support the Commission and the Ombudsperson in carrying out their duties.

KEY TAKEAWAYS & NEXT STEPS

While still in the early stages, the proposed Online Harms Act represents a significant step forward in the Canadian government's stated commitment to address the prevalence of harmful content on social media platforms and the need to protect Canadians, particularly children, from exposure to such content. Although the Online Harms Act is intended to apply across Canada, it is not the first legislation of its kind in the country. For example, in January 2024, British Columbia's Intimate Images Protection Act came into force, establishing an expedited process by which a person whose intimate images have been distributed without their consent can seek an order to stop the spread of those images in that province.

That said, questions about Bill C-63 remain, and it is likely to go through numerous changes on its way to potential passage. While many have praised Bill C-63 for its larger objectives and for the significant infrastructure it proposes to dedicate to combatting harmful online content, others have expressed concerns that the draft legislation is overbroad and vague. For example, some critics of the current draft of Bill C-63 believe that the scope of content captured under the definition of “harmful content” is unreasonably wide and raises potential freedom of expression issues. Critics have also questioned how social media platforms will implement some of the more broadly worded duties in practice: the sheer volume of content on larger social media platforms, and the “gray area” into which harmful content often falls, make it difficult and arguably unrealistic with current technological measures for a platform to catch and manage all harmful content that users may encounter.

The Cassels Entertainment & Sports Group will continue to monitor the progression of Bill C-63 and related developments in this space. For more information about Bill C-63, the Online Harms Act, and how they could affect your business, please contact any member of our group.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.