The ongoing expansion of consumer neurotechnology (or "neurotech") is raising questions about privacy and ownership of one's thoughts, as well as about what will happen when technology moves beyond merely influencing humans and into the realm of control.
Last year, a group of McGill students built a mind-controlled wheelchair in just 30 days.1 Brain2Qwerty, Meta's neuroscience project that translates brain activity into text, claims to allow users to "type" with their minds.2 Neuralink, a company founded by Elon Musk, is beginning clinical trials in Canada of a fully wireless, remotely controllable device inserted into a user's brain.3 This comes several years after the company released a video of a monkey playing video games with its mind using a similar implantable device.
In this bulletin, we explore some legal considerations that could arise as neurotech becomes increasingly accessible.4
What is Neurotech?
Neurotech refers to technology that records, analyzes or modifies the neurons in the human nervous system. Neurotech can be broken down into three subcategories:
- Neuroimaging: technology that monitors brain structure and function;
- Neuromodulation: technology that influences brain function; and
- Brain-Computer Interfaces or "BCIs": technology that facilitates direct communication between the brain's electrical activity and an external device, sometimes referred to as brain-machine interfaces.5
In the medical and research context, neurotech has been deployed for decades in one form or another. Neuroimaging techniques such as EEG, MRI and PET have been used to study and analyze brain activity.6 Neuromodulation has also been used to treat various conditions, for example through deep brain stimulation for Parkinson's disease7 and cochlear implants.8 However, the potential for applications of neurotech beyond medical devices is a newer development, accelerated by the arrival of less intrusive neurotech devices and innovations in artificial intelligence.
Legal Considerations for Neurotech
The following are some of the novel legal questions that we foresee arising from the expansion of consumer-facing neurotech. Besides apparent product liability issues that may arise with any consumer product,9 neurotech could have implications for medical device regimes, privacy and data protection, intellectual property, as well as the laws of evidence.
1. Health Canada Oversight
Any neurotech launched in Canada, whether as a medical device or as a consumer product, will be subject to Health Canada's existing regulatory frameworks.10
If a neurotech product constitutes a "device" under the Food and Drugs Act,11 it will be subject to Health Canada's medical device regime. "Device" is defined broadly and would apply to nearly all neurotech intended for medical uses.12 Indeed, Health Canada has already issued Class II medical device licences for neurotech earbuds, caps, and associated accessories used for health purposes.13
Medical devices are classified into four risk-based categories (Class I-IV), with higher classes requiring more rigorous approval. Health Canada will view a device as "higher risk" to the extent it is more invasive (penetrating the body) and active (requires a source of energy other than that provided by the human body).14 Neurotech may be seen as higher risk if it requires surgical implantation and uses external energy sources. All manufacturers and distributors of medical devices must obtain a Medical Device Establishment Licence (MDEL), and any Class II-IV devices require a Medical Device Licence (MDL) before being marketed.15
Neurotech devices used for entertainment, productivity, or other purposes would not necessarily be captured by Health Canada's medical device regime. However, any neurotech sold to consumers would still be subject to Health Canada's regulatory oversight as a "consumer product" under the Canada Consumer Product Safety Act ("CCPSA").16 A "consumer product" is also broadly defined and includes any product that may be obtained by a consumer for non-commercial purposes, including domestic, recreational and sports purposes.17 While the CCPSA does not have neurotech-specific regulations, it contains a general prohibition against any consumer product that poses "a danger to human health or safety."18
If it constitutes a consumer product or medical device, neurotech would be subject to the regulatory oversight of Health Canada, including the obligation to recall any device or product that may present a risk of injury to health.19
2. Privacy and Data Protection
Neurotech relies on raw brain data and information derived from it (collectively, "neural data"), which may raise concerns under Canadian privacy and data protection laws.
Neural Data as Sensitive Personal Information: Neural data would likely constitute "personal information" under Canadian privacy laws, which is commonly defined as "information about an identifiable individual."20
Neural data would also likely be seen as sensitive personal information, given the potential risks to individuals associated with processing it. Sensitive information comes with higher expectations regarding consent, appropriateness, and data protection.21
Currently, neurotech products have only been able to extract simple signals from raw brain data, such as blood flow to certain regions of the brain (fMRI) or intended hand movements based on electrical signals from the brain (BCI). However, spurred by recent enhancements in artificial intelligence, many have speculated that neurotech could one day detect thoughts, emotions and complex intentions. Raw brain data should therefore be regarded as highly sensitive: while current algorithms may have limited interpretive power, future advancements could unlock deep insights into thoughts, emotions, intentions, and even unconscious biases, making such data a potential goldmine for profiling, manipulation or blackmail.
Therefore, we would expect privacy regulators to apply increased scrutiny to the processing of neural data. For instance:
- Organizations would likely be required to obtain express consent prior to collecting, using or disclosing neural data.
- Organizations would need to demonstrate a serious and legitimate need to process neural data, that is proportionate to the sensitivity of the data.22 For example, it is highly unlikely that a regulator would permit a company to use neural data for advertising purposes.23
- Organizations would be expected to protect neural data with robust safeguards and to delete it when no longer needed. Security is especially important given the novel cyber-security risk of "brainjacking", where a neuromodulation device is hacked by malicious actors who may use it to induce certain moods, decisions, or actions.24
Neural Data as Biometric Information: Depending on the nature of neural data, it may also be considered biometric information. The Office of the Privacy Commissioner of Canada ("OPC") has defined "biometrics" as referring to a range of techniques, devices and systems that enable machines to recognize individuals, or confirm or authenticate their identities.25 Biometrics are subject to increased obligations under Canadian privacy laws, including express consent and database disclosure requirements under Quebec's Act to establish a legal framework for information technology.26
While most neurotech currently in the public view is not used to identify individuals, many applications of neurotech involve training a computer system on a user's specific neural activity. These systems may therefore be able to identify users as a consequence of their design. This could lead to a regulator determining that neural data constitutes biometric information, which comes with numerous regulatory requirements.
3. Intellectual Property
As neurotech continues to advance, it is possible that it will be able to make sense of complex, subconscious data such as dreams. This will present a host of novel IP challenges, which stem from the unique nature of the data being captured, the potential for the technology to generate new insights, and the fundamental questions about ownership and rights in a realm where personal thoughts become part of the technological process.
Ownership of Summarized Data: When neurotech is able to capture subconscious thoughts, it will likely process this data into summaries that reflect aspects of an individual's mental state. The ownership of such summaries, however, can become contentious. On the one hand, it could be argued that the individual, as the originator of their thoughts, should own the summaries. On the other hand, one could argue that the summaries would not exist but for the processing done by the technology and hence the summaries should not be owned (or exclusively owned) by the individual. The challenge may be in determining whether the summary is a transformation of the data that makes it the product of the technology, or whether it remains simply a condensed version of the individual's thoughts, in which case it makes sense for the individual to retain ownership.
Ownership of Creative Outputs: The situation becomes more complicated if the neurotech produces creative outputs based on the subconscious thoughts captured by the technology. For example, if the neurotech uses subconscious imagery or emotions to create art, music, or other works, who owns the rights to these works? Is the individual whose thoughts were analyzed the creator of the work, or does the technology, which has facilitated and interpreted those thoughts, hold some ownership? This issue is especially pertinent in a world where AI-generated creations are already challenging traditional ideas of IP ownership. For example, in many jurisdictions, ownership of copyrightable works is tied to the individual who conceived them.27 Uncertainty can arise in cases where works are created with neurotech, where the individual whose thoughts are captured may not be aware of the process, or their thoughts may have been altered or combined with other information to produce the works. These uncertainties could have significant implications for IP ownership, compensation, and the extent to which individuals can control or profit from the thoughts embedded in their own subconscious minds.
Unintended Disclosure of Confidential Data: Neurotech could potentially create a loophole in traditional methods of safeguarding confidential information. In a corporate setting, confidentiality agreements and security safeguards are designed to protect trade secrets and confidential information. However, the ability to access thoughts directly bypasses these traditional methods. If an individual's subconscious contains sensitive business information – whether from years of working on a project or half-formed ideas – that information, once captured by neurotech, could be leaked due to, for example, technological malfunctions or security breaches. Companies would face significant challenges in protecting their confidential information if it is at risk of being unintentionally disclosed through neurotech.
4. Evidence
In both civil disputes and criminal prosecutions, external evidence is produced to prove a person's intentions and motives. Witnesses are called to testify on matters based on their own memory. Neurotech and neural data may serve as a more direct source of evidence to prove someone's memory, state of mind or character. However, using brain data to judge someone's character gives rise to serious human rights concerns related to profiling individuals based on inherent characteristics outside of their control.
Admissibility: Although neurotech may provide a "look into one's mind," it remains an open question as to whether a court would admit neural data as evidence. We might look to polygraph tests as an example of how this debate could play out.
Canadian courts have held that polygraph results are not admissible to determine credibility because they contravene various rules of evidence: the rule against oath-helping, which prohibits a party from presenting evidence solely to bolster a witness' credibility; the rule against the admission of past or out-of-court statements by a witness; and the character evidence rule, which restricts the Crown from putting the character of the accused at issue.28 Moreover, courts have emphasized that polygraph results should not be introduced as expert evidence because "[t]he issue of credibility is an issue well within the experience of judges and juries and one in which no expert evidence is required."29 Even if introduced for other purposes, neurotech's scientific novelty may make it subject to special scrutiny regarding its reliability and whether it is essential in the sense that the trier of fact will be unable to come to a satisfactory conclusion without the assistance of the expert.30
Is a Bespoke Neurotech Law Warranted?
Does Canada need a separate legal framework to address new challenges arising from neurotech? Many issues pertaining to neurotech may be addressed under existing legal frameworks. However, it is safe to say that there are some "gaps" that may need to be filled. This is reminiscent of the current debate on whether a bespoke AI law is needed.31
Some have called for a new form of human rights, "neurorights," to be recognized by law.32 Neurorights might include rights related to personal identity, free will and mental privacy; equal access to mental augmentation; and protection from algorithmic bias. In 2021, Chile became the first country to recognize such rights, enshrining the right to "mental privacy, free will and non-discrimination in citizens' access to neurotech" in its constitution.33 More recently, in 2024, the U.S. states of Colorado and California enacted laws providing privacy protections specifically for data generated from the brain.34
Conclusion
The state of consumer-oriented neurotech is evolving quickly, as are its possible applications and legal risks. McMillan LLP will continue to monitor these developments and provide relevant insights to help you navigate the legal and regulatory landscape in Canada.
Footnotes
1 François Shalom, "Meet the brains behind McGill's mind-controlled wheelchair", McGill Reporter (May 7, 2019), available here.
2 Luis E. Romero, "Meta's Mind Reader: Brain2Qwerty Translates Thoughts Into Text", Forbes (February 19, 2025), available here.
3 "Join Neuralink's Patient Registry: Current Clinical Trials in Canada", Neuralink, available here.
4 This bulletin is based on some of neurotech's hypothetical applications and our current legal framework in Canada. As we discuss later in this bulletin, it is possible that new law could emerge that is specifically tailored to this technology.
5 "Brief of the Scientific Advisory Board on: Neurotechnology", United Nations Secretary-General's Scientific Advisory Board (January 14, 2025), available here.
6 Taryn Bosquez, "Neuroimaging: Three important brain imaging techniques", ScIU (Indiana University Bloomington) (February 5, 2022), available here.
7 "Deep brain stimulation", Mayo Clinic, available here.
8 "Cochlear Implants", International Neuromodulation Society (November 22, 2021), available here.
9 Generally, it is recognized that the manufacturer owes a duty of care not to injure the consumer: Mustapha v. Culligan of Canada Ltd., 2008 SCC 27 (CanLII).
10 In addition to being subject to Health Canada's regulatory oversight, any product with wireless or radiofrequency functionality will be subject to the radiofrequency device certification and labelling requirements contained in the Radiocommunication Act, RSC 1985, c R-2 and the Radiocommunication Regulations, SOR/96-484.
11 Food and Drugs Act, RSC 1985, c F-27 [Food and Drugs Act].
12 Section 2 of the Food and Drugs Act defines a "device" to include any instrument, apparatus, contrivance or other similar article that is manufactured, sold or represented for use in diagnosing, treating, mitigating or preventing a disease, disorder or abnormal physical state, or any of their symptoms, in human beings or animals, or in restoring, modifying or correcting the body structure of human beings or animals or the functioning of any part of their bodies.
13 See active licence listing for "NeuroCatch" for instance, available here.
14 Health Canada "Guidance Document – Guidance on the Risk-based Classification System for Non-In Vitro Diagnostic Devices (non-IVDDs)" (June 12, 2015).
15 "About medical devices", Government of Canada (January 27, 2020), available here.
16 Canada Consumer Product Safety Act, SC 2010, c 21.
17 The Canada Consumer Product Safety Act, SC 2010, c 21 defines a "consumer product" as "a product, including its components, parts or accessories, that may reasonably be expected to be obtained by an individual to be used for non-commercial purposes, including for domestic, recreational and sports purposes, and includes its packaging."
18 Canada Consumer Product Safety Act, SC 2010, c 21, ss 7-8.
19 See Health Canada's "A guide for voluntary recall of consumer products or cosmetics in Canada" (last modified July 21, 2023) and "Guide for recalling medical devices (GUI-0054): Recall process" (last modified January 22, 2025).
20 For instance, Personal Information Protection and Electronic Documents Act, SC 2000, c 5, s 2(1).
21 California and Colorado legislators recently identified neural data as sensitive under their respective privacy regimes, meaning additional obligations apply when handling such information: Michelle R. Bowling, Dan Jasnow, and D. Reed Freeman Jr., "California and Colorado Establish Protections for Neural Data", ArentFox Schiff (Oct 11, 2024), available here. Rather than a finite list, Canadian privacy laws recognize that any personal information may be sensitive depending on the context and the potential risks to individuals related to the processing of such data: Office of the Privacy Commissioner of Canada ("OPC"), Interpretation Bulletin: Sensitive Information (May 16, 2022), available here. Notably, health data is generally considered to be sensitive.
22 OPC, Guidance on inappropriate data practices: Interpretation and application of subsection 5(3) (May 24, 2018), available here.
23 See for instance, PIPEDA Findings #2022-001, in which the OPC held that targeted advertising was not an appropriate purpose for collecting granular location data, given its sensitivity.
24 Laurie Pycroft, "Brainjacking – a new cyber-security threat", University of Oxford (2016), available here.
25 OPC, Data at Your Fingertips Biometrics and the Challenges to Privacy (Feb 2011), available here.
26 Act to establish a legal framework for information technology, CQLR c C-1.1, ss. 44 and 45.
27 See, for example, Copyright Act, RSC 1985, c C-42, s 13(1).
28 R. v. Béland, 1987 CanLII 27 (SCC), [1987] 2 SCR 398.
29 Ibid.
30 R. v. Mohan, 1994 CanLII 80 (SCC), [1994] 2 SCR 9.
31 We note that the government of Canada's proposed Artificial Intelligence and Data Act died on the order paper in January, along with the rest of Bill C-27. It is yet to be seen whether Canada's next federal government will bring back the proposed legislation or continue to leave the issue of AI management up to voluntary corporate commitments, like ISED's Voluntary Code of Conduct.
32 Sergio Ruiz et al., "Neurorights in the Constitution: from neurotechnology to ethics and politics", Philosophical Transactions of the Royal Society B (October 21, 2024), available here.
33 Lorena Guzmán H., "Chile: Pioneering the protection of neurorights", The UNESCO Courier (March 21, 2022), available here.
34 Jonathan Moens, "Your Brain Waves Are Up for Sale. A New Law Wants to Change That.", The New York Times (April 17, 2024), available here; Jessica Hamzelou, "A new law in California protects consumers' brain data. Some think it doesn't go far enough.", MIT Technology Review (October 4, 2024), available here.
The foregoing provides only an overview and does not constitute legal advice. Readers are cautioned against making any decisions based on this material alone. Rather, specific legal advice should be obtained.
© McMillan LLP 2025