
Can the Law Keep Up with the Emergence of Neurotechnology?

Updated: Mar 26, 2021

Disclaimer: The views expressed are that of the individual author. All rights are reserved to the original authors of the materials consulted, which are identified in the footnotes below.

By Maiya Dario

Neurotechnology encompasses technologies that allow us to better understand the brain by recording and displaying patterns of brain activity. Today, a wide and rapidly expanding variety of neurotechnologies is available. Their benefits include the diagnosis and treatment of neurological conditions, “marketing analysis” and “personalised technology use.” [1] For instance, functional magnetic resonance imaging (fMRI) indirectly measures the brain’s activity, graphically displaying its patterns and determining its intensity. The brain can thereby be mapped to detect irregularities, allowing the diagnosis of neurological conditions such as Alzheimer’s disease. [2] Despite these benefits, lawyers and neuroethicists have expressed concern over the threats introduced by neurotechnology and the gaps in the law that fail to mitigate them. For instance, Kemper (2020) has warned that neurotechnology could be used to entrench totalitarian regimes by expanding state access and surveillance. [3] Hence, neuroethicists advocate for stronger regulation, one avenue being the law. This article will discuss two suggestions: that of Rainey et al. (2020) regarding the amendment of Article 9.1 of the EU General Data Protection Regulation (GDPR); [4] and that of Ienca and Andorno (2017) regarding the development of existing human rights and the introduction of new rights. [5]

The European Data Protection Regulation

The GDPR aims to give EU citizens more control over, and protection of, their data by imposing obligations on parties (e.g. corporations) regarding the processing and export of that data. [6]

Article 9.1 stipulates special categories of data, otherwise known as sensitive data, which are determined by recording purpose. These include biometric data recorded to identify a person and general data recorded to reveal information about one’s political opinions, health or sexual orientation. [7] Sensitive data is afforded a higher level of protection than non-sensitive data: it cannot be processed unless there is a legal basis under Articles 6 (“lawfulness of processing”) [8] and 9.2 (lawfulness of processing sensitive data). However, Rainey et al. (2020) argue that determining sensitive data based on recording purpose is inadequate for brain recordings. Using a brain recording involves extracting the signals relevant to a particular purpose from the general recording. Though these signals are recorded for a purpose under Article 9.1 and thereby come within its protection, the remainder of the general recording is not recorded for that purpose and can thus evade regulation. This is concerning since algorithms operating on extensive amounts of data can process or repurpose the remainders of general recordings, allowing sensitive data, such as information about one’s identity, to be revealed. [9] This concern is borne out in the growing consumer context, where third parties can reuse brain data. [10] For instance, neuromarketing firms such as EmSense and MindLab International use brain data to predict consumer preferences and even to shape or prompt them. This is demonstrated in NeuroFocus’s experiment, which “tested subliminal techniques” to induce responses that people cannot consciously comprehend, such as preferring one item over another (Penenberg, 2011). [11]

Therefore, instead of determining sensitive data based on recording purpose, Rainey et al. (2020) argue that it should be determined based on the probability that the data, once processed or repurposed, will reveal sensitive information. This is justified since brain data can reveal information as sensitive as that revealed by the special categories of data already protected under Article 9.1. Furthermore, its probability of revealing sensitive information and identifying persons once processed or repurposed is very high, especially considering algorithmic advancements and effectively unlimited data storage (e.g. the cloud). [12]

New human rights: “neurorights”

International human rights law does not explicitly provide for neuroscience, unlike other biomedical developments such as genetic technology, for which the Universal Declaration on the Human Genome and Human Rights was adopted. That declaration aims to prevent genetic information from being used in ways that breach human rights. [13] Consequently, Ienca and Andorno (2017) argue that the increasing availability and significance of neurotechnology require “further developments” of traditional rights or the introduction of new rights to address the challenges it poses. [14]

Cognitive liberty is the principle that guarantees the right to use neurotechnology to alter “one’s mental states” and the right to refuse to do so (Bublitz, 2013). This is essential since control over one’s own mind is a necessary foundation for other liberties, including the following: [15]

Right to Mental Privacy

Article 8.1 of the EU Charter of Fundamental Rights (CFR) provides “the right to the protection of” one’s “personal data.” [16] Traditional privacy laws such as this aim to protect people’s external information. However, brain data is distinctive in that it is not purely external. Since it is not easily separable from a person’s neural activity, it can be linked back to the person from whom it originated. For instance, electroencephalogram-recorded signals, which use electrodes to record the brain’s electrical activity, can serve as a unique biometric identifier, similar to DNA. [17] Furthermore, people engage in involuntary, subconscious processes whenever they use their cognitive abilities, so brain data from these processes is recorded without the person’s awareness. This conflicts with Article 8.2 of the CFR, which provides that data must be processed on the basis of consent, among other legitimate bases. [18] Giving informed consent becomes difficult since much of the brain data that will be recorded and used is unknown to the person. Hence, Ienca and Andorno (2017) argue that today’s data protection and privacy rights cannot accommodate these challenges introduced by neurotechnology. They therefore advocate for this right, which protects people from the illegitimate access and dissemination of their brain data. [19]

Right to Mental Integrity

Article 3 of the CFR provides the right to physical and mental integrity. Under this, mental integrity is understood as protection from mental illness, guaranteeing people with mental conditions the right to access mental health services. [20] However, Ienca and Andorno (2017) argue that Article 3 should develop to protect people’s mental wellbeing from possible harms, including those that threaten their autonomy and physical wellbeing. [21] Neurotechnology could be hacked, altering a person’s neural processes without their consent (otherwise known as brainjacking) and resulting in potential harm. This concern is demonstrated in brain stimulation. For instance, consumer transcranial direct current stimulation, which uses direct electrical currents to modulate neuronal activity and stimulate parts of the brain, is designed to function safely within a certain frequency range. However, third parties can still manipulate the frequency since there are insufficient safeguards to prevent them from doing so. [22]

Right to Psychological Continuity

Articles 22 and 29 of the Universal Declaration of Human Rights provide the right to have and develop a personality. [23]

Psychological continuity is the requirement that personal identity remain constant (Klaming and Haselager, 2013). [24] Parts of the brain responsible for emotions and behaviour can be altered without one’s consent (or brainjacked), affecting one’s personality. One instance of this is neuromarketing, specifically NeuroFocus’s experiment, as mentioned above. Another, more extreme instance is new forms of brainwashing. Holbrook et al. (2016) used transcranial magnetic stimulation (TMS), which uses magnetic fields to stimulate the brain, to alter regions that govern “social prejudice and political and religious beliefs.” By temporarily shutting down the posterior medial frontal cortex, they demonstrated that TMS could be used to prompt a wide variety of alterations in one’s attitudes and beliefs. [25] Thus, this right aims to preserve a person’s personal identity and the coherence of their behaviour from unconsented alteration by third parties. [26]


In conclusion, the current law cannot keep up with the emergence of neurotechnology: gaps in existing frameworks leave its challenges and threats unaddressed. Reform is therefore needed for the law to keep pace.


[1] Roberto Andorno and Marcello Ienca, ‘Towards new human rights in the age of neuroscience and neurotechnology’ (2017) 13(5) Life Sciences, Society and Policy 1, p.23

[2] Cited above at n.1, p.3

[3] Carolin Kemper, ‘Technology and Law Going Mental: Threads and Threats of Brain-Computer Interfaces’ (verfassungsblog, 31 August 2020) <> accessed 20 February 2021

[4] Simi Akintoye, Christoph Bublitz, Tyr Fothergill, Kevin McGillivray, Stephen Rainey and Bernd Stahl, ‘Is the European Data Protection Regulation sufficient to deal with emerging data concerns relating to neurotechnology?’ (2020) Journal of Law and the Biosciences, 1

[5] Cited above at n.1

[6] Ben Wolford, ‘What is GDPR, the EU’s new data protection law?’ (, n.d.) <> accessed 20 February 2021

[7] intersoft consulting, ‘Art. 9 GDPR: Processing of special categories of personal data’ (, n.d.) <> accessed 21 February 2021

[8] intersoft consulting, ‘Art. 6 GDPR: Lawfulness of processing’ (, n.d.) <> accessed 21 February 2021

[9] Cited above at n.4, pp.14-15

[10] Cited above at n.4, p.13

[11] Adam L. Penenberg, ‘NeuroFocus uses neuromarketing to hack your brain.’ (, 8 August 2011) <> accessed 22 February 2021

[12] Cited above at n.4, pp.17-19

[13] Cited above at n.1, p.8

[14] Ibid.

[15] J.C. Bublitz. ‘My Mind is Mine!? Cognitive Liberty as a Legal Concept.’ In: Hildt E, Franke AG, eds. Cognitive Enhancement. An Interdisciplinary Perspective. (2013) Dordrecht: Springer p. 233–64.

[16] European Union Agency for Fundamental Rights, ‘Article 8- Protection of personal data’ (, n.d.) <> accessed 20 February 2021

[17] Cited above at n.1, p.14

[18] Cited above at n.16

[19] Cited above at n.1, p.15

[20] European Union Agency for Fundamental Rights, ‘Article 3- Right to integrity of the person’ (, n.d.) <> accessed 22 February 2021

[21] Cited above at n.1, p.18

[22] Cited above at n.1, p.19

[23] United Nations, ‘Universal Declaration of Human Rights’ (, n.d.) <> accessed 23 February 2021

[24] Laura Klaming and Pim Haselager, ‘Did My Brain Implant Make Me Do It? Questions Raised by DBS Regarding Psychological Continuity, Responsibility for Action and Mental Competence.’ (2013) 6(3) Neuroethics 527–539

[25] Choi Deblieck, Daniel M.T. Fessler, Colin Holbrook, Marco Iacoboni and Keise Izuma, ‘Neuromodulation of group prejudice and religious belief.’ (2016) 11(3) Social Cognitive and Affective Neuroscience 387–394

[26] Cited above at n.1, p.21
