Open access peer-reviewed chapter - ONLINE FIRST

Perspective Chapter: Making Space for Neuro Rights in the Context of Brain-Computer Interfaces: One Small Step for Human Rights, One Giant Leap for Mankind

Written By

Marietjie Botes

Submitted: 02 June 2023 Reviewed: 01 August 2023 Published: 23 November 2023

DOI: 10.5772/intechopen.112737


From the Edited Volume

New Insights in Brain-Computer Interface Systems [Working Title]

Dr. Nasser H Kashou


Abstract

Brain-Computer Interfaces (BCIs) are compelling technologies that allow bidirectional communication and control between the human brain and the outside world (via a computer) by exchanging brain activity data. Although admired for their clinical promise, they raise novel ethical and legal issues. Legal debates centre on patient autonomy, equity, data protection and security, dehumanization, the limitations of machine learning-based decision-making, and the influence of BCIs on human rights, including whether or not we need new rights to protect our neuro data and mental privacy. This chapter reconsiders existing rights and weighs up the argument for sui generis rights.

Keywords

  • human rights
  • brain-computer interfaces
  • privacy
  • autonomy
  • neuro rights
  • neuro data
  • personal data

1. Introduction

Brain-Computer Interfaces (BCIs) are neuro-technologically enabled devices that allow two-way communication between an individual’s brain and external systems, such as computers. This bidirectional connection enables two kinds of uses. First, it enables the acquisition of neuronal data produced by an individual’s neural activity and their transmission to a computer for processing and analysis [1]. Second, it can facilitate the provision of stimuli to, or the inhibition of, brain activity to regulate abnormal impulses or improve motor actions at a neuronal level [2]. Such neuro technologies encompass a wide variety of devices that can be either invasive (implanted) or non-invasive, and either therapeutic or non-therapeutic.

For example, a research team in Singapore has recently developed an artificial intelligence (AI)-enabled scanning machine that uses a basic mind-reading technique to decode brain scans and reproduce images that a person is mentally picturing [3]. Brain signals, collected from participants while they viewed a varied dataset of 160,000 images for 9 s each, were fed through an AI model called “MinD-Vis” to train it to associate certain brain patterns with particular image features such as color, shape, texture, and semantics, thereby reading and reconstructing images from participants’ minds. This technology, which has been referred to as a “mini GPT for the brain” [4], could not only potentially transform the communication abilities of people with disabilities, allowing them to convey messages purely by using their minds, but it also raises novel ethical and legal issues when one considers how these technologies may evolve and be used in future [5].
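
To make the decoding pipeline described above more concrete, the sketch below illustrates the general idea of training a decoder that maps brain activity to image features and then retrieving the closest matching image. It is a minimal, hypothetical illustration in Python: the simulated data, model choice (a ridge regression), array shapes, and variable names are assumptions for exposition only and do not reflect the actual MinD-Vis architecture or dataset.

```python
# Illustrative sketch only: a linear decoder mapping brain activity to image
# features, in the spirit of (but far simpler than) the MinD-Vis pipeline.
# All data, shapes, and names below are hypothetical assumptions.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical paired data: one brain-activity vector per viewed image, and one
# feature vector (standing in for colour, shape, texture, semantics) per image.
n_trials, n_voxels, n_image_features = 1000, 2000, 256
brain_activity = rng.standard_normal((n_trials, n_voxels))
image_features = rng.standard_normal((n_trials, n_image_features))

X_train, X_test, y_train, y_test = train_test_split(
    brain_activity, image_features, test_size=0.2, random_state=0)

# Learn a mapping from brain patterns to image features.
decoder = Ridge(alpha=10.0).fit(X_train, y_train)
predicted_features = decoder.predict(X_test)

# "Reconstruction" is reduced here to retrieval: find the known image whose
# features are most similar to the features predicted from brain activity.
candidate_features = y_train                      # gallery of known images
similarity = predicted_features @ candidate_features.T
best_match = similarity.argmax(axis=1)
print("Closest candidate image for the first test trial:", int(best_match[0]))
```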

Traditionally, BCI technologies have been used for medical purposes, including the treatment of neurodegenerative diseases such as Alzheimer’s disease and disorders of the autonomic nervous system [6], the detection of anomalous neural activity [7], and speech or spelling systems in which the BCI analyzes the user’s neural activity to determine each spelled letter [8]. Due to technological advancements, however, the application field of BCI technologies has expanded beyond medicine. Studies now explore the military uses of BCIs, including the telepathic handling of multiple drones at a distance [9] and even exoskeletons [10]. BCIs are also used to monitor soldiers’ mental states to assess their cognitive and emotional responses to complex situations, including investigations into the possible augmentation of soldiers’ mental capabilities in physical, cognitive, and emotional dimensions [11]. These emerging technologies and their applications present considerable engineering challenges, such as the miniaturization of BCI implants to reduce damage caused by surgical procedures, increasing the resolution and coverage of the brain [12], allowing people to access the Internet with their minds [13], or even allowing direct mental communication between people [14], but they also present novel legal and ethical challenges.

Currently, the legal-ethical requirement of consent prevents anyone from forcibly reading another person’s mind or collecting their neuro data, but advancements in the sophistication of neuroscience, AI, software, and devices raise the risk that these technologies may in future be able to enter a person’s mind without permission, or even without the person being aware of it. Such possible future intrusions into one’s privacy, the influence that these technologies may have on one’s autonomy and on the control a person has over his or her own life, and the security of neuro data against hackers are just some of the many emerging legal-ethical issues that compel us to rethink the effectiveness of our existing human rights and to ask whether these challenges require sui generis rights. In this chapter, I consider this question by discussing three human rights-based aspects most affected by BCIs: (1) data privacy; (2) identity; and (3) cybersecurity.


2. “Brain rights”

Human rights, as we know them today, were created by instruments such as the United Nations Universal Declaration of Human Rights (UDHR) [15] and the International Covenant on Civil and Political Rights (ICCPR) [16], well before the existence of today’s neuro technologies that can access deeply personal information, invade private spaces, apply computational analysis, and even alter our brain functions via neurofeedback or neurostimulation. It is thus understandable that these instruments do not explicitly deal with issues such as requirements for accessing brain data, or for intervening with a wide variety of neuro technologies in ways that preserve autonomy and human dignity [17]. Although some articles of the UDHR, such as Article 18, implicitly refer to the protection of the brain and mental sphere by protecting the right to “freedom of thought”, the Declaration does not deal with any specific risks brought about by neuro technologies or provide a holistic framework for the protection of the human brain and mind [15]. To provide some guidance in this regard, the Council of Europe’s Committee on Bioethics commissioned a report addressing “Common Human Rights Challenges raised by different applications of Neuro Technologies”, issued in October 2021 [18]. Although neuro rights are actively being considered by numerous civil society organizations, only Chile has enshrined the right to neuroprotection in its national constitution, and it is currently developing more detailed legislation for the protection of neuro rights [19]. The Human Rights Council in Geneva adopted a resolution in October 2022 requesting its Advisory Committee to investigate and prepare a study on the impact, opportunities, and challenges of neuro technology with regard to the promotion and protection of all human rights, to be presented to the Council at its 57th session in September–October 2024 [20]. Some compare the rise of neuro technologies to the development of the atomic bomb and the creation of the International Atomic Energy Agency (IAEA) as an international control system for nuclear material to prevent the risk of a catastrophic war; on this analogy, the ultimate goal appears to be the creation by the United Nations (UN) of an international neuro technology agency to oversee the development of neuro technology within the human rights framework [4]. To ascertain how neuro technologies may impact human rights, it is important to first establish what exactly they can do. For the time being, O’Shaughnessy et al. have identified three broad themes of relevance: (1) safety and privacy; (2) equity and justice; and (3) agency, autonomy, and identity [21].

2.1 Data privacy

Data derived, detected, or collected from one’s brain through the use of neuro technologies are arguably far more sensitive and intimate than other biometric or health data, because of the interpretation of one’s mental state based on such data, or the inferences drawn about one’s future, including predictions of one’s future cognitive capacity and neurological illness [22]. The potential application of such “neuro data” has sparked interest in the evolving landscape of the global data economy [23], but its implications, especially in the field of informational privacy, have raised significant concerns about the need for new ethical and regulatory frameworks [24]. Some are of the opinion that existing laws, regulations, and ethical guidelines implicitly deal with neuro data and offer sufficient protection from the risks that neuro technologies pose to human rights [25]. However, existing laws that explicitly deal with health information and data privacy do not even cover standard uses of neuro data, let alone the processing of sensitive neuro data. In this context, all data generated from neuro technology systems should be regarded as personal data as contemplated in Article 4 of the General Data Protection Regulation (GDPR), Article 1 of the 2013 Organization for Economic Co-operation and Development (OECD) Privacy Guidelines, and Article 2(a) of the Council of Europe’s Modernized Convention for the Protection of Individuals with regard to the Processing of Personal Data [26]. I shall thus consider prominent aspects of the regulation of neuro data as personal data as provided for in the GDPR.

When personal data are anonymized, the GDPR no longer applies, which means that the consent of the data subject is no longer required for the processing, including the sharing, of such data [17]. This is also true for neuro data. However, despite technological advancements in anonymization, the re-identification of supposedly anonymized personal data is increasingly becoming a reality. This is even more so in the case of brain data, because the technologies involved in processing brain data produce high informational richness and contextualization. Researchers have demonstrated that it is theoretically feasible to re-identify a person purely on the basis of his or her electrophysiological measurements or neuroimaging data, which may even enable the prediction of current emotional states and future behavior and the decoding of sensitive information from either a person’s neural activity or his or her digital phenotypes [27].
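
A minimal sketch of why re-identification is feasible: if each individual’s brain signals carry a stable “signature”, a standard classifier trained on feature vectors from known recording sessions can match a new, nominally anonymized session back to its owner. The data below are simulated, and all names and parameters are illustrative assumptions rather than the methods used in the cited studies.

```python
# Illustrative sketch only: re-identifying individuals from "anonymized"
# brain-signal features. Data are simulated; in real studies, EEG or
# neuroimaging features play the role of the per-subject signatures.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(1)
n_subjects, sessions_per_subject, n_features = 20, 10, 64

# Simulate a stable per-subject "signature" plus session-to-session noise.
signatures = rng.standard_normal((n_subjects, n_features))
X, y = [], []
for subject in range(n_subjects):
    for _ in range(sessions_per_subject):
        X.append(signatures[subject] + 0.5 * rng.standard_normal(n_features))
        y.append(subject)
X, y = np.array(X), np.array(y)

# Hold out the last session of every subject and try to re-identify them.
test_mask = np.tile(np.arange(sessions_per_subject) == sessions_per_subject - 1,
                    n_subjects)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X[~test_mask], y[~test_mask])
print("Re-identification accuracy:",
      accuracy_score(y[test_mask], clf.predict(X[test_mask])))
```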

Health data are defined in Article 4(15) of the GDPR as data “related to the physical or mental health of a natural person”. Consequently, data derived from a person’s brain or mind may be classified as health data, because of the inferences that can be made about a person’s physical and mental health. In addition, acknowledging that biometric data are considered so sensitive that they justify classification as a “special category of data” in terms of the GDPR, neuro data similarly seem sufficiently sensitive to be regulated under the GDPR as biometric data. The influential Article 29 Working Party expanded these definitions by including information about a person’s intellectual or emotional capacity [28]. The sensitivity of such data, as well as the fact that they have been grouped together with health data, implies that they require the most extensive form of protection available. But not all brain data can always be considered health data and treated as sensitive data as contemplated in the GDPR. Brain data generated by consumer neuro technologies may not qualify as “health data”, and the devices generating such data may similarly not qualify for regulation under medical device regulatory regimes, which leaves such data with far less protection than data generated from clinical applications [29]. Brain data collected via non-clinical neuro technologies are thus underprotected and vulnerable to violations of mental privacy.

Although some healthcare privacy laws provide guidance for certain uses of brain-based data, the Health Insurance Portability and Accountability Act (HIPAA), for example, only regulates specific entities that process and store data, such as hospitals, and then limits what those entities may do with such data, whilst entities other than those providing healthcare services, such as device makers (both medical and non-medical), fall outside the regulatory ambit of HIPAA. Neuro data collected as consumer data will accordingly not be protected. Whilst fundamental rights against discrimination may provide consumers of neuro technologies outside of clinical use with partial protection, such data may still be used to gain cognitive insights into human behavior that can inform targeted marketing strategies [30]. In the wake of scientific advancements in the field of genetics, the US enacted the Genetic Information Nondiscrimination Act (GINA), which prohibits discrimination in the context of health insurance or employment based on genetic information, to protect this special kind of biological personal information [31]. Despite dealing exclusively with genetic information, this legislative development illustrates the need for data-type-specific legislation and may serve as an example of how to address unique scientific or technological developments and protect people’s fundamental rights in view thereof.

Article 5(1)(b) of the GDPR further provides an exemption from its general rules when personal data are processed for research or statistical purposes. Considering that brain data may be obtained and processed under the auspices of research by private entities, such as consumer neuro technology companies, brain data may not be sufficiently protected in these circumstances. The concern is that brain data are, and will increasingly be, used by government agencies and consumer companies appealing to the research and statistical exemption provided for in the GDPR. Furthermore, some of the “research” activities conducted by these actors may not be subject to Ethical Review Board approval, which, in clinical research settings, usually serves as a safeguard for the protection of participants’ interests. For this reason, Ienca et al. advocate for greater transparency around the intended purposes of research and for allowing people to intervene in any further processing of their personal information if such processing contradicts their personal values or if the purpose of the research is undesirable to them [26]. Transparency and description of the research purpose also link closely to the normative principle of purpose limitation. Sensitive personal information, such as health data, may only be collected and processed for clearly specified purposes, which must be clarified with the participant beforehand to enable the participant to provide properly informed consent. In this regard, the GDPR explicitly provides for a purpose limitation that prohibits the further processing of personal data if such processing is incompatible with the specific, explicit, and legitimate purposes for which the data were collected in the first place [32]. However, considering the methods, both existing and future, used by neuro technologies to obtain brain data, practical implementation of the purpose limitation requirement may be exceptionally difficult, because neuro technologies cannot pre-emptively discern purpose-specific data from the myriad of brain signals recorded by the device (a signal-to-noise problem), including signals reflecting subconscious processes. Large amounts of information that fall outside the ambit of the intended purpose will necessarily be collected collaterally with the purposefully targeted information, whilst the participant remains, to a large extent, unaware of it.

Although neuro data can be classified as biometric data in terms of the GDPR, the GDPR in this context primarily focuses on whether an individual can be identified from this type of data, as opposed to considering the impact that inferences drawn from a person’s neuro data about that individual’s interests, preferences, health status, or psychology may have on that person. Further considerations include how such inferences may be used to affect the decisions that people make online, how participants may be convinced to give consent, the circumstances in which such sensitive information may be sold to third parties for advertising or other purposes, and how content or experiences may be tailored in ways that may cause addictions in some individuals based on their psychological status, especially if brain data or inferences from them are used to influence people’s commercial, social, or political behavior. Moreover, current regulations do not protect people against discrimination based on neurological inferences, nor do they regulate the use of neuro data or neuro technologies by law enforcement.

Concepts such as “mental integrity” remain vague and broad, which may challenge courts when it comes to crafting orders for the enforcement (or not) of rights relating to them [33]. Concerns around terminology were also raised during the drafting of the Chilean bill referred to above, especially ontological and definitional concerns about the relationship between brain and mental states, about what counts as “neuro data”, and about whether it includes both patterns of neuronal activation and morphological data. The bill’s definition of neuro technologies further focuses on devices that read, record, or modify brain activity from the central nervous system (CNS), excluding technologies that connect with the peripheral nervous system. Because this provision is drafted so broadly that it may “protect” individuals to the extent of interfering with established medical treatments, some Chilean scholars have expressed concerns about how the intended legislation, whilst trying to protect people’s neuro data, may inadvertently harm medical care and research [34]. These scholars argue that if these terms are interpreted literally, they may inadvertently undermine effective treatments for neurological and psychiatric disorders and perhaps halt the development of some of the most promising neuro technology-based treatments [35]. Consequently, without conceptual clarification and proper definitions of the relevant terms, the regulation of neuro data and rights may cause its own demise and sabotage its own goals by providing simultaneously too much and too little protection [33]. To adequately protect neuro data and related rights, relevant stakeholders must engage more deeply with one another to achieve conceptual clarity and ultimately enable proper implementation of such rights.

Naufel and Klein investigated how BCI researchers understood the relationship between neural data and BCI users, and what control they think individuals should have over their neural data. They found that 58% of BCI researchers endorsed giving research participants access to their raw neural data at the conclusion of a study but simultaneously felt that those individuals should be limited in their freedom to donate or sell such data [36]. Additionally, the majority of these researchers considered raw neural data to be a type of medical data and felt that existing laws and regulations do not provide adequate protection of consumer neural data privacy, despite many of these researchers being unfamiliar with the details of those laws and regulations.

2.2 Identity

Neuro technological devices that are implanted into a person’s brain have greater “read-write” capabilities and can typically sense and record information by measuring electrical activity from the brain [37]. Implanted neuro devices have also been shown to improve functionality in patients with movement disorders such as Parkinson’s disease, tetraplegia, and other severe, previously untreatable disorders of mobility and communication [38, 39, 40]. Clinically speaking, these technologies are restoring basic functions to patients, but unexpected changes to personality, identity, and decision-making, as well as positive and negative impacts on a personal sense of agency, have been reported [41]. To provide context to these reports, and to contemplate the future impact of these technologies, one must understand how they operate.

The first example of BCI technology that comes to mind is a device that provides a direct communication pathway between the brain’s electrical activity and an external device. This type of BCI is designed to restore or even enhance a person’s sensory mechanisms through which he or she perceives the world, including the muscular movements with which he or she reacts to it. Newer developments of this technology include devices that can repair or augment the way our brains form and access memories via implants. Also referred to as an artificial hippocampus or cognitive prosthesis, such a device is designed to restore a person’s ability to form new long-term memories, which is particularly useful to patients suffering from Alzheimer’s disease [42]. Future and improved designs of these kinds of technologies may enhance and supplement, as opposed to merely restore, normal cognitive function. Some even anticipate that future BCIs may have the capability to implant memories in a person’s mind of events they have never experienced [43].

Against this backdrop of predicted developments, the fundamental rights of people against the use of memory-enhancing BCIs may partially be regulated by the constitutional protection of “freedom of thought”, as debated in US courts and enshrined in the European Convention on Human Rights and the EU Charter of Fundamental Rights. Kolber interprets this right to include a right to “freedom of memory” that allows people to control the content of their memory, including when, and what parts of, their memory they share with others [44]. Over and above concerns about privacy, personal security, and data sharing, such technologies also pose unique ethical-legal questions about augmentation without therapeutic purpose, the enhancement of memory capacity, the artificial creation of memories, and the shaping or manipulation of memory content [43]. An investigation by Gilbert, Ienca, and Cook into the phenomenology of human-machine symbiosis during the first-in-human experimental BCI trial designed to predict epileptic seizures found that after BCI implantation the patient reported experiences of increased agential capacity and continuity, whilst after BCI explantation the patient reported persistent traumatic harms linked to agential discontinuity [45]. Because trials involving BCI implants and explants are still in their infancy, the impact of these devices’ presence, and of their later absence, on participants’ autonomy, identity, and sense of personal control must still be researched in depth.

2.3 Cybersecurity

Currently, there are no standards or specific protocols to guide the secure development of BCI technologies and applications, which poses a significant weakness in BCI software and its interaction with the hardware [46]. Without security mechanisms to ensure the integrity of transmitted data or users’ privacy, a range of problems may easily arise, such as the malfunction of the actions carried out by the BCI, the leaking of sensitive personal brain data, the absence of authentication mechanisms that can allow an attacker to impersonate a legitimate user, and the manipulation of BCI functionality with malicious data [47]. For example, the P300, one of the most well-known and widely used event-related potentials (ERPs) in brain recording, is a measured brain response that is the direct result of a specific sensory, cognitive, or motor event. ERP paradigms used in neuroscience, cognitive psychology, cognitive science, and psychophysiology can elicit reliable ERPs from participants upon exposure to many different sensory stimuli [48]. The timing of these responses has also been considered a measure of the timing of the brain’s communication or information processing. In this regard, the P300 related to the visualization of stimuli known to a person occurs between 250 and 500 ms after each known stimulus and presents as a positive signal peak. One of the most common ways of provoking this potential is the so-called oddball paradigm, which entails showing a series of known stimuli embedded within a more extensive set of unknown stimuli. At this stage, the captured electroencephalogram (EEG) and the labeling of the P300 leave the user vulnerable, a problem aggravated by the lack of frameworks that consider security aspects such as authentication, confidentiality, and data integrity, making it easy for attackers to carry out malicious actions via BCIs [49].
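
The following sketch illustrates the P300 logic described above: average the signal in the 250–500 ms post-stimulus window and flag a positive deflection as a detected P300, so that epochs following known (target) stimuli separate from those following unknown stimuli, which is also why labeled P300 responses can reveal what a user recognizes. The EEG is simulated, and the sampling rate, peak latency, amplitude, and detection threshold are illustrative assumptions rather than values from any specific BCI system.

```python
# Illustrative sketch only: detecting a P300-like response in epoched EEG.
# The signal is simulated; sampling rate, amplitudes, and threshold are
# assumptions, not values from any particular BCI system.
import numpy as np

fs = 250                       # sampling rate in Hz (assumed)
t = np.arange(0, 0.8, 1 / fs)  # 0-800 ms after stimulus onset
rng = np.random.default_rng(2)

def make_epoch(is_known_stimulus: bool) -> np.ndarray:
    """Simulate one single-channel EEG epoch, in microvolts."""
    noise = rng.standard_normal(t.size) * 2.0
    if is_known_stimulus:
        # Positive deflection peaking ~350 ms, as for a P300 to a known stimulus.
        noise += 8.0 * np.exp(-((t - 0.35) ** 2) / (2 * 0.05 ** 2))
    return noise

def p300_score(epoch: np.ndarray) -> float:
    """Mean amplitude in the 250-500 ms window, where the P300 is expected."""
    window = (t >= 0.25) & (t <= 0.50)
    return float(epoch[window].mean())

# Oddball-style stream: rare known (target) stimuli among frequent unknown ones.
labels = [True, False, False, False, True, False, False, False]
scores = [p300_score(make_epoch(known)) for known in labels]
threshold = 2.0  # microvolts; assumed decision threshold
for known, score in zip(labels, scores):
    detected = score > threshold
    print(f"known={known!s:5}  mean 250-500 ms amplitude={score:5.2f} uV  "
          f"P300 detected={detected}")
```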

These and other similarly complex and technical risks raised by the use of BCIs, and their possible implications for national security, have prompted the US to consider placing export controls on BCIs [50]. Zanjani and Ghazizadeh are of the opinion that, given the current state of affairs and the projected advances in BCI technologies, a “point of no return” may be reached if a holistic and forward-looking alternative solution is not devised and implemented soon [51]. They predict that the current tenets of the information technology (IT) sector will evolve to swamp the consumer BCI sector, resulting in an ecosystem in which patches and ad hoc fixes will no longer be able to restore long-lost end-user rights and ethical values. In this milieu, they propose that the ecosystem should instead be based on four principles: (1) openness; (2) modularity; (3) offline deployability; and (4) least privilege, to create a safer environment with effective protection of neuro privacy, neuro security, and agency whilst supporting high-performance BCI technologies.
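
As a purely illustrative example of the fourth principle, the toy sketch below shows a gateway that releases only narrowly scoped, derived outputs to applications that have been explicitly granted them, never the raw neural stream. Class and application names are hypothetical and are not drawn from Zanjani and Ghazizadeh’s proposal.

```python
# Toy illustration of the "least privilege" principle for BCI data access:
# each application is granted only the narrow signal-derived outputs it needs,
# never the raw neural stream. All names here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class BCIGateway:
    # Map of application name -> set of permitted derived outputs.
    grants: dict = field(default_factory=dict)

    def grant(self, app: str, outputs: set) -> None:
        self.grants[app] = set(outputs)

    def request(self, app: str, output: str) -> str:
        """Return a derived output only if the app was explicitly granted it."""
        if output not in self.grants.get(app, set()):
            raise PermissionError(f"{app!r} is not permitted to read {output!r}")
        return f"<{output} for {app}>"  # stand-in for the actual derived value

gateway = BCIGateway()
gateway.grant("wheelchair_controller", {"movement_intent"})

print(gateway.request("wheelchair_controller", "movement_intent"))  # allowed
try:
    gateway.request("ad_sdk", "raw_eeg")                            # denied
except PermissionError as error:
    print("Blocked:", error)
```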


3. Out with the old, in with the new?

Since the adoption of the UDHR, advancements in genetics, particularly in genome sequencing and gene editing, have forced the drafting of new declarations to protect human rights that became vulnerable to exploitation in light of these technological advancements. This, for example, prompted the United Nations Educational, Scientific and Cultural Organization (UNESCO) to issue the International Declaration on Human Genetic Data in 2003. This declaration ascribes a “special status” to human genetic data based on their sensitive nature because “they can be predictive of genetic predispositions concerning individuals and that the power of predictability can be stronger than assessed at the time of deriving the data” [52]. Premised on this special status, the declaration prescribes certain conditions for the legitimate use of human genetic data, which include free and informed consent, a prohibition against discrimination and stigmatization, and the protection of privacy and confidentiality. Although brain data can also be considered “special” as contemplated in the GDPR, brain data, in contrast, have no special or explicit protections or guarantees and lack comparable protection by human rights instruments.

Ironically, the use of so-called neuro data is not entirely new. In the trial of John Hinckley, who was prosecuted for the 1981 attempted assassination of President Reagan, the defense team offered a brain scan as evidence to show that Hinckley suffered from schizophrenia, as indicated by an atrophied area in his brain [53]. J. Sherrod Taylor, who is considered the originator of the term neuro law, advocated for the creation of neuro law as a new medical jurisprudence as far back as 1995, a concept closely associated with the dynamic development of neuroscience and supported by the development of functional magnetic resonance imaging (fMRI) in the 1990s [54]. Similar to the development of disciplines such as neuro-theology, neuro-economics, neuro-ethics (2000), and neuro-philosophy (1986), neuro law emerged as a discipline in which knowledge about brain functions is applied to provide insights into criminal and antisocial behavior, an approach that has become known as neurocriminology. By questioning fundamental issues of determinism and free will, for example, neuro law has contributed to the development of modern-day notions of crime and punishment.

Apart from being incompletely protected under current privacy laws, neuro data are also not protected by any international humanitarian law or international treaty against dual use or the potential weaponization of neuro technologies for military purposes. As with legislation that specifically deals with human genetic data, specific disarmament treaties exist for biological weapons, such as the 1972 Biological Weapons Convention (BWC), which regulates the weaponization of biological knowledge and technologies, including biological and toxin-based weapons, by prohibiting their development, production, acquisition, transfer, stockpiling, and use. Except for the provisions contained in the Chilean Constitution, no specific regulations exist for neuro data. This problem grows increasingly complex, considering that advancements in neuro technology and its applications are quickly outpacing international legal protections, revealing serious shortcomings in existing international human rights law [55]. These rapid technological advancements also challenge traditional methods, and the standard regulation, of collecting, accessing, sharing, and manipulating information received from a human brain. In turn, these challenges force us to rethink the relationship between neuroscience and human rights in an effort to ensure adequate protection and prevent unintended consequences. Ienca and Andorno scrutinized the relationship between human rights and neuroscience and identified four new types of human rights that will become increasingly relevant in the future: (1) the right to cognitive liberty; (2) the right to mental privacy; (3) the right to mental integrity; and (4) the right to psychological continuity [56].

Up to now, thoughts have been considered inherently private, but neuro technologies are changing this situation to the extent that users of BCIs may in future develop meta-cognition [57]. This may cause prospective users of BCIs to consciously change their inner life and limit the scope of their thoughts out of fear of exposure, which will necessarily impact their personality, sense of identity, cognitive capabilities, intelligence, creativity, and fantasies [58]. Based on these anticipated technological intrusions, the concept of freedom of thought has been promoted by legal scholars as a further expansion of the idea of cognitive liberty, defined by Sententia as the “freedom of thought” to control, monitor, and manipulate our own cognitive functions [59]. Although this right aims to protect people from undesired influences, more specifically from interference by the state and third parties, it also entails the right to alter one’s mental states with the help of neuro tools as well as to refuse to do so [60].

From the privacy perspective, Finn, Wright, and Friedewald identified seven types of privacy that need to be considered when people interact with new and emerging technologies: (1) privacy of the person; (2) privacy of behavior and action; (3) privacy of personal communication; (4) privacy of data and image; (5) privacy of thought; (6) privacy of location and space; and (7) privacy of association [61]. In addition, Tong and Pratte have predicted that mental privacy will be increasingly challenged in future, in the absence of any legal precedent dealing with the reality of a technology that can look into the mind of another human being [62]. Such technologies, and their evolution into even more sophisticated technologies with even greater capabilities, will offer many more and different ways of violating a person’s mental privacy, a situation that clearly necessitates sui generis legal protection.

At the moment, a person’s right to both physical and mental integrity enjoys protection under Article 3 of the EU Charter of Fundamental Rights, which states that “everyone has the right to respect for his or her physical and mental integrity” [63]. This separation between physical and mental integrity was based on Descartes’ philosophy of body-mind dualism, which viewed the body and mind as distinct and separable. Almost all legal systems have embraced this philosophy and embedded this dualism into their laws, which has resulted in the systematic protection of bodies and brains, as opposed to minds and mental states. This dualism was never questioned when most (if not all) legislation originated. Mental integrity has predominantly been considered a mental health issue and approached from a psychological or psychiatric perspective [63]. Emerging neuro technologies have now disrupted this school of legislative thought, and legal scholars are advocating not only for an expansion of the existing protection of mental integrity under the auspices of mental health, but also for the creation of more targeted legal protection of people’s mental sphere from harm [60].

Brain-Computer Interfaces may also directly cause unintended changes to a person’s mental state, which in turn will influence that person’s personality, identity, and consequently also his or her autonomy. A right to psychological continuity could provide specific normative protection against potential neuro technology-enabled interventions and any unauthorized modification of a person’s neural computation [60].

In combination, the aim of these novel or sui generis human rights is to protect a person’s identity and brain functions from unwanted external influences and to protect people from the possible abuse of technologies that may manipulate neural activity. I thus agree with the finding of Ienca and Andorno that existing human rights are necessary but may not be normatively sufficient, or sufficiently agile, to respond to the emerging issues raised by neuro technology, especially in view of the discussions above [55]. The current situation requires an urgent reconceptualization of certain human rights, and the creation of new rights, to protect people from potential future harm [64]. These growing calls for so-called special brain rights have led to several initiatives across the globe, such as the Neuro Rights Initiative formed in the US in 2019, the issuing of the Digital Rights Charter in Spain, which includes specific neuro technology provisions, and the proposed amendment to Brazil’s General Personal Data Protection Law (2018) to protect neuro data [65, 66, 67, 68, 69]. The common denominator among most advocates for neuro rights is the belief that neuro data qualify as a special category of information that is inextricably connected to a person’s identity and agency, which serve as the basis for all other fundamental rights [70].

These initiatives and joint deliberations with leading international scholars and policymakers led to the approval by the Chilean Congress, in December 2020, of a reform of Article 19 of the Chilean Constitution to include the right to neuroprotection [71], a constitutional amendment that was signed into law by the president in October 2021 [72]. Following this constitutional development, the Chilean legislature developed a neuroprotection bill to implement Article 19 and related rights, including mental privacy, personal integrity, self-determination, and equal access to enhancing neuro technologies; the bill was approved by the Senate in December 2021 and is currently being considered by the House of Representatives [73].

This step by the Chilean Congress was a giant leap for the global acknowledgement of neuro rights but was met with diverse reactions. Some scholars questioned the need for constitutional reform and/or a neuroprotection bill [74, 75], whilst others were largely supportive of regulation to address the challenges stemming from neuro technologies, despite their reservations about the need for constitutional reform [76, 77]. Regardless of these civil and scholarly debates around constitutional reform and the necessity of a neuro rights bill, there seems to be consensus that the neuroprotection bill would benefit from further legal and conceptual clarification prior to implementation. For example, the latest version of the bill only provides for neuro rights in the context of healthcare and completely omits commercial or other uses of neuro technologies. As discussed above, many people may use BCIs as consumers outside the healthcare setting and require equal protection of their sensitive neuro data against inferences derived from collected neuro data that may negatively impact or limit users’ experience. To develop effective and realistic regulations in this regard, the private sector, which is mainly responsible for developing many of these technologies, must be engaged to actively participate in co-creating a neuro rights framework that leaves room for ethical technology innovation in the non-medical space. In addition to the private sector, citizens and community members who will be subject to these new rights must also be involved or consulted. Public participation in conversations about neuro rights can inform key notions about issues such as integrity and harm in the context of neuro technology and neuro data, and can influence the criteria used to assess the potential risks and benefits of having additional “neurorights” and whether individuals should be free to choose or reject these new rights [78]. In this regard, many documents [79, 80, 81] derive their principles and recommendations from Responsible Research and Innovation (RRI), a science policy framework popularized in the European Union (EU) that emphasizes the socially, ethically, and politically interconnected nature of science and technology and dictates that science policy must engage with, and be informed by, communities for communities. Although the goal of each guiding document is to benefit the public, each describes different strategies for achieving this: some focus on the public interest, others on scientific enterprise and the field of medicine, and still others on protecting the public from potential harms [82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92].


4. Conclusion

The recent flurry of ethical and legal debates on neuro rights is not only evolving in response to rapid developments in the fields of neuroscience and neuro technologies, but is also an indication of the inherent fears and concerns that people harbor about the unprecedented impacts these scientific and technological developments may have at both an individual and a societal level. To create an environment of trust in which these technologies can advance to benefit humankind, an urgent clarification of the concepts and terminology relating to neuroscience and neuro technologies is required to enable the development of neuro rights governance proposals. Considering the increasingly advanced capabilities of BCIs, it is abundantly clear that they present humankind with invasive abilities never encountered before, which begs the creation of new or sui generis human rights such as (1) the right to cognitive liberty; (2) the right to mental privacy; (3) the right to mental integrity; and (4) the right to psychological continuity. Mere stricter enforcement of existing fundamental rights, or even the liberal interpretation of existing human rights, is simply not sufficient to cover the unique spaces that new neuro technologies are entering. The critical mass that neuro rights reached in Chile, and the government’s response by implementing constitutional changes, may very well pave the way for future neuro rights regulatory frameworks in the rest of the world, much as the EU GDPR served as a guide for the development of data privacy legislation across the globe. Even the UN has expressed its interest in considering “frontier issues” such as human rights governance related to digital and technology spaces, including neuro technology, in the recently released Common Agenda [92]. The world is paying attention and the technologies are soaring. The time to carefully consider their consequences for humanity, at both the individual and the population level, is now.


Acknowledgments

This work has been funded by the Luxembourg National Research Fund (FNR), IS/14717072 “Deceptive Patterns Online (Decepticon)”, and by the European Union’s Horizon 2020 Innovative Training Networks, Legality Attentive Data Scientists (LEADS), under Grant Agreement ID 956562.


Conflict of interest

The author declares no conflict of interest.

References

  1. Lebedev MA, Nicolelis MAL. Brain-machine interfaces: From basic science to neuroprostheses and neurorehabilitation. Physiological Reviews. 2017;97(2):767-837. DOI: 10.1152/physrev.00027.2016
  2. Edwards C, Kouzani A, Lee KH, Ross EK. Neurostimulation devices for the treatment of neurologic disorders. Mayo Clinic Proceedings. 2017;92(9):1427-1444. DOI: 10.1016/j.mayocp.2017.05.005
  3. Thorpe S. Majority of Readers Believe Artificial Intelligence is Developing Too Fast [Internet]. 2023. Available from: https://www.telegraph.co.uk/news/2023/05/19/artificial-intelligence-developing-too-fast-telegraph/ [Accessed: May 30, 2023]
  4. Smith N. How AI is Learning to Read the Human Mind [Internet]. 2023. Available from: https://www.telegraph.co.uk/global-health/science-and-disease/ai-artificial-intelligence-brain-research/ [Accessed: May 30, 2023]
  5. Field M. We have Put the World in Danger with Artificial Intelligence, Admits ChatGPT Creator [Internet]. 2023. Available from: https://www.telegraph.co.uk/business/2023/05/16/chatgpt-creator-sam-altman-admits-world-in-danger/ [Accessed: May 30, 2023]
  6. Simon AJ, Bernstein A, Hess T, Ashrafiuon H, Devilbiss D, Verma A. P1-112: A brain computer interface to detect Alzheimer’s disease. Alzheimer’s & Dementia. 2011;7:S145-S146. DOI: 10.1016/j.jalz.2011.05.391
  7. Wang Y, Liu S, Wang H, et al. Neuron devices: Emerging prospects in neural interfaces and recognition. Microsystems & Nanoengineering. 2022;8:128. DOI: 10.1038/s41378-022-00453-4
  8. Rabbani Q, Milsap G, Crone NE. The potential for a speech brain–computer interface using chronic electrocorticography. Neurotherapeutics. 2019;16:144-165. DOI: 10.1007/s13311-018-00692-2
  9. Al-Nuaimi FA, Al-Nuaimi RJ, Al-Dhaheri SS, Ouhbi S, Belkacem AN. Mind drone chasing using EEG-based brain computer interface. In: 16th International Conference on Intelligent Environments (IE); Madrid, Spain. 2020. pp. 74-79. DOI: 10.1109/IE49459.2020.9154926
  10. Wang C, Wu X, Wang Z, Ma Y. Implementation of a brain-computer interface on a lower-limb exoskeleton. IEEE Access. 2018;6:38524-38534. DOI: 10.1109/ACCESS.2018.2853628
  11. Binnendijk A, Marler T, Bartels EM. Brain-Computer Interfaces: U.S. Military Applications and Implications, An Initial Assessment. Santa Monica, CA: RAND Corporation; 2020. Available from: https://www.rand.org/pubs/research_reports/RR2996.html [Accessed: May 30, 2023]
  12. Milligan K, Balwani A, Dyer E. Brain mapping at high resolutions: Challenges and opportunities. Current Opinion in Biomedical Engineering. 2019;12:126-131. DOI: 10.1016/j.cobme.2019.10.009
  13. Saboor A, Gembler F, Benda M, Stawicki P, Rezeika A, Grichnik R, et al. A browser-driven SSVEP-based BCI web speller. IEEE International Conference on Systems, Man, and Cybernetics (SMC). 2018;2018:625-630. DOI: 10.1109/SMC.2018.00115
  14. Pais-Vieira M, Lebedev M, Kunicki C, Wang J, Nicolelis MAL. A brain-to-brain interface for real-time sharing of sensorimotor information. Scientific Reports. 2013;3(1):1319. DOI: 10.1038/srep01319
  15. United Nations Universal Declaration of Human Rights (UDHR) [Internet]. 1948. Available from: https://www.un.org/sites/un2.un.org/files/2021/03/udhr.pdf [Accessed: August 27, 2023]
  16. International Covenant on Civil and Political Rights (ICCPR) [Internet]. 1966. Available from: https://www.ohchr.org/en/instruments-mechanisms/instruments/international-covenant-civil-and-political-rights [Accessed: August 27, 2023]
  17. Ienca M. On neurorights. Frontiers in Human Neuroscience. 2021;15:701258. DOI: 10.3389/fnhum.2021.701258
  18. Council of Europe. Common human rights challenges raised by different applications of neurotechnology in the biomedical field [Internet]. 2021. Available from: https://rm.coe.int/report-final-en/1680a429f3 [Accessed: August 27, 2023]
  19. Chile Constitution (rev. 2021). 1980. Available from: https://www.constituteproject.org/constitution/Chile_2021?lang=en
  20. United Nations. Meeting Summaries [Internet]. 2022. Available from: https://www.ungeneva.org/en/news-media/meeting-summary/2022/10/le-conseil-des-droits-de-lhomme-adopte-dix-neuf-resolutions-une [Accessed: May 31, 2023]
  21. O’Shaughnessy M, Johnson WG, Tournas L, Rozell C, Rommelfanger K. Neuroethics guidance documents: Principles, analysis, and implementation strategies. SSRN. 2022;4035992. DOI: 10.2139/ssrn.4035992
  22. Ahlgrim NS et al. Prodromes and preclinical detection of brain diseases: Surveying the ethical landscape of predicting brain health. eNeuro. 2019;2019:1-11
  23. Pustilnik AC. The Perils of Opening the Mind. Boston Globe [Internet]. 2020. Available from: www.bostonglobe.com/2020/02/23/opinion/perils-opening-mind [Accessed: May 31, 2023]
  24. Purcell RH, Rommelfanger KS. Internet-based brain training games, citizen scientists, and big data: Ethical issues in unprecedented virtual territories. Neuron. 2015;86:356-359. DOI: 10.1016/j.neuron.2015.03.044
  25. Rommelfanger KS, Pustilnik A, Salles A. Mind the gap: Lessons learned from neurorights. Science and Diplomacy [Internet]. 2022. DOI: 10.1126/scidip.ade6797. Available from: https://www.sciencediplomacy.org/article/2022/mind-gap-lessons-learned-neurorights [Accessed: August 27, 2023]
  26. Ienca M. Common human rights challenges raised by different applications of neuro technologies in the biomedical field. Report commissioned by the Committee on Bioethics (DH-BIO) of the Council of Europe [Internet]. 2021. Available from: https://rm.coe.int/report-final-en/1680a429f3 [Accessed: May 31, 2023]
  27. Schwarz N, Uysal B, Welzer M, Bahr JC, Layer N, Löffler H, et al. Long-term adult human brain slice cultures as a model system to study human CNS circuitry and disease. eLife. 2019;8:e48417. DOI: 10.7554/eLife.48417
  28. Kohnstamm J. Advice paper on special categories of data (“Sensitive Data”). Brussels: Article 29 Data Protection Working Party [Internet]. 2011. Available from: https://ec.europa.eu/justice/article-29/documentation/other-document/files/2011/2011_04_20_letter_artwp_mme_le_bail_directive_9546ec_annex1_en.pdf [Accessed: May 31, 2023]
  29. Rainey S, McGillivray K, Akintoye S, Fothergill T, Bublitz C, Stahl B. Is the European data protection regulation sufficient to deal with emerging data concerns relating to neurotechnology? Journal of Law and the Biosciences. 2020;7(1):lsaa051. DOI: 10.1093/jlb/lsaa051
  30. Srivastava G, Bag S. Modern-day marketing concepts based on face recognition and neuro-marketing: A review and future research directions. Benchmarking: An International Journal [Internet]. 2023. Available from: https://www.emerald.com/insight/content/doi/10.1108/BIJ-09-2022-0588/full/html [Accessed: August 27, 2023]
  31. Genetic Information Nondiscrimination Act (GINA) [Internet]. 2014. Available from: https://www.genome.gov/genetics-glossary/Genetic-Information-Nondiscrimination-Act [Accessed: August 27, 2023]
  32. General Data Protection Regulation (GDPR), Article 5(1)(b) [Internet]. 2016. Available from: https://gdpr-info.eu/ [Accessed: August 27, 2023]
  33. Akmazoglu TB, Chandler JA. Mapping the emerging legal landscape for neuroprostheses: Human interests and legal resources. In: Hevia M, editor. Regulating Neuroscience: Transnational Legal Challenges. Cambridge, MA: Academic Press; 2021. pp. 63-98
  34. Ruiz S, Ramos-Vergara P, Concha R, Altermatt F, Von-Bernhardi R, Cuello M, et al. Negative effects of the patients’ rights law and neuro-rights bill in Chile. Revista Médica de Chile. 2021;149(3):439-446. DOI: 10.4067/s0034-98872021000300439
  35. Lopez-Silva P, Madrid R. Protecting the mind: An analysis of the concept of the mental in the neurorights law. Revista De Humanidades De Valparaíso. 2022;20:101-117. DOI: 10.22370/rhv2022iss20pp101-117
  36. Naufel S, Klein E. Brain–computer interface (BCI) researcher perspectives on neural data ownership and privacy. Journal of Neural Engineering. 2020;17(1):016039. DOI: 10.1088/1741-2552/ab5b7f
  37. Drew L. The ethics of brain-computer interfaces. Nature. 2019;571:S19-S21. DOI: 10.1038/d41586-019-02214-2
  38. McFarland DJ et al. Therapeutic applications of BCI technologies. Brain-Computer Interfaces. 2017;4:37-52. DOI: 10.1080/2326263x.2017.1307625
  39. Wolpaw JR et al. Independent home use of a brain-computer interface by people with amyotrophic lateral sclerosis. Neurology. 2018;91:e258-e267. DOI: 10.1212/wnl.0000000000005812
  40. Willett FR et al. High-performance brain-to-text communication via handwriting. Nature. 2021;2021:249-254. DOI: 10.1038/s41586-021-03506-2
  41. Mridha MF, Das SC, Kabir MM, Lima AA, Islam MR, Watanobe Y. Brain-computer interface: Advancement and challenges. Sensors (Basel). 2021;21(17):5746. DOI: 10.3390/s21175746
  42. Berger TW, Song D, Chan RH, Marmarelis VZ, LaCoss J, Wills J, et al. A hippocampal cognitive prosthesis: Multi-input, multi-output nonlinear modeling and VLSI implementation. IEEE Transactions on Neural Systems and Rehabilitation Engineering. 2012;2012:198-211. DOI: 10.1109/TNSRE.2012.2189133
  43. Blitz MJ, Barfield W. Memory enhancement and brain–computer interface devices: Technological possibilities and constitutional challenges. In: Dubljević V, Coin A, editors. Policy, Identity, and Neurotechnology. Advances in Neuroethics. Cham: Springer; 2023. DOI: 10.1007/978-3-031-26801-4_12
  44. Kolber AJ. Freedom of Memory Today. Neuroethics. 2008;1:145. San Diego Legal Studies Paper No. 09-010. Available from: https://ssrn.com/abstract=1374026
  45. Gilbert F, Ienca M, Cook M. How I became myself after merging with a computer: Does human-machine symbiosis raise human rights issues? Brain Stimulation. 2023;16(3):783-789. DOI: 10.1016/j.brs.2023.04.016
  46. Beltrán ETM, Pérez MQ, Bernal SL, Celdrán AH, Pérez GM. SecBrain: A framework to detect cyberattacks revealing sensitive data in brain-computer interfaces. In: Advances in Malware and Data-Driven Network Security [Internet]. 2022. Available from: https://www.igi-global.com/chapter/secbrain/292237 [Accessed: August 27, 2023]
  47. Wilson JA, Guger C, Schalk G. BCI hardware and software. In: Wolpaw J, Wolpaw EW, editors. Brain–Computer Interfaces: Principles and Practice. Oxford: Oxford University Press; 2012. DOI: 10.1093/acprof:oso/9780195388855.003.0009
  48. McCormick B. Your thoughts may deceive you: The constitutional implications of brain fingerprinting technology and how it may be used to secure our skies. Law & Psychology Review. 2006;30:171-184
  49. Luck SJ, Kappenman ES, editors. The Oxford Handbook of Event-Related Potential Components. London, United Kingdom: Oxford University Press; 2012:664. ISBN 9780195374148
  50. Borman M. Request for comments concerning the imposition of export controls on certain brain-computer interface (BCI) emerging technology. Federal Register. 2021;86:59070-59073
  51. Zanjani SM, Ghazizadeh A. Reclaiming the utopia: Alternate ecosystems for safeguarding human rights in the high-performance brain machine interface era. 2021. DOI: 10.13140/RG.2.2.14937.44644
  52. UNESCO. International Declaration on Human Genetic Data [Internet]. 2023. Available from: https://www.unesco.org/en/legal-affairs/international-declaration-human-genetic-data?hub=66535 [Accessed: June 1, 2023]
  53. Dartigues L. An irresistible ascent? The neurolaw and its critics. Zilsel. 2018;1(3):63-103. Available from: https://www.cairn.info/revue-zilsel-2018-1-page-63.htm?try_download=1 [Accessed: June 1, 2023]
  54. Taylor JS. Neurolaw: Towards a new medical jurisprudence. Brain Injury. 1995;9:745-751. DOI: 10.3109/02699059509008230
  55. Ienca M, Andorno R. Towards new human rights in the age of neuroscience and neurotechnology. Life Sciences, Society and Policy. 2017;13:1-27. DOI: 10.1186/s40504-017-0050-1
  56. Weinberger S, Greenbaum D. Are BMI prosthetics uncontrollable Frankensteinian monsters? Brain-Computer Interfaces. 2016;3(3):149-155. DOI: 10.1080/2326263X.2016.1207495
  57. Fleur DS, Bredeweg B, van den Bos W. Metacognition: Ideas and insights from neuro- and educational sciences. npj Science of Learning. 2021;6:13. DOI: 10.1038/s41539-021-00089-5
  58. Krausová A. Legal aspects of brain-computer interfaces. Masaryk University Journal of Law and Technology. 2014;8(2):199-208. Available from: https://journals.muni.cz/mujlt/article/view/2655
  59. Sententia W. Neuroethical considerations: Cognitive liberty and converging technologies for improving human cognition. Annals of the New York Academy of Sciences. 2004;1013:221-228. DOI: 10.1196/annals.1305.014
  60. Bublitz JC. My Mind is Mine!? Cognitive Liberty as a Legal Concept. In: Hildt E, Francke A, editors. Cognitive Enhancement. Mainz, Germany: Springer; 2013. Chapter 19. pp. 233-264
  61. Finn RL, Wright D, Friedewald M. Seven types of privacy. In: Gutwirth S, Leenes R, de Hert P, Poullet Y, editors. European Data Protection: Coming of Age. Dordrecht: Springer; 2013. DOI: 10.1007/978-94-007-5170-5_1
  62. Tong F, Pratte MS. Decoding patterns of human brain activity. Annual Review of Psychology. 2012;63:483-509. DOI: 10.1146/annurev-psych-120710-100412
  63. European Commission. Charter of Fundamental Rights of the European Union (2012/C 326/02). Official Journal of the European Union [Internet]. 2012. Available from: https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:12012P/TXT [Accessed: August 27, 2023]
  64. Ienca M. The right to cognitive liberty. Scientific American. 2017;317(2):10. DOI: 10.1038/scientificamerican0817-10
  65. Bublitz JC, Merkel R. Crimes against minds: On mental manipulations, harms, and a human right to mental self-determination. Criminal Law and Philosophy. 2014;8:51-77. DOI: 10.1007/s11572-012-9172-y
  66. Yuste R et al. Four ethical priorities for neurotechnologies and AI. Nature. 2017;551:159-163. DOI: 10.1038/551159a
  67. Lavazza A. Freedom of thought and mental integrity: The moral requirements for any neural prosthesis. Frontiers in Neuroscience. 2018;12:1-10. DOI: 10.3389/fnins.2018.00082
  68. Sommaggio P, Mazzocca M. Cognitive liberty and human rights. In: D’Aloia A, Errigo MC, editors. Neuroscience and Law: Complicated Crossings and New Perspectives. Cham: Springer International Publishing; 2020. pp. 95-111
  69. Gobierno de España. Carta de Derechos Digitales. 2021. Available from: www.lamoncloa.gob.es/presidente/actividades/Documents/2021/140721-Carta_Derechos_Digitales_RedEs.pdf
  70. Bublitz JC. Novel neurorights: From nonsense to substance. Neuroethics. 2022;15:7. DOI: 10.1007/s12152-022-09481-3
  71. Senado, Republica de Chile, Boletin 13.827-19 (2020), Modifica el Artículo 19, número 1°, de la Carta Fundamental, para Proteger la Integridad y la Indemnidad Mental con Relación al Avance de las Neurotecnologías [Internet]. 2020. Available from: https://www.diarioconstitucional.cl/wp-content/uploads/2021/09/Boletin-13827-19-neuroderechos.pdf [Accessed: August 27, 2023]
  72. Ministerio de Ciencia, Tecnología, Conocimiento e Innovación. Ley 21383. Modifica la Carta Fundamental, para Establecer el Desarrollo Científico y Tecnológico al Servicio de las Personas [Internet]. 2021. Available from: https://www.bcn.cl/leychile/navegar?idNorma=1166983 [Accessed: August 27, 2023]
  73. Senado, Republica de Chile, Boletin 13.828-19 (2020), Sobre Protección de los Neuroderechos y la Integridad Mental, y el Desarrollo de la Investigación y las Neurotecnologías [Internet]. 2020. Available from: https://www.senado.cl/noticias/neuroderechos/neuroderechos-aprueban-ideas-matrices-destacando-la-dignidad-humana [Accessed: August 27, 2023]
  74. Fajuri AZ et al. ¿Neuroderechos? Razones para no Legislar. Ciper Académico [Internet]. 2020. Available from: https://www.ciperchile.cl/2020/12/11/neuroderechos-razones-para-no-legislar/ [Accessed: August 27, 2023]
  75. Ruiz S et al. Efectos Negativos en la Investigación y el Quehacer Médico en Chile de la Ley 20.584 y la Ley de Neuroderechos en Discusión: La Urgente Necesidad de Aprender de Nuestros Errores. Revista Médica de Chile. 2021;149:439-446
  76. López-Silva P. Ley de Neuroderechos, el Concepto de la Mente y el Escenario de la Investigación en Neurociencias. El Mostrador [Internet]. 2021. Available from: https://www.elmostrador.cl/noticias/opinion/columnas/2021/06/08/ley-de-neuroderechos-el-de-concepto-de-la-mente-y-el-escenario-de-la-investigacion-en-neurociencias [Accessed: August 27, 2023]
  77. López-Silva P, Madrid R. Sobre la Conveniencia de Incluir los Neuroderechos en la Constitución o en la Ley. Revista Chilena de Derecho y Tecnología. 2021;10:53-76. DOI: 10.5354/0719-2584.2021.56317
  78. Inglese S, Lavazza A. What should we do with people who cannot or do not want to be protected from neurotechnological threats? Frontiers in Human Neuroscience. 2021;15:1-6. DOI: 10.3389/fnhum.2021.703092
  79. Nuffield Council on Bioethics. Novel Neurotechnologies: Intervening in the Brain [Internet]. 2013. Available from: https://www.nuffieldbioethics.org/publications/neurotechnology [Accessed: August 27, 2023]
  80. OECD. Recommendation of the Council on Responsible Innovation in Neurotechnology. OECD/LEGAL/0457 [Internet]. 2019. Available from: https://www.oecd.org/science/emerging-tech/responsible-innovation-in-neurotechnology.pdf [Accessed: August 27, 2023]
  81. Ienca M et al. Towards a Governance Framework for Brain Data. arXiv: 2109.11960 [cs, q-bio]. 2021. DOI: 10.48550/arXiv.2109.11960
  82. Global Brain Workshop. Grand Challenges for Global Brain Sciences [Internet]. 2016. Available from: https://papers.cnl.salk.edu/PDFs/Grand%20Challenges%20for%20Global%20Brain%20Sciences%202017-4507.pdf [Accessed: August 27, 2023]
  83. The Japanese Neuroscience Society. Guidelines for ethics-related problems with non-invasive research on human brain function [Internet]. 2019. Available from: https://www.jnss.org/en/human_ethic?u=b4629b99001a9c9f7eb938f857dc2943 [Accessed: August 27, 2023]
  84. CeReB: The Center for Responsible Brainwave Technology. The Ethics of Brain Wave Technology [Internet]. 2014. Available from: https://static1.squarespace.com/static/5344501be4b0d532fc42e22f/t/5390ceece4b0fe2199de93cc/1401999084766/the+ethics+of+brainwave+technology.pdf [Accessed: August 27, 2023]
  85. Nuttin B et al. Consensus on guidelines for stereotactic neurosurgery for psychiatric disorders. Journal of Neurology, Neurosurgery & Psychiatry. 2014;85:1003-1008. DOI: 10.1136/jnnp-2013-306580
  86. U.S. Presidential Commission for the Study of Bioethical Issues. Gray Matters: Topics at the Intersection of Neuroscience, Ethics, and Society. 2014;1
  87. U.S. Presidential Commission for the Study of Bioethical Issues. Gray Matters: Topics at the Intersection of Neuroscience, Ethics, and Society. 2015;2
  88. Rommelfanger KS et al. Neuroethics questions to guide ethical research in the international brain initiatives. Neuron. 2018;100:19-36. DOI: 10.1016/j.neuron.2018.09.021
  89. Greely HT et al. Neuroethics guiding principles for the NIH BRAIN initiative. The Journal of Neuroscience. 2018;8:10586-10588. DOI: 10.1523/JNEUROSCI.2077-18.2018
  90. The Royal Society. iHuman: Blurring Lines between Mind and Machine. 2019
  91. NIH Advisory Committee to the Director (ACD), Working Group on BRAIN 2.0 Neuroethics Subgroup (BNS). The BRAIN Initiative and Neuroethics: Enabling and Enhancing Neuroscience Advances for Society. 2019
  92. United Nations. Our Common Agenda: Report of the Secretary-General. 2021. Available from: www.unep.org/resources/report/our-common-agenda-report-secretary-general
