Neurotechnology is developing so fast that the ethical concerns it raises can no longer be neglected. Facebook has announced that it is working on a brain-computer interface (BCI) that would let users post directly from their minds (Cappella, 2017). Elon Musk has said that he aims to develop BCIs that make telepathic communication a reality (Begley, 2017). In a recent study at the University of Toronto Scarborough, researchers succeeded in reconstructing the image of a face a person was perceiving by analysing brain activity and identifying specific patterns, and they are now trying to extract the patterns of words and text. "Development of mind-reading technologies seems a matter of time, not a matter of if," says Adrian Nestor, a co-author of the study (Carr, 2018). The prospect of using these technologies outside medical settings, beyond helping people with disabilities, has been a source of concern. Although these advances have the potential to help people with many conditions and to boost the capacities of the human mind, their power to manipulate is intimidating. What would it be like to walk down the street knowing you are not the only person with access to your mental life? How would that change human character and identity?
Autonomy and agency are one area where these technologies raise ethical concerns. The combination of mind-reading technologies and AI could produce autofill systems that recognize a person's intentions and desires and automatically bridge the gap between thought and action; devices that can be controlled telepathically are one example (Begley, 2017; Gent, 2017). Scientists in Germany have designed sensors that precisely recognize a user's intended motion commands and have used this system to communicate with robots that work as assistants. The possibility of mind manipulation and memory implantation through brain-stimulation systems further complicates the question of agency. Cognitive-enhancement devices that act semi-autonomously also raise concerns about personal identity (Begley, 2017), and they bring up arguments about discrimination and accessibility, since such augmentation technologies can improve mental and sensory abilities for those who can obtain them (Gent, 2017).
Privacy and consent are the most obvious issues raised by these advances. Advertisers could exploit people using information extracted from their brains about their mental states. Google and Disney are already using neuromarketing to understand their customers' preferences and reactions to advertisements (Ienca & Andorno, 2017). The proliferation of neurodevices for purposes such as self-monitoring and meditation encourages people to share this information in the same way they share their photos and other personal data.
Cognitive liberty is one of the proposed neurorights; it would prevent political and industrial actors from forcing people to use these technologies without their consent. Mental privacy would let individuals decide when and how their mental information is shared. Mental integrity would protect people against unconsented manipulation, including advertisements designed to persuade users and change their behaviour in the advertiser's interest (Ienca & Andorno, 2017).
In 2017, a group of representatives, including researchers involved in neurotechnology studies and experts from technology companies investing in these systems, met to discuss the ethical aspects of neurotechnology and published the result of their debate in the journal Nature. The authors of this report assert that neurorights should be enshrined in an international agreement or in the Universal Declaration of Human Rights. They argued that banning the use and development of these technologies is not a solution, because bans only push activity underground; instead, educating people about the possible consequences of using and adopting these technologies leads to wiser decisions at both the personal and governmental levels (Gent, 2017).
References:
Begley, S. (2017, November 8). Brain technologies pose threats that are all too real, experts warn. Retrieved from https://www.statnews.com/2017/11/08/brain-technologies-ethics/
Cappella, N. (2017, April 27). Researchers advocate neurorights to protect brain data. Retrieved from https://thestack.com/big-data/2017/04/27/researchers-advocate-neurorights-to-protect-brain-data/
Carr, D. (2018, March). The mind-blowing future of mind reading. Retrieved from http://www.cbc.ca/radio/thecurrent/the-current-for-march-15-2018-1.4576392/the-mind-blowing-future-of-mind-reading-which-may-be-closer-than-you-think-1.4576518
Gent, E. (2017, December 6). Scientists call out ethical concerns for the future of neurotechnology. Retrieved from https://singularityhub.com/2017/11/21/scientists-lay-out-urgent-ethical-concerns-for-the-future-of-neurotechnology/#sm.0001j4auoohb6ej4qkn26fuom4p3u
Ienca, M., & Andorno, R. (2017, April 26). A new category of human rights: neurorights. Retrieved from https://blogs.biomedcentral.com/bmcblog/2017/04/26/new-category-human-rights-neurorights/