Neuralink, a venture founded by Elon Musk, aims to implant electrodes into the brain for direct computer interfacing, with the stated goal of keeping humans economically viable in the digital age.
These technologies would directly tap the brain to “read out” thoughts, improve memory, and help with decision-making. Sharing “full sensory and emotional experiences” online may be the next big thing in social networking.
Prosthetic devices to replace damaged brain regions may become available.
Currently available are external devices that combine EEG with clever algorithms to provide a "mouse" for the brain (e.g., Neurable), as well as medical implants that allow the deaf to hear and the paralyzed to move.
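As a rough illustration of the EEG-plus-algorithms idea, here is a minimal sketch (not Neurable's actual method; the sampling rate, target band, and threshold are illustrative assumptions) that measures power in the alpha band of a single-channel trace and maps a strong rhythm to a mouse-like "click" event:

```python
# Minimal sketch of an EEG "brain mouse": measure power near 10 Hz
# (alpha band) and map strong power to a binary control event.
# All parameters here are illustrative assumptions, not any vendor's design.
import math

def band_power(samples, fs, freq):
    """Power at one frequency via a single DFT bin (Goertzel-style)."""
    n = len(samples)
    re = sum(s * math.cos(2 * math.pi * freq * i / fs)
             for i, s in enumerate(samples))
    im = sum(-s * math.sin(2 * math.pi * freq * i / fs)
             for i, s in enumerate(samples))
    return (re * re + im * im) / n

def detect_intent(samples, fs=256, freq=10.0, threshold=5.0):
    """Map strong alpha-band power to a 'click'; otherwise stay idle."""
    return "click" if band_power(samples, fs, freq) > threshold else "idle"

# Synthetic one-second traces: a clean 10 Hz rhythm vs. a flat baseline.
fs = 256
alpha = [math.sin(2 * math.pi * 10 * i / fs) for i in range(fs)]
flat = [0.0] * fs
print(detect_intent(alpha, fs))  # "click"
print(detect_intent(flat, fs))   # "idle"
```

Real systems classify much richer feature sets with trained models, but the pipeline shape (signal → feature → decision → control event) is the same.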
Discussion:
What are some major ethical concerns related to BCI?
How do you test without putting people’s brains at risk?
Animal studies.
Weaponization of BCI.
DARPA: funds technological innovation for the US military, including BCI research.
If used as a performance enhancer (a cognitive "steroid"), who would benefit?
Transhumanist ventures: if it gets to the point where, in order to be competitive, humans need these devices to keep up, will every person have access to these things, or just the wealthy?
Could “dehumanize” us.
Inability to learn from mistakes? If people engineer away too many of their inefficiencies, they could lose part of their humanity. (Counter: are our inefficiencies really our humanity?)
Where does individuality go? Shortcomings as well as strengths make us individual. If everyone can do everything…
Targeted advertising.
Social credit.
Decisions and behavior compiled as data could push toward more extreme social credit systems: GPS tracking, privacy violations, manipulation.
Vetting information accessible through it.
Security risks.
Are we in control of the ethical ramifications of BCI technologies?
Could we opt-out or unplug? Is this a “forever” change, equivalent to an evolution?
Not in control of ramifications.
Important to consider extreme circumstances early to prevent future messes.
Do humans need to become “cyborgs” in order to be relevant in the age of AI?
Could we ever compete with something that has infinite capacity to learn, and to do so more quickly than us?
Does making BCI optional alleviate moral and ethical concerns related to BCI?