
Your brain needs a right to privacy

Over the past few weeks, Facebook and Elon Musk’s Neuralink have announced that they’re building tech to read your mind — literally.


Mark Zuckerberg’s company is funding researchers who say they’ve built an algorithm that can decode words from brain activity in real time. Musk’s company has created flexible “threads” that can be implanted into a brain and could one day allow you to control your smartphone or computer with just your thoughts. Musk wants to start testing in humans by the end of next year.


Your brain, the final privacy frontier, may not be private much longer.


Companies say they’re building this brain tech for ethical purposes, like helping people with paralysis control their devices. But some neuroethicists argue that the potential for misuse is so great that we need revamped human rights laws — a new “jurisprudence of the mind” — to protect us.


I spoke this week to one of the people making this argument, Zurich-based neuroethicist Marcello Ienca. In 2017, he released a paper outlining four specific rights for the neurotechnology age he believes we should enshrine in law. When I asked him to explain each right and give a concrete example of how neurotechnology might violate it, he came up with some frightening scenarios, some of them already underway. Here they are:


1. The right to cognitive liberty: You have the right to freely decide whether you want to use a given neurotechnology, or to refuse it.


In China, the government is already mining data from some employees’ brains by having them wear caps that scan their brainwaves for depression, anxiety, or fatigue. “If your employer wants you to wear an EEG headset to monitor your attention levels, that might qualify as a violation of the cognitive liberty principle,” Ienca said, because even if you’re told that wearing the device is optional, you’ll probably feel implicit pressure to do so since you don’t want to be at a competitive disadvantage.


2. The right to mental privacy: You have the right to seclude your brain data if you want to, or to publicly share it. 


Ienca urged me to think about neurotechnology’s implications for government surveillance. “If brain-reading devices have the ability to read the content of thoughts,” he said, “in the years to come governments will be interested in using this tech for interrogations and investigations.” The right to remain silent and the principle against self-incrimination could suddenly become meaningless.


3. The right to mental integrity: You have the right not to be harmed physically or psychologically by neurotechnology.


Brain-computer interfaces like the devices being built by Facebook and Neuralink may be vulnerable to hacking. What happens if you’re using one of them and a malicious actor intercepts the Bluetooth signal, increasing or decreasing the voltage of the current that goes to your brain — thus making you more depressed, say, or more compliant? 


Neuroethicists refer to that as brainjacking. “This is still hypothetical, but the possibility has been demonstrated in proof-of-concept studies,” Ienca said, adding, “A hack like this wouldn’t require that much technological sophistication.”
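
To make the threat concrete, here is a minimal sketch in Python. Everything in it is hypothetical: the packet format, the pairing key, and the handler names are invented for illustration, and no real implant speaks this protocol. It contrasts a stimulation command channel that blindly trusts incoming radio packets with one that authenticates them.

```python
import hmac
import hashlib
import struct

# Hypothetical wire format: a little-endian unsigned short giving the
# stimulation amplitude in microamps, followed by a 32-byte HMAC-SHA256 tag.
PAIRING_KEY = b"hypothetical-device-pairing-secret"

def make_command(amplitude_ua: int, key: bytes = PAIRING_KEY) -> bytes:
    """Build an authenticated stimulation command packet."""
    payload = struct.pack("<H", amplitude_ua)
    tag = hmac.new(key, payload, hashlib.sha256).digest()
    return payload + tag

def handle_unauthenticated(packet: bytes) -> int:
    """Naive handler: applies whatever amplitude arrives over the air."""
    (amplitude_ua,) = struct.unpack("<H", packet[:2])
    return amplitude_ua  # an attacker who can inject packets sets this freely

def handle_authenticated(packet: bytes, key: bytes = PAIRING_KEY) -> int:
    """Hardened handler: rejects any packet whose tag does not verify."""
    payload, tag = packet[:2], packet[2:]
    expected = hmac.new(key, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, tag):
        raise ValueError("rejected: packet not authenticated")
    (amplitude_ua,) = struct.unpack("<H", payload)
    return amplitude_ua

# A forged packet with an inflated amplitude sails through the naive
# handler but is stopped by the authenticated one.
forged = struct.pack("<H", 5000) + b"\x00" * 32
print(handle_unauthenticated(forged))  # 5000, accepted blindly
try:
    handle_authenticated(forged)
except ValueError as err:
    print(err)                         # rejected: packet not authenticated
```

That is the sense in which the attack needs little sophistication: the naive version fails not because the forgery is clever, but because the device never checks who is talking to it.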


4. The right to psychological continuity: You have the right to be protected from alterations to your sense of self that you did not authorize.


In one study, an epileptic woman who’d been given a brain-computer interface came to feel such a radical symbiosis with it that, she said, “It became me.” Then the company that implanted the device in her brain went bankrupt and she was forced to have it removed. She cried, saying, “I lost myself.”


Ienca said that’s an example of how psychological continuity can be disrupted not only by the imposition of a neurotechnology, but also by its removal. “This is a scenario in which a company is basically owning our sense of self,” he said.


I asked Ienca whether neurotechnologies should be taken out of the control of private companies and reclassified as public goods. He said yes — both to prevent companies from inflicting harm, and to prevent them from affording benefits only to rich people who can pay for their products. “One risk is that these technologies could become accessible only to certain economic strata and that’ll exacerbate preexisting social inequalities.”


Several countries are already pondering how to enshrine “neurorights.” In Chile, two bills that would make brain data protection a human right will come before parliament for a vote in November, thanks in part to the advocacy of neuroscientist Rafael Yuste. In Europe, Ienca told me, the OECD will likely release a new set of principles for regulating the use of brain data by next year.


I can’t say for sure whether the OECD’s neurorights, or Chile’s, or Ienca’s, will effectively keep neurotechnology’s risks in check. But given how fast this tech is developing, I do think we need new rights to protect us, and I’m glad experts are moving to enshrine them before it’s too late. 


Press Release comments:

What a twist of the times. Technology is advancing rapidly, for better and for worse. Who would ever have thought there'd come a day when we'd be talking about the privacy of thoughts?

Thanks for posting this PR!
Terrance Collins

Great Press Release. Keep up the good work. I am looking forward to the next one.
Frank Andrews

Great Press Release. Thank you for sharing. Keep up the good work!
Bob & Shirley Rushing