A Vox report that swiftly sparked alarm across the internet Friday outlined how, “in the era of neurocapitalism, your brain needs new rights,” following recent revelations that Facebook and Elon Musk’s Neuralink are developing technologies to read people’s minds.
Considering those and other companies’ advances and ambitions, Vox’s Sigal Samuel warned that “your brain, the final privacy frontier, may not be private much longer” and laid out how existing laws are not equipped to handle how these emerging technologies could “interfere with rights that are so basic that we may not even think of them as rights, like our ability to determine where our selves end and machines begin.”
Samuel interviewed neuroethicist Marcello Ienca, a researcher at ETH Zurich who published a paper in 2017 detailing four human rights for the neurotechnology age that he believes need to be protected by law. Ienca told Samuel, “I’m very concerned about the commercialization of brain data in the consumer market.”
“And I’m not talking about a farfetched future. We already have consumer neurotech, with people trading their brain data for services from private companies,” he said, pointing to video games that use brain activity and wearable devices that monitor human activities such as sleep. “I’m tempted to call it neurocapitalism.”
The Vox report broke down the four rights that, according to Ienca, policymakers need to urgently safeguard with new legislation:
- The right to cognitive liberty: You should have the right to freely decide you want to use a given neurotechnology or to refuse it.
- The right to mental privacy: You should have the right to seclude your brain data or to publicly share it.
- The right to mental integrity: You should have the right not to be harmed physically or psychologically by neurotechnology.
- The right to psychological continuity: You should have the right to be protected from alterations to your sense of self that you did not authorize.
“Brain data is the ultimate refuge of privacy. When that goes, everything goes,” Ienca said. “And once brain data is collected on a large scale, it’s going to be very hard to reverse the process.”