Envision an electric wheelchair or a drone that users could maneuver with their minds rather than hand-operated controls.
Or a game in which players wearing virtual reality gear try to escape from a make-believe prison using tools they select with their brains’ electrical impulses.
Or a communications system that allows firefighters or soldiers in noisy, challenging environments to communicate through unspoken thoughts – a system that would make mental telepathy real.
These are just a few of the use cases under development for what are called brain-computer interfaces (“BCIs”).
In fact, some BCIs are already in use, including devices that stimulate the brains of people with Parkinson’s disease to help relieve the rigidity, slowness, and tremors common in Parkinson’s patients. A company called SmartCap offers a headband that can alert long-haul truckers (and their employers) when they are drowsy. And when he’s not building electric cars, rockets, or solar panels, or boring tunnels, Elon Musk, through his Neuralink company, is developing a chip that would be implanted in people’s brains to simultaneously record and stimulate brain activity.
And although these systems offer the potential to improve and even save lives, they carry risks, including the potential for extremely personal information to get into the wrong hands and even for hackers to take control of brain-connected devices. (And you thought it was bad when they got your Equifax data!)
To address some of these risks, the Future of Privacy Forum and the IBM Policy Lab recently released a report called “Understanding the Data Flows and Privacy Risks of Brain-Computer Interfaces.”
The report outlines potential use cases and risks associated with BCIs and provides practical guidance for policymakers and developers.
“Emerging innovations like neurotechnology hold great promise to transform healthcare, education, transportation, and more, but they need the right guardrails in place to protect individuals’ privacy,” said IBM Chief Privacy Officer Christina Montgomery in a statement issued along with the report.
Near-Term Use Cases
The Report identifies the following areas as near-term applications for BCIs:
- Healthcare: BCIs can assist in diagnosing medical conditions, stimulating or modulating brain activity, and controlling prosthetic limbs and external devices.
- Gaming: BCIs may augment existing gaming platforms and offer players new ways to play using devices that record and interpret neural signals.
- Employment and Industry: BCIs could monitor worker engagement to improve safety during high-risk activities, alert workers or supervisors to danger, modulate workers’ brain activity to improve performance, and provide tools to more efficiently complete tasks.
- Education: BCIs could track student attention, identify students’ unique needs, and alert teachers and parents to student progress.
- Smart Cities: BCIs could provide new ways for construction teams and safety workers to communicate and enable potential new methods for connected vehicle control.
- Neuromarketing: Marketers could incorporate the use of BCIs to understand consumers’ moods and to gauge product and service interest. (Author’s Note: Please, God, no!)
- Military: Governments are researching the potential of BCIs to help rehabilitate injured soldiers and enhance communication.
The Report focuses on controlling and protecting “neurodata,” data that comes from the brain or other parts of the nervous system and is generated by BCIs.
The Report notes that BCIs “also raise important technical considerations and ethical implications, related to, for example fairness, justice, human rights, and personal dignity.”
Although frameworks are already in place in some areas – including Europe’s General Data Protection Regulation (GDPR), the California Consumer Privacy Act (CCPA), China’s newly enacted Personal Information Protection Law (PIPL), and biometric laws in some U.S. states – these frameworks may not fully capture some of the considerations that are specific to neurodata.
Policy Safeguards
Accordingly, the Report recommends the following policy safeguards:
- Ensure BCI-derived inferences are not used to influence decisions about individuals that have legal effects, livelihood effects, or similar significant impact – such as assessing the truthfulness of statements in legal proceedings; inferring thoughts, emotions, psychological state, or personality attributes as part of hiring or school admissions decisions; or assessing individuals’ eligibility for legal benefits;
- Employ sufficient transparency, notice, terms of use, and consent frameworks to empower users with a baseline understanding around the collection, use, sharing, and retention of neurodata;
- Engage Institutional Review Boards and other independent review mechanisms to identify and mitigate risks;
- Facilitate community input prior to and during BCI system design, development and rollout;
- Create dynamic technical, policy, and employee training standards to account for gaps in current regulation;
- Promote an open and inclusive research ecosystem by encouraging the adoption of open standards for neurodata and the sharing of research data under open licenses, with appropriate safeguards in place; and
- Evaluate the adequacy of existing policy frameworks for governing the unique risks of neurotechnologies and identify potential gaps prior to new regulation.
Technical Safeguards
The Report also makes a number of technical recommendations, including the following:
- Provide on/off controls when possible, including hardware switches if practical;
- Provide users with granular controls on devices and in companion apps for managing the collection, use, and sharing of personal neurodata;
- Provide heightened transparency and control for BCIs that specifically send signals to the brain, rather than merely receive neurodata;
- Design, document, and disclose clear and accurate descriptions of the accuracy of BCI-derived inferences;
- Operationalize industry or research-based best practices for security and privacy when storing, sharing, and processing neurodata;
- Employ appropriate privacy enhancing technologies;
- Encrypt personal neurodata in transit and at rest (see the sketch after this list); and
- Embrace appropriate protective and defensive security measures to combat bad actors.
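The Report stops at the level of recommendations and does not include reference code. Purely as illustration of the encrypt-at-rest recommendation, here is a minimal sketch in Python using the widely available cryptography package’s Fernet recipe (authenticated symmetric encryption); the record structure, field names, and file name are hypothetical.

```python
# A minimal sketch of the "encrypt at rest" recommendation, using the
# Python "cryptography" package's Fernet recipe (AES-128-CBC plus
# HMAC-SHA256, i.e., authenticated symmetric encryption). The record
# structure and file name are hypothetical, for illustration only.
import json
from cryptography.fernet import Fernet

# In practice the key would live in a key-management service or a
# secure enclave, never alongside the data it protects.
key = Fernet.generate_key()
fernet = Fernet(key)

# A hypothetical neurodata sample: raw EEG readings plus metadata.
sample = {
    "device_id": "headset-0042",
    "timestamp": "2021-11-15T10:32:00Z",
    "eeg_microvolts": [12.4, -3.1, 8.7, 0.2],
}

# Encrypt before writing to disk; Fernet tokens carry an HMAC, so any
# tampering with the stored file is detected at decryption time.
token = fernet.encrypt(json.dumps(sample).encode("utf-8"))
with open("neurodata.enc", "wb") as f:
    f.write(token)

# Decrypt only when an authorized process needs the data back.
with open("neurodata.enc", "rb") as f:
    restored = json.loads(fernet.decrypt(f.read()).decode("utf-8"))
assert restored == sample
```

For neurodata in transit, the analogous baseline is simply to require TLS on every connection a device or companion app makes, alongside the key-management and access controls the Report’s other recommendations contemplate.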
“We have a prime opportunity now to implement strong privacy and human rights protections as brain-computer interfaces become more widely used,” said Jeremy Greenberg, Policy Counsel at the Future of Privacy Forum. “Among other uses, these technologies have tremendous potential to treat people with diseases and conditions like epilepsy or paralysis and make it easier for people with disabilities to communicate, but these benefits can only be fully realized if meaningful privacy and ethical safeguards are in place.”
For a deeper dive: “Understanding the Data Flows and Privacy Risks of Brain-Computer Interfaces.”