MELANIE SWAN
Systems-level thinker; futurist; principal, MS Futures Group; founder, DIYgenomics
A worry not yet on the scientific or cultural agenda is neural-data privacy rights. Not even biometric-data privacy rights are under discussion yet, which is surprising, given the personal data streams amassing from quantified self-tracking activities. There are several reasons why neural-data privacy rights could become an important concern. First, personalized neural data streams are already available from sleep-monitoring devices, and these could eventually expand to include data from eye-tracking glasses, continuously worn consumer EEGs, and portable MRIs. Second, at some point the validity and utility of neural data may be established through correlation with a variety of human health and physical and mental performance states. Third, despite the sensitivity of these data, securing them may be practically impossible; malicious hacking of personal biometric data could occur and would need an in-kind response. Fourth, there could be many required and optional uses of personal biometric and neurometric data, each calling for a different permissioning model.
Personal biometric data streams are growing as people engage in quantified self-tracking with smartphone applications, biomonitoring gadgets, and other Internet-connected tools. The adoption of wearable electronics (smartwatches, disposable patches, augmented eyewear) could hasten this trend and might even outstrip that of tablets (now the most quickly adopted electronics platform). Wearables could allow the unobtrusive collection of vast amounts of previously unavailable objective metric data, including not only biometrics, such as cortisol (stress) levels, galvanic skin response, heart-rate variability, and neurotransmitter levels (dopamine, serotonin, oxytocin), but also robust neurometrics, such as brain signals and eye-tracking data formerly obtainable only in the lab. These data might then be mapped to predict an individual’s mental state and behavior. Objective metrics could prompt growth in many scientific fields, bringing a new understanding of cognition and emotion and the possibility of addressing problems such as the nature of consciousness.
The potential application of objective metrics and quantitative definitions to mental processes also raises the issue of neural-data privacy rights, especially if technological advancement makes it easier to detect others’ states (imagine a ceiling-mounted reader detecting the states of a whole roomful of people). Biometric data is already sensitive as a health-data privacy concern, and neural data even more so. There is something about the brain that feels deeply personal, and the tendency is toward strong privacy in this area: many people are willing to share their personal genomic data, for example, but not their Alzheimer’s disease risk profile. Neural-data privacy rights could be a worry, but overall they are an invitation to progress. Tools that could help are already in development: diverse research ecosystems, tiered user-participation models, and responses to malicious hacking.
Given the high potential value of neural data to science, it is likely that privacy models will be negotiated so that projects can move forward. There could be pressure to achieve scale quickly, both in the amount and types of data collected and in establishing the data’s validity and utility (still at issue in some areas of personalized genomics); raw data streams need to be linked to neurophysiological states. An ecosystem of open and closed research models is already evolving to accommodate different configurations of those conducting and participating in research. One means of realizing scale is crowdsourcing, for both data acquisition and analysis. This could be particularly true here, as low-cost tools for neural-data monitoring become available to consumers and interested individuals contribute their information to an open data commons. Different levels of privacy preference can be accommodated, with the small percentage of people comfortable sharing their data opting in to create a valuable public good usable by all. Even more than has happened in genomics (though not yet with longitudinal phenotypic data), open-access data could become the norm in neural studies.
Perhaps not initially, but in a later, mainstream future for neural data, we might have a granular, tiered permissioning system for enhanced privacy and security. A familiar example is the access tiers (family, friends) in social networks such as Facebook and Google Plus. With neural data we could have similar, and greater, specificity: allowing professional colleagues into certain neural data streams at certain times of day, for instance. However, there may be limitations related to our current lack of understanding of neural data streams generally, and of how their signals may be transmitted, processed, and controlled.
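To make the idea concrete, the following is a minimal sketch, in Python, of what such tiered, time-bounded permissioning might look like for a single neural data stream. The tier names, the stream name ("focus_level"), and the access rules are hypothetical illustrations of the concept, not a description of any existing system.

    from dataclasses import dataclass, field
    from datetime import time
    from enum import Enum

    class Tier(Enum):
        # Illustrative access tiers, loosely modeled on social-network circles.
        FAMILY = "family"
        FRIENDS = "friends"
        COLLEAGUES = "colleagues"
        RESEARCHERS = "researchers"

    @dataclass
    class AccessWindow:
        # A tier may read the stream only within this daily time window.
        start: time
        end: time

        def contains(self, t: time) -> bool:
            return self.start <= t <= self.end

    @dataclass
    class StreamPolicy:
        # Per-stream permissioning: which tiers may read the stream, and when.
        stream_name: str
        windows: dict = field(default_factory=dict)  # Tier -> AccessWindow

        def grant(self, tier: Tier, window: AccessWindow) -> None:
            self.windows[tier] = window

        def may_read(self, tier: Tier, at: time) -> bool:
            window = self.windows.get(tier)
            return window is not None and window.contains(at)

    # Hypothetical example: colleagues may see a "focus_level" stream during
    # working hours only; family may see it at any time of day.
    policy = StreamPolicy("focus_level")
    policy.grant(Tier.COLLEAGUES, AccessWindow(time(9, 0), time(17, 0)))
    policy.grant(Tier.FAMILY, AccessWindow(time(0, 0), time(23, 59)))

    print(policy.may_read(Tier.COLLEAGUES, time(10, 30)))   # True
    print(policy.may_read(Tier.COLLEAGUES, time(22, 0)))    # False
    print(policy.may_read(Tier.RESEARCHERS, time(10, 30)))  # False: no grant

The point is simply that access could be scoped per stream, per tier, and per time window; any real system would also have to address consent revocation, aggregation across streams, and the security problems discussed next.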
The malicious hacking of neural data streams is a potential problem. Issues could arise both in hacking external data streams and devices (like any other data-security breach) and in hacking communications going back into the person. The latter is too far ahead for useful speculation, but the precedent could be that of spam, malware, and computer viruses. These are “Red Queen” problems, in which perpetrators and responders compete in lockstep, effectively running to stay in place, each innovating incrementally to temporarily outcompete the other. Malicious neural-data-stream hacking will likely not occur in a vacuum; we can expect unfortunate side effects, and we will need responses analogous to antivirus software.
Rather than being an inhibitory worry, the area of neural-data privacy rights invites us to advance to a new node in societal progress. The potential long-term payoff of a continuous bioneurometric information climate is significant. Collecting objective-metrics data and broadcasting it with permission could greatly improve both knowledge of the self and our ability to understand and collaborate with others. Just as personalized genomics has helped destigmatize health issues, neural data could help destigmatize mental-health and behavioral issues, especially by letting us infer the issues of the many from the data of the few. Improved interactions with ourselves and others could free us to focus on higher levels of capacity, devoting less time, emotion, and cognitive effort to evolutionary-relic communication problems as we transition to a truly advanced society.