The field of physiological computing is still quite new, but research has suggested that different physiological computers require varying degrees of intentionality from the human user, and that the devices can be placed on a spectrum.3
[Figure: the physiological computing spectrum, via physiologicalcomputing.net]
On one end of the spectrum are technologies where users deliberately interact with input devices via voluntary muscle movement, such as using electrooculography (EOG) to direct the movement of a cursor (shown at position 2 on the spectrum).4 In contrast, brain-computer interfaces (BCIs), such as the exoskeleton showcased at the ceremonial first kick of the 2014 World Cup, bypass this step, since BCIs are often developed for those with diminished movement capacities and disabilities. In both cases, however, the general principle is the same: the interface is ultimately translating a neural signal that the user has specifically and deliberately directed to complete a task.5
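As a rough illustration of the deliberate end of the spectrum, the sketch below shows how a pre-processed EOG signal might be mapped to cursor movement. The threshold, step size, and signal values are hypothetical placeholders, not parameters from any actual EOG system.

```python
# Minimal sketch: mapping a (hypothetical) pre-processed EOG signal to cursor movement.
# Assumes the horizontal/vertical EOG channels have already been amplified, filtered,
# and converted to microvolt values; thresholds and step sizes are illustrative only.

def eog_to_cursor_step(eog_horizontal_uv, eog_vertical_uv,
                       threshold_uv=50.0, step_px=10):
    """Translate a pair of EOG samples into a (dx, dy) cursor step in pixels."""
    dx = dy = 0
    if eog_horizontal_uv > threshold_uv:      # gaze shifted right
        dx = step_px
    elif eog_horizontal_uv < -threshold_uv:   # gaze shifted left
        dx = -step_px
    if eog_vertical_uv > threshold_uv:        # gaze shifted up
        dy = -step_px                         # screen y grows downward
    elif eog_vertical_uv < -threshold_uv:     # gaze shifted down
        dy = step_px
    return dx, dy

# Example: a strong rightward eye movement nudges the cursor 10 px to the right.
print(eog_to_cursor_step(80.0, 5.0))   # -> (10, 0)
```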
Non-deliberate physiological computing, on the other hand, bypasses any voluntary input and instead takes a “biocybernetic” approach, in which spontaneous physiological changes, such as heart rate or the brain’s electrical activity, are recorded via an electrocardiogram (EKG) or an electroencephalogram (EEG), respectively. These signals are then correlated to meaningful information, as in the case mentioned above where specific EEG signals act as identifying information to allow access to a computer. These types of technologies are able to associate recorded physiological changes with the motivational, cognitive, or emotional state of the user. Once the interface determines the user’s emotional state, it can often adapt in an attempt to promote a specific type of positive mentality or negate a potentially hazardous emotional state. For example, if a computer calculates that the user is stressed, it can play soothing music or offer help to defuse the negative situation. The long-term recording of physiological data, usually for learning purposes, is referred to as ambulatory monitoring.6
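To make the biocybernetic loop concrete, here is a minimal sketch of the sense-infer-adapt cycle, assuming a simulated heart-rate reading and an entirely illustrative stress index and threshold; the function names below are hypothetical and not part of any real device’s API.

```python
import random
import time

def read_heart_rate_bpm():
    """Stand-in for an EKG/heart-rate sensor; returns a simulated reading."""
    return random.gauss(75, 15)

def infer_stress(heart_rate_bpm, resting_bpm=65.0):
    """Crude, illustrative stress index: how far the reading sits above rest."""
    return max(0.0, (heart_rate_bpm - resting_bpm) / resting_bpm)

def adapt_interface(stress_index, stress_threshold=0.3):
    """If the inferred state crosses a threshold, adapt (e.g., soothe the user)."""
    if stress_index > stress_threshold:
        print(f"Stress index {stress_index:.2f}: playing soothing music...")
    else:
        print(f"Stress index {stress_index:.2f}: no adaptation needed.")

# A few passes through the sense -> infer -> adapt loop.
for _ in range(3):
    adapt_interface(infer_stress(read_heart_rate_bpm()))
    time.sleep(0.1)
```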
Technologies that incorporate aspects of physiological computing, such as Microsoft’s recently released Kinect 2, have become prevalent in consumer products. Using technology similar to the Eulerian Video Magnification developed at MIT,7 the camera on the Kinect detects small changes in skin color and monitors heart rate optically (although pulse rate can be an indicator of emotional state, at this time the Kinect 2 focuses on monitoring heart rate during physical activity and does not correlate this data to an emotional state).
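Optical pulse monitoring of this kind largely comes down to tracking tiny periodic fluctuations in the average skin-pixel color over time and picking out the dominant frequency within a plausible heart-rate band. The sketch below illustrates that general idea with NumPy and a synthetic signal; it is not the Kinect’s or MIT’s actual implementation, and the function and variable names are invented for the example.

```python
import numpy as np

def estimate_pulse_bpm(mean_green_per_frame, fps):
    """Estimate pulse from the per-frame mean green-channel value of a skin region.

    mean_green_per_frame: 1-D array, one averaged pixel value per video frame.
    fps: camera frame rate in frames per second.
    """
    signal = np.asarray(mean_green_per_frame, dtype=float)
    signal = signal - signal.mean()                 # remove the DC (baseline) component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs_hz = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    # Only consider frequencies in a plausible human heart-rate band (40-180 bpm).
    band = (freqs_hz >= 40 / 60) & (freqs_hz <= 180 / 60)
    peak_hz = freqs_hz[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0                           # convert Hz to beats per minute

# Synthetic check: a 1.2 Hz (72 bpm) oscillation buried in noise.
fps, seconds = 30, 10
t = np.arange(fps * seconds) / fps
fake_signal = 0.5 * np.sin(2 * np.pi * 1.2 * t) + np.random.normal(0, 0.2, t.size)
print(round(estimate_pulse_bpm(fake_signal, fps)))  # ~72
```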
Portable, wireless sensors that are able not only to record but also to convert raw EEG signals into some form of meaningful information are currently available. Emotiv and NeuroSky have developed and currently sell wireless headsets that act as EEG sensors, the EPOC and the MindWave, respectively. Since certain EEG signals can be used as indicators of a specific emotional state, such as frustration,8 the interface can label or adapt to a user in real time. That said, while these EEG sensors give the impression that the user can execute commands with seemingly only the power of thought, these technologies are not yet able to comprehend intentions or mimic emotions (but see recent reports of an AI passing the Turing test). For an interface to recognize intentions, a system similar to a dictionary must first be created, in which the computer records EEG data for a series of tasks that the interface will be able to recognize later. Moreover, “intent” is still not clearly understood mechanistically through neuroscience.
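The “dictionary” described above is essentially a supervised training step: labeled EEG recordings for each task are collected first, and a classifier later maps new signals back to those labels. The sketch below illustrates the idea with scikit-learn and simulated feature vectors standing in for real EEG features; the task names, feature layout, and numbers are invented for the example.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical training phase: the user repeats each mental task while the headset
# records EEG; here each trial is reduced to a 4-value band-power feature vector.
tasks = ["move_cursor_left", "move_cursor_right"]
X_train = np.vstack([
    rng.normal(loc=0.0, scale=1.0, size=(50, 4)),   # trials of task 0
    rng.normal(loc=1.5, scale=1.0, size=(50, 4)),   # trials of task 1
])
y_train = np.array([0] * 50 + [1] * 50)

# Build the "dictionary": a classifier that maps EEG features to task labels.
clf = LogisticRegression().fit(X_train, y_train)

# Later, a new trial is classified in real time and mapped back to a command.
new_trial = rng.normal(loc=1.5, scale=1.0, size=(1, 4))
print(tasks[clf.predict(new_trial)[0]])   # likely "move_cursor_right"
```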
Pertinent ethical issues include those related to ownership and privacy. Raw EEG or EKG data is powerful information, especially when linked to changes in an emotional state. Emotiv will provide the raw EEG data from its users for an additional fee, but NeuroSky does not provide this information. Do we have any claim over our own (neuro-)physiological data once it leaves us? Even if raw EEG signals are worthless without an algorithm to decipher their meaning, the data still originated from only one, original source. Until it was pulled over ownership issues (NASA wanted to ensure that the data was no longer federal property), the EKG recording of Neil Armstrong’s heartbeat as he took the first steps on the moon was slated to be auctioned off last year.9 But did NASA ever have a right to lay claim to this information, even if without an algorithm the EKG is seemingly meaningless? Or does Neil Armstrong (or in this case, his family) have any right to claim ownership, given that NASA paid for and played a role in developing the technology that enabled this recording? These are the types of questions that will need to be addressed as more and more people offer up their physiological data through these types of technologies and popular commercial devices.
It seems inevitable that one day enough people will use these EEG sensors that a massive database of neurological signals will begin to develop. Having a large dataset of neurological data that can potentially be correlated to disease states is already the goal of well-established companies such as Lumosity10 and BrainResource.11 Additionally, the United States government recently launched PCORnet, the National Patient-Centered Clinical Research Network, with the intention of building a national health-data system by combining data from 29 different health data networks.12 The United Kingdom has run into ethical conflicts with the introduction of a similar system, care.data,13 and the United States already has a history of alleged National Security Agency privacy violations, but government-backed organizations are moving forward with the massive collection of medical records and, perhaps one day, extensive physiological data. A precedent for holding a dataset of extensive, personal information is the company 23andMe, which provided health information based on DNA analysis. Nothing protects the users of 23andMe’s service from having their personal information sold,14 but the Genetic Information Nondiscrimination Act (GINA), passed in 2008, protects people from having their genetic information interfere with insurance policies and employment. No such law exists for neurological data. Regulations and discussions should be taking place now, before companies like Emotiv or NeuroSky have five years’ worth of data from customers whose privacy is not protected in the slightest.
Specific EEG signals can already be used to characterize neurological disorders. With the collection of more data, we have the potential to recognize and use specific signals as “brain signatures” for other neurological disorders or even tendencies toward certain behaviors (the well-established company Brainwave Science is a proponent of using EEG technology to test guilt or innocence). This ability, while incredibly powerful, carries a high risk of abuse in the form of covert monitoring of individuals.15 Of course, if a patient has epilepsy, a discreet EEG sensor with the power to predict seizure activity could greatly improve the health, safety, and quality of life of that patient.16 But would it be appropriate to monitor a person whose neurological diagnosis has rendered them emotionally unstable, if the EEG sensor could detect a very high or low emotional state? If that EEG sensor means they are deemed stable enough for certain activities they were once denied, such as driving, does that make the constant monitoring worth what many would consider a violation of privacy?
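For the seizure-detection scenario cited above,16 many approaches ultimately watch for abrupt changes in the EEG relative to a recent baseline; the toy example below flags windows whose signal energy greatly exceeds that baseline. It is an illustration of the general principle only, not the algorithm from reference 16, and all thresholds and numbers are invented.

```python
import numpy as np

def energy_alarm(eeg_window_uv, baseline_energy, ratio_threshold=5.0):
    """Flag a window whose mean squared amplitude far exceeds a running baseline."""
    window_energy = np.mean(np.square(eeg_window_uv))
    return window_energy > ratio_threshold * baseline_energy

# Toy example: quiet background EEG vs. a window with a large rhythmic discharge.
rng = np.random.default_rng(1)
background = rng.normal(0, 10, 256)                       # ~10 uV noise
baseline = np.mean(np.square(background))
t = np.arange(256) / 256
ictal_like = rng.normal(0, 10, 256) + 80 * np.sin(2 * np.pi * 4 * t)

print(energy_alarm(background, baseline))   # False
print(energy_alarm(ictal_like, baseline))   # True
```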
References
(1) New Research: Computers That Can Identify You by Your Thoughts http://www.ischool.berkeley.edu/newsandevents/news/20130403brainwaveauthentication (accessed Jun 26, 2014).
(2) Fairclough, S. H. Fundamentals of Physiological Computing. Interact. Comput. 2009, 21, 133–145.
(3) Physiological Computing F.A.Q. Physiological Computing Blog. http://www.physiologicalcomputing.net/?page_id=227 (accessed Jun 28, 2014).
(4) Allanson, J.; Fairclough, S. H. A Research Agenda for Physiological Computing. Interact. Comput. 2004, 16, 857–878.
(5) Allison, B. Z.; Wolpaw, E. W.; Wolpaw, J. R. Brain-Computer Interface Systems: Progress and Prospects. Expert Rev. Med. Devices 2007, 4, 463–474.
(6) Fairclough, S. H.; Gilleade, K. Meaningful Interaction with Physiological Computing. In Advances in Physiological Computing; Fairclough, S. H., Gilleade, K., Eds.; Springer: London, 2014; pp 1–16.
(7) Wu, H.-Y.; Rubinstein, M.; Shih, E.; Guttag, J.; Durand, F.; Freeman, W. T. Eulerian Video Magnification for Revealing Subtle Changes in the World. ACM Trans. Graph. (Proc. SIGGRAPH) 2012, 31.
(8) Kapoor, A.; Burleson, W.; Picard, R. W. Automatic Prediction of Frustration. Int. J. Hum.-Comput. Stud. 2007, 65, 724–736.
(9) Pearlman, R. Z. Neil Armstrong’s “Heartbeat,” Apollo Joystick Pulled from Auction http://www.space.com/21228-neil-armstrong-apollo-artifacts-auction.html (accessed Jun 26, 2014).
(10) Sternberg, D. A.; Ballard, K.; Hardy, J. L.; Katz, B.; Doraiswamy, P. M.; Scanlon, M. The Largest Human Cognitive Performance Dataset Reveals Insights into the Effects of Lifestyle Factors and Aging. Front. Hum. Neurosci. 2013, 7.
(11) McRae, K.; Rekshan, W.; Williams, L. M.; Cooper, N.; Gross, J. J. Effects of Antidepressant Medication on Emotion Regulation in Depressed Patients: An iSPOT-D Report. J. Affect. Disord. 2014, 159, 127–132.
(12) Collins, F. S.; Hudson, K. L.; Briggs, J. P.; Lauer, M. S. PCORnet: Turning a Dream into Reality. J. Am. Med. Inform. Assoc. 2014, amiajnl–2014–002864.
(13) Callaway, E. UK Push to Open up Patients’ Data. Nature 2013, 502, 283.
(14) Seife, C. 23andMe Is Terrifying, but Not for the Reasons the FDA Thinks. Scientific American, Nov. 27, 2013. http://www.scientificamerican.com/article/23andme-is-terrifying-but-not-for-reasons-fda/ (accessed Jun 26, 2014).
(15) Deceiving the Law. Nat. Neurosci. 2008, 11, 1231.
(16) Jouny, C. C.; Franaszczuk, P. J.; Bergey, G. K. Improving Early Seizure Detection. Epilepsy Behav. 2011, 22 (Suppl 1), S44–S48.
Want to cite this post?
Strong, K. (2014). “Pass-thoughts” and non-deliberate physiological computing: When passwords and keyboards become obsolete. The Neuroethics Blog. Retrieved on , from http://www.theneuroethicsblog.com/2014/06/pass-thoughts-and-non-deliberate.html