Human Computer Interfaces: Emerging Technologies
by: Kira Hammond (2005/07/25)
Humans communicate with computers in a variety of ways. Early options included the keyboard for entering data and the light pen for drawing and selection. Over the years a number of human computer interfaces have been developed, such as the mouse, the drawing tablet, and speech recognition programs. All of these interfaces require mediation between the human and the computer. Most often this mediator is a hardware device that translates some sort of movement into binary numbers that the computer can then process. The keyboard requires moving the hands and fingers to activate keys; the keys send binary information to the computer. The computer uses software to interpret the data and responds by sending a number, letter, or symbol back to the user.
Problems have arisen with controllers like the mouse and the keyboard. Ergonomically flawed, these devices often cause medical problems, such as carpal tunnel syndrome, after extended use. People with disabilities, the very young, and the very old have trouble controlling interfaces such as the keyboard and the mouse. New human computer interfaces allow the end user to control the computer in new and exciting ways.
Immersion is another area in which current computer interfaces are particularly weak. By moving away from bulky or obtrusive mechanical interfaces, the user can feel more connected to, or immersed in, the experience of using a computer. Research groups are looking to use immersive interfaces called virtual reality environments to simulate experiences such as flying a plane in order to train pilots. Such environments benefit from transparent technology and quick reaction from the computer; both can be achieved with new human computer interfaces.
With the advent of new technology, researchers and computer users are looking for alternate ways for humans to interface with machines. Creating a more organic union between human and machine is a step toward a more natural interface. Removing the mediator between computer and human potentially quickens response time and decreases the possibility of errors, important considerations for doctors and pilots who are interested in using human computer interfaces to control equipment.
Many human computer interfaces achieve their unique level of control through tracking, a way to capture human motion. Researchers can track body movement, eye movement, or electrical signals from the brain. Tracking always requires software to interpret the data collected by the tracking device, so each company or research group experimenting in this area develops its own software and tracking system to work in tandem. Two problems plague tracking technologies: latency, the time delay between when an event occurs and when it is observed, and the computer processing power required to handle the tracking data in real time.
Tracking can be done through a number of technologies, most of which were developed specifically for human computer interfaces. One of the first and still most widely used tracking systems is the mechanical tracker, a stand-alone hardware unit fixed over or beside the user. It often takes the form of an arm, fixed at one end and articulated into an elbow- and hand-like configuration at the other. The mechanical arm measures its joint angles using transducers; from this data the computer can determine how the user is moving. The information is transmitted with a high degree of accuracy and low latency, making the mechanical tracker an excellent interface for applications like virtual reality simulators.
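To illustrate how transducer-measured joint angles become position data, here is a minimal sketch in Python. It assumes a planar two-joint arm with invented link lengths; a real mechanical tracker has more joints and works in three dimensions.

```python
import math

def arm_endpoint(theta1, theta2, l1=0.4, l2=0.3):
    """Planar two-joint arm: given the shoulder and elbow angles
    (radians, as read from the transducers) and the link lengths
    (meters, hypothetical values), return the (x, y) position of
    the hand end of the arm via forward kinematics."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y
```

Sampling the joint angles many times per second and running this calculation gives the computer a continuous, low-latency stream of hand positions.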
Another tracking device uses optical technology: infrared video cameras record the movement of a person wearing a set of markers, small balls fixed to the joints. Infrared light illuminates the balls, and as the person moves the cameras feed position data into a computer system. Since the system relies on light, it only works in line-of-sight applications.
Ultrasonic trackers employ sound to locate the position of the user's head. The tracker is placed on top of the playback screen, records the user's head movements, and alters the perspective view in the display accordingly. This technology relies on line of sight as well, but it is simpler and more affordable than many other tracking technologies.
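The underlying distance measurement is simple time-of-flight arithmetic, sketched below. The speed of sound in room-temperature air is assumed; a real tracker would also compensate for temperature and combine several emitters to recover orientation.

```python
def distance_from_echo(round_trip_s, speed_of_sound=343.0):
    """An ultrasonic tracker emits a pulse and times the echo.
    The pulse travels to the user's head and back, so the distance
    is half the round-trip time multiplied by the speed of sound
    (about 343 m/s in room-temperature air)."""
    return speed_of_sound * round_trip_s / 2.0
```

For example, a 2-millisecond round trip corresponds to a head roughly a third of a meter from the tracker.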
Electromagnetic tracking technology can monitor the orientation of the user's head and hand. The system emits an electromagnetic field, and a sensor worn by the user responds to that field. When the sensor moves it detects changes in the magnetic field that encode its position and orientation; the decoded signals are relayed to the playback unit. The latency of electromagnetic systems is very low, and they can monitor movement over large areas.
Eye tracking technology is another way to measure the response patterns of the user and map those responses to computer commands. An eye tracker measures movements of the cornea and changes in the size and shape of the pupil; from these it can detect changes in gaze angle and focus of attention. By measuring the movements of the cornea and the pupil, eye-tracking systems can tell where a person is looking and whether the person is merely gazing or actively concentrating. Eye-tracking interfaces control a computer through this system of measurement.
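The distinction between casual gazing and deliberate attention is often operationalized as dwell time: if the gaze stays within a small region long enough, the system treats it as a selection. A minimal sketch follows; the pixel radius and dwell threshold are illustrative values, not taken from any particular product.

```python
def dwell_select(samples, radius=30.0, dwell_s=0.5):
    """samples: a time-ordered list of (t, x, y) gaze points.
    Return the (x, y) of the first fixation: a run of points that
    stays within `radius` pixels of its starting point for at least
    `dwell_s` seconds.  Returns None if the gaze never settles."""
    start = 0
    for i, (t, x, y) in enumerate(samples):
        t0, x0, y0 = samples[start]
        if (x - x0) ** 2 + (y - y0) ** 2 > radius ** 2:
            start = i            # gaze moved away: restart the dwell timer
        elif t - t0 >= dwell_s:
            return (x0, y0)      # held steady long enough: select here
    return None
```

A steady half-second of gaze on a screen button would register as a "click," while a sweeping glance across the same button would not.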
States of consciousness can be measured by looking for changes in brain states: the frequency and amplitude of brain waves change across states such as alertness, lethargy, and dreaming. Electroencephalography (EEG) is a way to measure brain wave activity. Other ways of monitoring the body's electrical activity exist, such as the electrocardiogram (EKG), which records the electrical activity of the heart, and the electromyogram (EMG), which measures muscle response to nervous stimulation, but brain computer interfaces rely on EEG as the main way to deduce brain states. EEGs also give valuable information about the functions of the brain, and are used in neurology and psychiatry to help diagnose diseases of the brain. By analyzing an individual's EEG readings, some BCIs can recognize and respond to states of consciousness such as concentration or relaxation. Combined with software, EEG readings are another way to control a computer.
Tracking technologies are an important area of research for both conventional and new human computer interfaces. One human computer interface that relies heavily on tracking systems is virtual reality.
Virtual reality, or VR, is one realm of new human computer interfaces. VR uses computer-simulated environments to supply convincing three-dimensional data to the part of the brain responsible for processing imagery; the user's responses can then be measured, and the simulation can respond accordingly. While there are still debates as to what constitutes 'virtual' reality, for the sake of brevity virtual reality will be considered immersive technology: "A technology where the hardware cuts off visual and audio sensations from the surrounding world and replaces them with computer-generated sensations" (Vince). Using this terminology, the first steps towards creating VR environments were taken in the 1950s.
VR was not invented by any one person or group of people; instead it evolved from an intersection of computer science, stereoscopy, and simulation, pursued in academic, military, and commercial research laboratories. Morton Heilig, a cinematographer, was one of the first people to recognize, think about, and publish documents discussing virtual reality. Heilig thought of VR as a natural extension of cinema, since it allowed the audience to be immersed in a fabricated world that could engage all of the senses. "Heilig started to think about what would have to be accomplished to create an artificial experience that could fool people into believing that they were actually occupying and experiencing a movie set. 'How do I know I am in a particular environment?' Heilig asked himself in 1954" (Rheingold).
About the same time that head-mounted displays appeared, a radically different approach to VR was emerging. Myron Krueger, another key player in the advent of virtual reality, began creating interactive environments in which the user moves without encumbering equipment. Krueger's work in the 1960s used cameras and monitors to project a user's body so it could interact with graphic images, allowing hands to manipulate graphic objects on a screen. The burden of input rests with the computer, and the body's free movements become data for the computer to read. Cameras follow the user's body, and computers synthesize the user's movements with the artificial environment.
Concurrent research in the 1950s and '60s focused on military applications. A prime example of immersive VR for military purposes comes from the U.S. Air Force, which first developed virtual reality hardware for flight simulation. The computer generates much of the same sensory input that a jet pilot would experience in an actual cockpit. Another model of VR is the system developed at NASA called VIEW, or Virtual Interface Environment Workstation. NASA uses the VIEW system for tele-robotic tasks, in which an operator on Earth is immersed in a virtual environment simulating a remote one; guided by feedback from the VR system, the operator manipulates objects while a robot in the distant location carries out the operator's commands.
Jaron Lanier developed other applications for virtual reality. He built upon the immersion model of virtual reality but added equal emphasis to another aspect: communication. Because computers can be networked, virtual reality seemed a natural candidate for a new communications medium. Lanier created RB2, or Reality Built for Two, a shared construct or virtual world in which participants work to co-create the environment. Lanier hopes that future generations will use VR like a telephone, to connect with people in distant parts of the world.
Virtual reality competes with the two-dimensional and three-dimensional graphic interfaces available today. Two-dimensional graphics are used in everything from medical illustrations to advertisements; because the files are very small, two-dimensional graphics are also used on the Internet. Researchers use three-dimensional models to explore a large range of objects, from human anatomy to atoms and subatomic particles. Flight simulators that use a computer-screen interface with vector graphics to represent planes and surfaces are in use today; however, these simulators are not as realistic as a VR environment would be. The military is moving towards virtual reality for this very reason.
Movies and other forms of popular entertainment media are competitors to the entertainment applications of virtual reality. Movies and visual forms of entertainment could also be seen as predecessors to virtual reality, and combining virtual reality with movies may create interesting new art forms. Video games, which are moving towards three-dimensional characters and spaces, represent a multi-million dollar commercial market. Currently the play area is still limited to the TV screen, but if virtual reality were introduced to gaming this might change. Virtual reality is not the only cutting-edge human computer interface being developed; others, such as brain computer interfaces, attempt to circumvent the need for tracking systems by tapping into higher brain functions to control a computer.
The group of technologies exploring alternate control interfaces that use the brain as the initial signal generator is called brain computer interfaces, or BCIs. A BCI is a system that acquires and analyzes neural (brain) signals with the goal of creating a high-bandwidth communications channel directly between the brain and the computer.
To better understand BCIs one must understand the technologies that come together to create the different BCI systems. All brain computer interfaces share a few basic components: data collection units, data playback units, and display units. Each system must have a way to gather and hold data in order to respond to human commands. The data from the collection units and the participant are integrated and then played back to the user. With the advancement of computer processing power, most current BCIs use a computer for both the collection and playback of information. The display unit can be auditory, tactile, or visual, but there must be a way to present the data to users so that they may respond and interact with the technology.
While existing technologies remain available, BCIs improve upon the computer interface to allow even the most severely handicapped to communicate with a computer. Medical conditions that affect patients' motor skills, such as tremors, cerebral palsy, and paralysis, make using traditional computer interfaces like the keyboard impossible. Even advanced eye tracking technology that replaces a mouse by correlating eye movement to an area on a computer screen is often too difficult to control when motor skills are lost. Such medical conditions require a new kind of human computer interface.
"The immediate goal is to provide these users, who may be completely paralyze, or "locked in" with basic communication capabilities so that they can express their wishes to caregivers or operate simple word processing programs." (Sherwood)
BCIs give disabled or handicapped citizens a chance to communicate and interact with their surroundings as well as with other people. The BCI being used for such medical conditions uses electrodes whose EEG readings are relayed back and forth between the computer and the human user. Handicapped persons are not the only ones who can benefit from BCIs that use electrical impulses to control a computer. The very young and very old also have difficulty with the fine motor skills required by some computer interfaces, like the keyboard or the electronic drawing stylus (very similar to the light pen, except the stylus draws on an electronic tablet instead of the computer screen). Computer users who have suffered from the design flaws of the keyboard and mouse will be potential customers for the companies working on such BCIs. In addition, many scientists are looking to brain computer interfaces to help diagnose and treat brain disorders such as sleep disorders and neurological diseases, to monitor attention, and to address mental-state deficiencies like depression. Science is also looking forward to a new tool for neuroscience research: a real-time method for correlating observable behavior with recorded neural signals.
BCIs that use EEG readings to control a computer are possible because of recent technological advances, including the development of both invasive electrode arrays and non-invasive, high-density EEG techniques. Sophisticated machine learning and signal processing algorithms take advantage of cheaper and faster computing power to enable online, real-time processing. In addition there have been advances in neuroscience, the underlying science of the brain: there is now a better understanding of the neural code, of functional neuroanatomy (the brain's physiology), and of how these relate to perception and cognition.
These technological advances can be attributed to research and development by medical, military, and commercial groups. While each group has its own private interests most applications do revolve around making a viable solution to reading and analyzing data from the human brain.
There is no one innovator of the BCI application for medical use, because the technology is still in the development stages. Nor is there one standard set-up for BCI EEG interfaces; many research groups have developed their own unique systems and patented new technology. Two researchers who hold patents on computer interface technology are Dr. Geoffrey Wright and Dr. Philip Kennedy. Dr. Wright is an entrepreneur who holds several patents for sophisticated brain-wave man-machine interfaces used to control and communicate with computers. Dr. Kennedy developed an electrode that, once embedded in the skull, can pick up nerve impulses generated by brain activity. These two researchers represent non-invasive technologies (Dr. Wright's patents) and invasive technologies (Dr. Kennedy's patents) that allow humans to interface with computers using electrical signals to convey information.
Both researchers currently have projects underway to use this technology to meet needs in the medical community. Dr. Wright founded a company called Neurosonics, which is developing ways for electrodes placed on the head and connected to computers through wires to form a connection between user and computer. Dr. Kennedy's group, Neural Signals, is working on making implant technology more stable and affordable. To this end Neural Signals is collaborating with the Georgia Institute of Technology, Georgia State University, and Emory University to further the neuroprosthesis as a tool for people using the implant technology patented by Dr. Kennedy.
Other research groups, such as one at Georgia State University, are developing software to convert the electrical signals from Dr. Kennedy's implant technology into a means of communication. In 1998 Melody Moore, the head of the BCI research center at Georgia State, developed a computer program called TalkAssist, which interprets and translates the raw data it receives from the EEG transmitter. The system allows a patient to generate a brain signal that moves a cursor to select letters, words, or icons from a computer menu. Moore is focusing on patients who have almost no ability to produce any type of muscle movement: the electrical impulses the brain generates to move an arm or a leg can serve as a substitute signal to produce the letter, word, or phrase the patient wishes to convey.
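The substitution Moore describes can be caricatured in a few lines: treat an imagined-movement signal crossing a threshold as a cursor step. This is a deliberately simplified, hypothetical stand-in; the threshold, the one-dimensional cursor, and the function name are all invented, and a real system like TalkAssist uses a trained classifier, not a fixed cutoff.

```python
def cursor_position(amplitudes, threshold=5.0, start=0):
    """Map a stream of brain-signal amplitudes to one-dimensional
    cursor movement: each sample above the (illustrative) threshold
    steps the cursor right, each sample below steps it left."""
    x = start
    for a in amplitudes:
        x += 1 if a > threshold else -1
    return x
```

Once the cursor can be steered reliably, selecting letters or icons from an on-screen menu reduces to parking the cursor over a target.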
Still other companies are looking to increase mental well-being using BCIs and a technique called neurofeedback. Neurofeedback involves connecting electrical impulses from the user's brain to the computer and back again, creating a feedback loop between the computer and the user; the computer interacts with the user through electrical impulses, the very root of brain function. EEG Spectrum International Inc. manufactures devices that use neurofeedback to increase mental prowess. The company sells the Mental Fitness Training Program, a neurofeedback device and software program that is supposed to help a person learn and maintain new, more efficient attention and response patterns. It also sells Peak Performance Training for Artists, proposed to optimize talent by increasing one's concentration and focus, and Peak Performance Training for Athletes, a mental fitness program that supposedly helps the athlete become more self-aware, a skill necessary for peak athletic performance. Such devices induce brain states similar to those seen on an EEG when one is learning or concentrating on a task. By artificially inducing these brain states, EEG Spectrum International hopes to provide a means of personal control over one's own mood and emotional state.
Medical uses for BCIs face a number of technological hurdles. Many BCI technologies strive to be non-invasive, as many people are uncomfortable with, and cannot afford, devices surgically implanted in the skull. All of the technologies are attempting to improve the signal-to-noise ratio (SNR) and signal-to-interference ratio (SIR), as well as to optimally combine spatial and temporal information to transmit the most accurate information possible. New research into feedback has many BCI researchers attempting to develop co-learning, or jointly adapting man-machine systems, to take advantage of feedback. Lastly, the mapping from a given task to the brain state of the user is constantly being refined.
BCIs are also being developed for entertainment. Some companies are focusing on the ability to relax and rejuvenate a person by altering their brain state. Neurosonics is working on such a device called BGM, or Brain Generated Music. Raymond Kurzweil, a director of Neurosonics, explains how BGM works: "The BGM algorithm is designed to encourage the generation of alpha waves by producing pleasurable harmonic combinations upon detection of alpha waves, and less pleasant sounds and sound combinations when alpha detection is low. In addition, the fact that the sounds are synchronized to the user's own alpha wavelength to create a resonance with the user's own alpha rhythm also encourages alpha production." BGM therefore supports relaxation through positive reinforcement of alpha waves, which are known to occur in the brain during deep relaxation.
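Reduced to its control rule, the algorithm Kurzweil describes is a simple feedback policy: consonant sound when alpha is detected, dissonant sound when it is not. A toy sketch, with the threshold and tone labels invented for illustration (the actual BGM algorithm also synchronizes the sounds to the alpha rhythm itself):

```python
def bgm_tone(alpha_power, threshold=0.6):
    """BGM's reinforcement rule, caricatured: reward strong alpha
    activity with a pleasant (consonant) tone and answer weak alpha
    with a less pleasant (dissonant) one, nudging the listener
    toward the relaxed state that produces alpha waves."""
    return "consonant" if alpha_power >= threshold else "dissonant"
```

As the listener relaxes and alpha power rises past the threshold, the soundscape turns pleasant, reinforcing the relaxed state.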
Other entertainment applications for BCIs create a unique video gaming experience. IBVA, a company developing BCIs for use with game systems, hopes to create more realistic game play by incorporating non-invasive electrode technology into current video game consoles. Drew DeVito, one of the leading researchers at IBVA, explained why BCIs are exciting for the world of gaming in an interview with the Internet Gaming Network: "There is finally a consumer product that has the power to do everything we've always wanted to do with braintracking in the home. If you're racing along in Rallisport (a racing game for the Xbox system) and your concentration is broken, our system will pick that up and kick in the handbrake to spin your control out of your hands."
Just because advanced human computer interface technology exists does not mean there is a viable commercial market for it. Among the adversities facing commercially viable human computer interfaces is a social stigma against the technology. In many of the websites, journals, and articles published about cutting-edge human computer interfaces there is a sentiment that the technology is too outlandish to be possible. Despite the overwhelming number of credible research groups, scientists, company heads, and users of the technology, the general public has a difficult time accepting that computers can be controlled with devices like brain computer interfaces, or that the mind could find a VR experience as convincing as reality. It is as if the technology were still in the realm of science fiction rather than an actual scientific breakthrough.
The same technical issues that affect medical applications of human computer interfaces affect the commercial ventures, meaning many companies marketing commercial applications are either researching the technology themselves or aligning with research facilities to improve it. This means large amounts of money are being spent simply on getting the technology to a stage where it can be commercially successful. The price of researching and building models that work with the average PC makes the cost of these devices rather prohibitive at this point. As the technology advances and research trickles down from the military and advanced research groups, the commercial applications will become more stable and affordable.
Virtual reality, BCIs for medical applications, and human computer interfaces for entertainment are all reinventing the idea of controlling a computer. Whether the end goal is to help the disabled or to entertain the public, the technology behind the controllers is fresh and exciting. There is a large audience of people who would benefit from this new round of human computer interfaces, a value that is both socially profound and commercially viable. The technology is also extremely versatile; new applications are being discovered each day, which means these interfaces could have a very large impact on society. Since the technical demands of new human computer interfaces push the capabilities of computers, the scientists involved are making breakthroughs that can apply to computer systems for any application. Imagine a world where people drive their cars using an eye tracking system, virtual reality advertisements interact with window shoppers passing by on a busy urban street, and a quadriplegic man runs one of the world's most prosperous Internet companies without ever moving a muscle. These examples are only speculation about the future of human computer interfaces. With the advent of interfaces like VR, BCIs, and biofeedback devices, there is no telling how far-reaching the impact of these technologies will be.
Bibliography
Bayliss, J.D., and Ballard, D.H. "A Virtual Reality Testbed for Brain-Computer Interface Research." IEEE Transactions on Rehabilitation Engineering. Rochester, NY, 2000.
IBVA Technologies, Inc. Darien, CT. 1997 www.ibva.com
IGN. "All Your Brainwaves Are Belong To Us." March 26, 2002 http://xbox.ign.com/articles/356/356216p1.html
Knapp, Ben, and Lusted, Hugh S. "Controlling Computers with Neural Signals." Scientific American. October 1996. http://www.biocontrol.com/
Kurzweil, Ray. The Age of Spiritual Machines: When Computers Exceed Human Intelligence. Penguin Putnam Inc, New York, NY 2000.
Bayliss, J.D., and Ballard, D.H. "The Effects of Eyetracking in a VR Helmet on EEG Recording." TR 685, University of Rochester National Resource Laboratory for the Study of Brain and Behavior. Rochester, NY, 1998.
Laboratory for the Study of Complex Human Behavior. "Proceedings of the SPIE - The International Society for Optical Engineering, Vol. 3639B, The Engineering Reality of Virtual Reality." San Jose, CA, 1999.
Neural Signals. Atlanta, GA. www.neuralsignals.com
Neurosonics. Owings Mills, MD. www.neurosonics.com
Pelz, Jeff B., Hayhoe, Mary M., and Ballard, Dana. "Development of a virtual laboratory for the study of complex human behavior." Carlson Center for Imaging Science, Rochester Institute of Technology. San Jose, CA, 1999.
Packer, Randall. Jordan, Ken. Multimedia from Wagner to Virtual Reality. W.W. Norton & Co. New York, NY 2001.
Parra, Lucas, Pearlmutter, Barak, and Tang, Akasha. "Brain Computer Interfaces." Columbia University, New York, NY, 2001. http://www.columbia.edu/~ps629/researchCAD2.htm
Rheingold, Howard. Virtual Reality. Touchstone, New York, NY, 1991.
Schalk, Gerwin, McFarland, Dennis J., and Wolpaw, Jonathan R. "Brain Computer Interfaces for Communication and Control." Laboratory of Nervous System Disorders, Albany, NY. http://newton.bme.columbia.edu/wolpaw.html
Sherwood, Jonathan. "Give it a Thought and Make it So." University of Rochester, 2000. http://www.rochester.edu/pr/releases/cs/bayliss.html
Vince, John. Essential Virtual Reality. Springer Publishing, New York, NY. 1998.