
BARRIER-FREE COMPUTING:
Enabling ALS Patients to Use PCs

September 11, 2000
A special communication device makes life easier for this ALS patient. (Hitachi)

The development of hardware and software enabling people with disabilities to operate computers is proceeding apace. One example is a piece of Internet software that allows users to receive and send e-mail and browse the World Wide Web using a single switch. This software is standard on a communication device for sufferers of ALS (amyotrophic lateral sclerosis, also known as Lou Gehrig's disease) that went on sale in July 2000. Technology enabling users to control a PC by changing their facial expression is also being developed, with production scheduled to begin in 2001. What is more, a number of corporations are cooperating in research to develop a computer system with voice recognition capabilities that can respond to the user's words and emotions. The world of computing is stepping up its efforts to become barrier free.

E-Mail and the Internet with One Switch
ALS is a progressive disorder in which the nerves in the brain and spinal cord that control voluntary muscles are destroyed, leading to a wasting of muscles throughout the body. Early symptoms include weakness in the legs and impaired speech. As the disease progresses, patients may lose the ability to walk, write, speak, or even breathe. Despite these symptoms, the brain continues to function as before, creating a communication problem: patients can understand what others say yet are unable to respond or express their own thoughts.

In 1997 the major electronics manufacturer Hitachi began selling "Den-no-shin" (Mind Connection), a communication device aimed at ALS patients. Since then it has added a range of functions in the hope of improving patients' daily lives: besides the original PC function, users can now call for assistance using pagers, operate remote controls for home-entertainment equipment, turn the pages of a book while reading, and control a home video game console. The unit thus lets ALS patients control a range of devices around them.

Responding to an upsurge in demand from patients for Internet capabilities, the company also recently set about developing new functions that let users control a standard Internet browser and a special e-mail program with a single switch. The price of the device falls within the Ministry of Health and Welfare's subsidy limit of 500,000 yen (4,545 U.S. dollars at 110 yen to the dollar) for equipment, including computers and printers, that helps disabled people in their daily lives. In principle, then, patients need not pay anything for the system.
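Hitachi has not published the internal workings of its single-switch interface, but such systems typically rely on "scanning": the software highlights each command or character in turn, and the user presses the one switch when the desired item is highlighted. The sketch below illustrates the idea in Python; the command list, timing, and switch-reading function are illustrative assumptions, not details of the actual product.

```python
import random
import time

# A minimal sketch of a single-switch "scanning" interface. The command list,
# scan interval, and simulated switch are assumptions for illustration only.
COMMANDS = ["Open inbox", "Compose e-mail", "Send", "Open browser", "Scroll down"]
SCAN_INTERVAL = 1.5  # seconds each item stays highlighted (assumed value)

def switch_pressed() -> bool:
    """Placeholder for reading the patient's single switch; simulated randomly here."""
    return random.random() < 0.02

def scan_and_select(commands):
    """Highlight each command in turn until the switch is pressed; return the choice."""
    while True:
        for item in commands:
            print(">", item)                 # highlight the current item
            deadline = time.time() + SCAN_INTERVAL
            while time.time() < deadline:
                if switch_pressed():
                    return item              # whatever is highlighted gets selected
                time.sleep(0.05)

if __name__ == "__main__":
    print("Selected:", scan_and_select(COMMANDS))
```

In practice the same loop can be nested, first selecting a row of an on-screen keyboard and then a key within it, so that whole e-mail messages can be composed with a single switch.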

Inputting Through Facial Expressions
For ALS sufferers who have lost the ability to press a switch, researchers have already developed a sensor that detects slight movements of the body electrostatically, and further advances may soon allow facial expressions, rather than mechanical switches, to serve as the input method. This "expression switch" loads processed images of the user's face, captured with a digital camera, into a computer and detects deliberate signals such as eye movements through changes in brightness in the image. The system is scheduled for production in 2001.
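The report does not describe the image processing in detail, but the principle can be sketched roughly: the brightness of the eye region changes sharply when the eye moves or blinks, and a sufficiently large change is treated as a switch press. The Python sketch below is a rough illustration under that assumption; the threshold and the simulated frames are invented for the example.

```python
import numpy as np

# A rough sketch of an "expression switch": a sharp fall in the mean brightness
# of the eye region (e.g. a deliberate blink) is treated as a switch press.
BRIGHTNESS_DROP = 25.0   # assumed threshold on the fall in mean brightness

def eye_region_brightness(frame: np.ndarray) -> float:
    """Mean grey level of the (pre-cropped) eye region of one camera frame."""
    return float(frame.mean())

def detect_switch_events(frames):
    """Return indices of frames where brightness falls sharply compared with the last frame."""
    events = []
    previous = None
    for i, frame in enumerate(frames):
        level = eye_region_brightness(frame)
        if previous is not None and previous - level > BRIGHTNESS_DROP:
            events.append(i)      # the "expression switch" fires here
        previous = level
    return events

if __name__ == "__main__":
    # Simulated 8x8 eye-region frames: open eye (bright), then a blink (dark), then open.
    open_eye = np.full((8, 8), 180.0)
    blink = np.full((8, 8), 120.0)
    print("Switch events at frames:", detect_switch_events([open_eye, open_eye, blink, open_eye]))
```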

For patients with advanced ALS who cannot move any part of their body, research and development of a "brain-activity switch" is moving forward. This switch uses near-infrared light to measure changes in cerebral blood flow caused, for example, by a person doing a sum in their head or imagining clenching a hand. Certain blood-flow patterns might indicate "yes" and others "no," enabling the user to communicate decisions to the device through thought alone.
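Exactly how such blood-flow patterns would be mapped to answers has not been specified, but a minimal sketch might compare the measured signal against a resting baseline: a sustained rise during a mental task is read as "yes," a flat signal as "no." All numbers below are illustrative assumptions.

```python
# A minimal sketch of a "brain-activity switch" decision rule. The baseline,
# threshold, and sample values are assumptions for illustration only.
BASELINE = 1.00          # relative blood-flow level at rest (assumed)
YES_THRESHOLD = 0.05     # rise above baseline taken to mean "yes" (assumed)

def classify_response(samples):
    """Average the blood-flow samples and map the change from baseline to yes/no."""
    mean_level = sum(samples) / len(samples)
    return "yes" if mean_level - BASELINE > YES_THRESHOLD else "no"

if __name__ == "__main__":
    arithmetic_task = [1.02, 1.08, 1.11, 1.09]   # simulated rise while doing a sum
    resting = [1.00, 1.01, 0.99, 1.00]           # simulated resting signal
    print(classify_response(arithmetic_task))    # -> yes
    print(classify_response(resting))            # -> no
```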

One venture firm has developed a computer with voice-recognition technology that enables it to understand both words and emotions. It works like this: the user says his or her name into a microphone, and the computer plays the words "How can I help you?" through headphones worn by the user. If the user snaps at the microphone in an angry voice, the system recognizes the emotion from the volume and tone of the voice and replies in a similarly angry tone. It will soon be able to detect happiness, sadness, and other emotions as well. Over 30 companies are currently involved in this research, and the resulting advances look set to expand the computing horizons of disabled people in all manner of ways.
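The firm's method has not been disclosed, but as a rough illustration, emotion detection from "volume and tone of voice" can be sketched with two crude acoustic measures: loudness (signal energy) and a simple proxy for pitch or tension (the zero-crossing rate). The thresholds, test signals, and replies below are illustrative assumptions, not the actual system.

```python
import numpy as np

# A rough sketch of detecting emotion from volume and tone, then matching the reply.
# Thresholds and test signals are assumptions; real systems use richer acoustic features.
LOUD_RMS = 0.2      # assumed loudness threshold
HIGH_ZCR = 0.15     # assumed zero-crossing-rate threshold (raised, tense voice)

def classify_emotion(audio: np.ndarray) -> str:
    """Label an utterance from its loudness and a crude pitch/tension proxy."""
    rms = float(np.sqrt(np.mean(audio ** 2)))                  # volume
    zcr = float(np.mean(np.abs(np.diff(np.sign(audio)))) / 2)  # tone proxy
    return "angry" if rms > LOUD_RMS and zcr > HIGH_ZCR else "calm"

def reply_for(emotion: str) -> str:
    """Pick a reply whose tone matches the detected emotion."""
    return {"angry": "All right, all right! How can I help you?",
            "calm": "How can I help you?"}[emotion]

if __name__ == "__main__":
    t = np.linspace(0, 1, 16000, endpoint=False)
    shout = 0.8 * np.sin(2 * np.pi * 2500 * t)    # loud, high-pitched test signal
    murmur = 0.05 * np.sin(2 * np.pi * 120 * t)   # quiet, low-pitched test signal
    for clip in (shout, murmur):
        emotion = classify_emotion(clip)
        print(emotion, "->", reply_for(emotion))
```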




Trends in Japan. Copyright (c) 2000 Japan Information Network. Edited by Japan Echo Inc. based on domestic Japanese news sources. Articles presented here are offered for reference purposes and do not necessarily represent the policy or views of the Japanese Government.