Scientists Developing Mind-Controlled Wheelchair

WHEELCHAIR MOVES AT THE SPEED OF THOUGHT

By Duncan Graham-Rowe

New Scientist

July 24, 2003

 

http://www.newscientist.com/news/news.jsp?id=ns99993967

 

Severely disabled people who cannot operate a motorised wheelchair may one day get their independence, thanks to a system that lets them steer a wheelchair using only their thoughts.

 

Unlike previous thought-communication devices, the system does not use surgical implants. Instead, a skullcap peppered with electrodes monitors the electrical activity of its wearer's brain. Early trials using a steerable robot indicate that, with just two days' training, controlling the robot by thought is as easy as controlling it manually.

 

" It's a very positive step, " says Paul Smith, executive director of The

Spinal Injuries Association in London. " The psychological benefits it would

offer are huge. "

 

The current options for giving freedom of movement to people who are quadriplegic are limited, says Smith. For example, it is possible to steer a wheelchair using a chin-operated joystick or by blowing into a thin tube. But both options can be exhausting, and they are not suitable for those with very limited movement.

 

So José Millán at the Dalle Molle Institute for Perceptual Artificial Intelligence in Martigny, Switzerland, along with researchers from the Swiss Federal Institute of Technology in Lausanne and the Centre for Biomedical Engineering Research in Barcelona, Spain, has come up with a system that can reliably recognise different mental states.

 

If all goes to plan, it will be the first mind-controlled system able to operate something as complicated as a wheelchair, says Millán.

 

At the moment the system controls a simple wheeled robot. The user dons the electrode-lined skullcap, which monitors electrical activity on the surface of the head. A web of wires sends the information to a computer. Millán's software then analyses the brain's activity and, using a wireless link, passes on any commands it spots to the robot.
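
The article does not include code, but the loop it describes can be sketched in a few lines of Python. Everything below is illustrative: the function names (read_eeg_window, classify, send_command) are hypothetical placeholders, not the team's software.

    import time

    # Hypothetical command set, as described in the article.
    COMMANDS = ["turn left", "turn right", "move forward"]

    def read_eeg_window():
        # Placeholder: a real system would read a buffer of samples from
        # the skullcap's electrodes via an EEG amplifier.
        return [[0.0] * 128 for _ in range(8)]   # 8 channels x 128 samples

    def classify(window):
        # Placeholder: a trained model would return 0, 1 or 2, or None
        # when no command is recognised in the window.
        return None

    def send_command(command):
        # Placeholder for the wireless link to the robot.
        print("sending:", command)

    for _ in range(20):                      # a few seconds of operation
        window = read_eeg_window()           # skullcap -> computer
        label = classify(window)             # software analyses the activity
        if label is not None:
            send_command(COMMANDS[label])    # wireless link -> robot
        time.sleep(0.25)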

 

At the moment the user can choose between three different commands: for example, "turn left", "turn right" and "move forward". Millán's software exploits the fact that the desire to move in a particular direction will generate a unique pattern of brain activity. It can tell which command the user is thinking of by spotting the telltale pattern of brain activity associated with that command.
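
One simple way to "spot the telltale pattern" is to compare an incoming feature vector against a stored template for each command; the article does not say which classifier the team actually uses, so the sketch below, with made-up template values, is purely illustrative.

    import math

    # Hypothetical averaged feature vectors recorded while the user
    # rehearsed each mental command during training.
    TEMPLATES = {
        "turn left":    [0.9, 0.1, 0.2],
        "turn right":   [0.1, 0.9, 0.2],
        "move forward": [0.2, 0.2, 0.9],
    }

    def nearest_command(features, reject_distance=0.8):
        # Return the command whose template is closest to the observed
        # features, or None if nothing is convincingly close.
        best, best_dist = None, float("inf")
        for command, template in TEMPLATES.items():
            dist = math.dist(features, template)
            if dist < best_dist:
                best, best_dist = command, dist
        return best if best_dist < reject_distance else None

    print(nearest_command([0.85, 0.15, 0.25]))   # -> "turn left"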

 

To ensure the robot does not hit any objects, it contains some inbuilt intelligence. So, when the user thinks of one of the three states -- for example, "turn left" -- the software translates it into an appropriate command for the robot, such as "turn left at the next opportunity".
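
That translation step amounts to a small lookup from the decoded mental state to a deferred instruction the robot acts on when it is safe. A minimal sketch, with a hypothetical translation table:

    # Hypothetical mapping from decoded mental states to the higher-level
    # instructions the article describes.
    TRANSLATION = {
        "turn left": "turn left at the next opportunity",
        "turn right": "turn right at the next opportunity",
        "move forward": "keep moving forward",
    }

    def translate(mental_state):
        return TRANSLATION[mental_state]

    print(translate("turn left"))   # -> turn left at the next opportunity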

 

In this case, infrared sensors allow the robot to detect walls and objects, and it will safely plod along until it reaches the next turning. And in case the software has got the command wrong, a light on the robot indicates what it is going to do, giving the user time to correct it.
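
The safety behaviour just described, announce the intended action, wait for a possible correction, then defer to the obstacle sensors, might look something like this; all names and the two-second grace period are assumptions, not details from the article.

    import time

    def execute_with_veto(command, obstacle_ahead, veto_requested, grace=2.0):
        # Show the intended action on the indicator light, then pause,
        # giving the user time to correct a misread command.
        print("indicator light:", command)
        time.sleep(grace)
        if veto_requested():
            print("command cancelled by user")
        elif obstacle_ahead():
            print("infrared sensors report an obstacle; holding position")
        else:
            print("executing:", command)

    # Callbacks stand in for the infrared sensors and the user's
    # correction signal.
    execute_with_veto("turn left at the next opportunity",
                      obstacle_ahead=lambda: False,
                      veto_requested=lambda: False)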

 

Millán's skullcap-centred system is a significant step forward. Five years ago, surgeons in Atlanta, Georgia, grabbed the headlines by implanting electrodes in the brain that allowed patients to communicate by controlling a cursor on a computer screen (New Scientist print edition, 17 October 1998, p 5).

 

But the risks associated with such invasive methods mean approaches such as Millán's that use electroencephalography (EEG), in which surface scalp electrodes monitor electrical activity, are preferable.

 

However, methods using current EEG technology are slow to recognise different mental states and can only do so by measuring the brain's alpha waves. Since this involves shutting the eyes and relaxing, it is not a practical option for people trying to control a wheelchair.
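
For context, the alpha-wave measurement such slower systems rely on is essentially the power of the 8-13 Hz component of the EEG, which rises when the eyes are shut and the subject relaxes. A rough sketch using numpy on a synthetic signal; the sample rate and band limits are common conventions, not figures from the article:

    import numpy as np

    fs = 256                                  # assumed sample rate in Hz
    t = np.arange(fs * 2) / fs                # two seconds of samples
    signal = np.sin(2 * np.pi * 10 * t)       # synthetic 10 Hz "alpha" rhythm

    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    alpha_power = spectrum[(freqs >= 8) & (freqs <= 13)].sum()
    total_power = spectrum[freqs > 0].sum()

    print(f"alpha fraction of total power: {alpha_power / total_power:.2f}")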

 

So the team has designed its own software to analyse the activity from a standard eight-electrode EEG array. It uses a neural network that can be trained to recognise complex non-alpha-wave patterns and relationships more quickly. This means Millán's system works in real time. "You can identify any pattern you think needs to be translated into a physical action immediately," he says.
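
A minimal sketch of the kind of classifier this paragraph describes: a neural network mapping features from an eight-electrode array to three mental states. The article does not specify the architecture or training method, so this version (scikit-learn, random stand-in data) is only an illustration.

    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)
    n_windows, n_electrodes, n_bands = 300, 8, 4   # 4 band-power features per channel
    X = rng.normal(size=(n_windows, n_electrodes * n_bands))
    y = rng.integers(0, 3, size=n_windows)          # 0=left, 1=right, 2=forward

    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500).fit(X, y)

    # At run time each new window is classified immediately, so commands
    # can be issued in real time rather than waiting for slow alpha-wave
    # changes.
    new_window = rng.normal(size=(1, n_electrodes * n_bands))
    print("decoded command index:", int(clf.predict(new_window)[0]))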

 

The team is now trying to increase the number of mental states that its system can recognise. "The larger the number of mental states you have, the more complicated it becomes," Millán says. "We now need to improve the learning algorithm in order to differentiate the EEG patterns."

 

Another grey area, according to Millán, is whether there will be a drop in the quality of the EEG signals when the user is actually sitting in the chair as it moves. It is possible that a person's brain activity will become a lot noisier as they take in their moving surroundings, he says.
