Wheelchair with Multimodal Interfaces
Andrea Bonarini1, Simone Ceriani1, Giulio Fontana1, and Matteo Matteucci1
Abstract— The LURCH project aims at the development of an autonomous wheelchair capable of avoiding obstacles, self-localizing, and exploring indoor environments in a safe way. To meet the requirements of disabled people, we have designed the user interface of the autonomous wheelchair so that it can be easily modified and adapted to the user's needs. In particular, the user has the opportunity to choose among several autonomy levels (from simple obstacle avoidance to fully autonomous navigation) and different interfaces: a classical joystick, a touch-screen, an electromyographic interface, and a brain-computer interface (BCI), i.e., a system that allows the user to convey intentions by analyzing brain signals.
I. INTRODUCTION
The possibility of moving in an autonomous way gives individuals a remarkable physical and psychological sense of well-being. Electric wheelchairs are usually driven by a joystick and are intended for people who are unable to apply the force necessary to move a manual wheelchair.
However, they often cannot be used by people with low vision, visual field reduction, spasticity, tremors, or cognitive deficits. To give these people a higher degree of autonomy as well, and to lighten the duties of those who assist them, a large number of solutions have been studied by researchers since the 1980s, using technologies originally developed for mobile robots to create so-called smart wheelchairs. A smart wheelchair, or autonomous wheelchair, typically consists of either a standard powered wheelchair to which a computer and a collection of sensors have been added, or a mobile robot base to which a seat has been attached.
One of the first examples of an autonomous wheelchair was proposed by [1], who equipped a wheelchair with sonars and a vision system to identify landmarks