
A Self-Driving Bus That Can Speak Sign Language

May 23, 2017

Local Motors and IBM are equipping an autonomous electric shuttle bus with technology that assists people with a range of disabilities.

It’s been 15 years since a degenerative eye disease forced Erich Manser to stop driving. Today, he commutes to his job as an accessibility consultant via commuter trains and city buses, but he sometimes has trouble locating empty seats and must ask strangers for guidance.

A step toward solving Manser’s predicament could arrive as soon as next year. Manser’s employer, IBM, and an independent carmaker called Local Motors are developing a self-driving, electric shuttle bus that combines artificial intelligence, augmented reality, and smartphone apps to serve people with vision, hearing, physical, and cognitive disabilities. The buses, dubbed “Olli,” are designed to transport people around neighborhoods at speeds below 35 miles per hour and will be sold to cities, counties, airports, companies, and universities. If the buses enter production in summer 2018, as planned, they could be among the earliest self-driving vehicles on U.S. roads.

Since Olli is fully autonomous and has no human driver, it uses IBM’s AI-powered Watson technology to converse with passengers via voice and text displayed on an iPad. Olli navigates using radar, lidar, and optical cameras from a company called Meridian Autonomous. Before a bus is deployed in a neighborhood, Meridian Autonomous constructs 3-D maps of the area that Local Motors says are accurate to the half-inch. A human fleet manager then determines the bus route. When Olli detects an emergency via its various sensors, it will stop, notify a (human) remote supervisor, and independently run through a checklist of possible problems. “If a passenger has a medical problem or [there’s a safety issue], Olli will call the authorities or drive itself to a hospital or police station,” says Gina O’Connell, a Local Motors general manager who is leading the project.

Local Motors and IBM started collaborating on Olli in early 2016 and produced a first iteration of the bus in June 2016. That vehicle is currently in trials in Germany and Switzerland. It is the second generation of Olli that will include the assistive technologies. That version, which the companies call “Accessible Olli,” will enter manufacturing in 2018; it will retain Watson as a tool for communicating with passengers and gain further Watson features.

Local Motors and IBM are still testing technologies, but they have already identified some capabilities they are likely to add. Future Ollis, for example, might direct visually impaired passengers to empty seats, using machine vision to identify open spots and audio cues plus a mobile app to guide the passenger. Olli could also guide passengers with a special type of haptic feedback that uses ultrasound to project sensations through the air. An array of haptic emitters could be designed into every seat, so that as people walk down the aisle they would feel a vibration on their hand or arm alerting them that they have reached an empty seat, explains Drew LaHart, the program director for IBM’s accessibility division.

For deaf people, the buses could employ machine vision and augmented reality to read and speak sign language via onboard screens or passengers’ smartphones. LaHart says that Olli could be trained to recognize sign language using machine learning and Watson’s image recognition capabilities. If the bus were equipped with AR technology, it might be able to respond via a hologram of a person signing.

Machine vision could also enable Olli to recognize passengers waiting at bus stops with walkers or wheelchairs. The bus would then extend an automated ramp to help them board and deploy equipment to secure their assistive devices, locking a wheelchair into place, for example.
