Perceptive Mobile Robots Working Safely Alongside Humans
Seth Teller, Prof. EECS
Description: Although we are still far from the singularity, or even Star Wars 'droids, we can anticipate robot colleagues in the near future, believes Seth Teller. He is developing "situationally aware" machines to help out humans in those "unstructured environments where we live, work and recreate."
Teller's goal is not "to solve the full AI problem," but to provide robot solutions to specific challenges. Whatever the project, the robot must successfully navigate a messy human world with appropriate sensor data, and interact with us on our terms, through speech and gestures, overcoming potential unease. "We are working with ways of creating natural interactions between humans and robots, paying attention to notions of human acceptance," says Teller.
The first venture Teller describes is an unmanned car, developed for a DARPA competition. Teller's team had to design a vehicle that could not only "see" around itself, but understand the rules and hazards of urban driving. Teller shows video of the "Urban Challenge" finals, with his car waiting patiently at an intersection for another car to pull out ("no honking or obscenities," he notes). Someday, believes Teller, such a vehicle could help reduce U.S. driving fatalities, improve gas mileage and human productivity, and even replace thousands of military ground vehicles.
Teller has also been applying the principles of autonomous mobility to logistics, in the form of an unmanned forklift for the military. Typical robotic forklifts function in indoor warehouses with smooth floors, uniform lighting, and precise maps. Teller's challenge was to come up with a device the "military could set down in a patch of earth somewhere." This forklift robot is equipped not just with laser scanners to detect fixed or moving obstacles, but with microphones, so it can stop if it hears a command or shouting. It also displays "text strings and color kinetic LEDs" to let people know where it is going.
Teller is applying this kind of machine intelligence to aid severely disabled people, with a motorized wheelchair that lets users navigate around an institutional setting, learning to map a space using verbal labels from a human trainer. A related assistive technology may offer blind people the possibility of greater independence and efficiency. Teller imagines a device that can "build up a persistent model of the wearer's surround," which could let a blind person know where she left her keys, or send out spoken or braille navigational instructions. Together, these projects point toward machine minds that can increasingly interpret human commands and needs, achieving "validation" from human supervisors and, Teller hopes, "a gradual path toward autonomy."
About the Speaker(s): Seth Teller received a Ph.D. from U.C. Berkeley in 1992, focusing on accelerated rendering of complex architectural environments. After postdoctoral fellowships at the Institute of Computer Science of the Hebrew University of Jerusalem, and Princeton University's Computer Science Department, Teller joined MIT's Department of Electrical Engineering and Computer Science, Lab for Computer Science, and Artificial Intelligence Lab in 1994. (In 2004, the two labs merged into CSAIL, MIT's Computer Science and Artificial Intelligence Laboratory.)
At CSAIL, Teller heads the Robotics, Vision, and Sensor Networks group (RVSN), where his research focuses on enabling machines to become aware of their surroundings and interact naturally with people.
Host(s): School of Engineering, Transportation@MIT