Robot self-location in unknown environments
Robots must often navigate environments that are not known in advance. In this context, self-location is the problem of determining how far, and in what direction, the robot has moved. Because of wheel slippage and other errors, odometry alone cannot provide precise or accurate positional information; triangulating from visual features can make the position estimate more accurate. This paper describes work done during a three-month student research internship at the Center for Engineering Systems Advanced Research (CESAR) of the Oak Ridge National Laboratory, exploring the problem of robot self-location in unknown environments. The work included the development and integration of a set of programs that present a partial solution to this self-location problem. These programs use a sequence of images acquired as the camera moves between positions with approximately known motion. Visual features are extracted from the images and matched over time, and triangulation on the matched features yields a rough estimate of their range from the camera. Kalman filtering (not implemented in this work) can then integrate the odometry and vision information to provide a better estimate of the robot's position. 10 refs., 13 figs., 1 tab.
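The two techniques the abstract names — triangulating a matched feature from two camera positions, and fusing an odometry estimate with a vision fix — can be illustrated with a minimal sketch. This is not the report's implementation: the linear (DLT) triangulation and the scalar Kalman update below are standard textbook formulations, and all function names, the projection matrices, and the noise variances are assumptions chosen for illustration.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one feature seen in two views.

    P1, P2 : 3x4 camera projection matrices for the two positions.
    x1, x2 : (u, v) image coordinates of the matched feature.
    Returns the estimated 3-D point, from which range follows directly.
    """
    # Each view contributes two linear constraints on the homogeneous point.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The point is the null vector of A (smallest singular vector).
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]          # homogeneous -> Euclidean

def kalman_fuse(x_odo, var_odo, z_vis, var_vis):
    """Scalar Kalman update: correct an odometry-predicted position
    with a vision-derived measurement, weighting by their variances."""
    K = var_odo / (var_odo + var_vis)    # Kalman gain
    x = x_odo + K * (z_vis - x_odo)      # corrected position estimate
    var = (1.0 - K) * var_odo            # uncertainty shrinks after fusion
    return x, var
```

As a usage sketch, two normalized cameras separated by a unit baseline along x (`P1 = [I|0]`, `P2 = [I|-t]`) recover a point's depth from its matched projections, and `kalman_fuse` then blends that vision fix with the odometry reading, trusting each in inverse proportion to its variance.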
- Research Organization:
- Oak Ridge National Lab., TN (USA)
- Sponsoring Organization:
- USDOE, Washington, DC (USA)
- DOE Contract Number:
- AC05-84OR21400
- OSTI ID:
- 6069884
- Report Number(s):
- ORNL/TM-11718; CESAR-91/03; ON: DE91010450
- Country of Publication:
- United States
- Language:
- English