Performance of visual and ultrasound sensing by an autonomous robot
This paper presents results of an experimental study of the reliability of an autonomous mobile robot operating in an unstructured environment. The study examines the principal components of the visual and ultrasound sensor systems used to guide the robot's navigation and manipulation tasks. Performance criteria are established with respect to the requirements of the integrated robotic system. Repeated measurements are made of the geometric and spatial quantities used for docking the robot at a mock-up control panel and for locating the control panel devices to be manipulated. The systematic and random components of the errors in the measured quantities are exhibited, their origins are identified, and means for their reduction are developed. We focus on refinement of visual area data using ultrasound range data, and on extraction of yaw by visual and by ultrasound methods. Monte Carlo methods are used to study the sensor fusion, and angle-dependence considerations are used to characterize the precision of the yaw measurements. Issues relating to sensor models and sensor fusion, viewed as essential strategic components of intelligent systems, are then discussed. 32 refs., 13 figs., 5 tabs.
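The abstract's Monte Carlo study of sensor fusion can be illustrated with a minimal sketch. The snippet below is not the paper's implementation; it assumes a simple scenario in which a visual range estimate (larger noise) and an ultrasound range estimate (smaller noise) of the same quantity are combined by inverse-variance weighting, and Monte Carlo sampling is used to show that the fused estimate scatters less than either sensor alone. All parameter values (`sigma_vis`, `sigma_us`, the true range) are illustrative assumptions.

```python
import random
import statistics

def fuse(z_vis, var_vis, z_us, var_us):
    """Inverse-variance (maximum-likelihood) fusion of two scalar estimates."""
    w_vis = 1.0 / var_vis
    w_us = 1.0 / var_us
    z = (w_vis * z_vis + w_us * z_us) / (w_vis + w_us)
    var = 1.0 / (w_vis + w_us)  # fused variance is smaller than either input
    return z, var

def monte_carlo_fusion(true_range=2.0, sigma_vis=0.10, sigma_us=0.03,
                       n=10000, seed=1):
    """Draw n noisy (visual, ultrasound) measurement pairs and fuse each pair."""
    random.seed(seed)
    fused = []
    for _ in range(n):
        z_vis = random.gauss(true_range, sigma_vis)  # simulated visual reading
        z_us = random.gauss(true_range, sigma_us)    # simulated ultrasound reading
        z, _ = fuse(z_vis, sigma_vis**2, z_us, sigma_us**2)
        fused.append(z)
    return statistics.mean(fused), statistics.stdev(fused)

mean, sd = monte_carlo_fusion()
print(mean, sd)  # empirical scatter is below the better sensor's sigma
```

The empirical standard deviation of the fused estimate approaches the analytic value 1/sqrt(1/sigma_vis^2 + 1/sigma_us^2), which is below the ultrasound noise alone; this is the basic mechanism by which ultrasound range data can refine visual data.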
- Research Organization:
- Oak Ridge National Lab., TN (USA)
- Sponsoring Organization:
- USDOE, Washington, DC (USA)
- DOE Contract Number:
- AC05-84OR21400
- OSTI ID:
- 6019821
- Report Number(s):
- ORNL/TM-11733; CESAR-90/54; ON: DE91010452
- Country of Publication:
- United States
- Language:
- English