In this paper a multisensor setup for localization, consisting
of a 360-degree laser range finder and a monocular vision
system, is presented. Its practicability for continuous real-time
localization during motion (referred to as on-the-fly
localization) is investigated in large-scale experiments.
scale experiments. The features in use are infinite horizon-
tal lines for the laser and vertical lines for the camera pro-
viding an extremely compact environment representation.
They are extracted using physically well-grounded models
for both sensors and passed to the Kalman filter for fusion
and position estimation. Very high localization precision is
obtained in general. The vision information was found
to further increase this precision, particularly in the
orientation, even with a moderate number of matched features.
The results were obtained with a fully autonomous system
in extensive tests covering an overall path length of more than
1.4 km and 9,500 localization cycles.
Furthermore, general aspects of multisensor on-the-fly
localization are discussed.
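As an illustration of the fusion step mentioned above, the following is a minimal sketch of a single extended Kalman filter update for a 2D robot pose using one infinite-line feature. The (alpha, r) line parameterization, the measurement model, and all numerical values are assumptions for this example, not the paper's actual formulation.

```python
import numpy as np

def ekf_line_update(pose, P, z, R, line):
    """One EKF update fusing an infinite-line feature into a 2D pose estimate.

    pose: [x, y, theta] robot pose estimate (world frame)
    P:    3x3 pose covariance
    z:    measured line [alpha_meas, r_meas] in the robot frame
    R:    2x2 measurement noise covariance
    line: (alpha, r) map line in the world frame (hypothetical parameterization:
          alpha = line normal angle, r = signed distance to the origin)
    """
    x, y, th = pose
    alpha, r = line
    # Predicted observation: the world line transformed into the robot frame.
    z_hat = np.array([alpha - th,
                      r - (x * np.cos(alpha) + y * np.sin(alpha))])
    # Jacobian of the measurement model with respect to the pose.
    H = np.array([[0.0, 0.0, -1.0],
                  [-np.cos(alpha), -np.sin(alpha), 0.0]])
    # Innovation, with the angular component wrapped to (-pi, pi].
    nu = z - z_hat
    nu[0] = (nu[0] + np.pi) % (2 * np.pi) - np.pi
    # Standard EKF gain and update equations.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    pose_new = pose + K @ nu
    P_new = (np.eye(3) - K @ H) @ P
    return pose_new, P_new
```

Note that a single line constrains only the orientation and the position component along its normal, so several non-parallel matched features are needed to fully correct the pose; the paper's reported gain in orientation precision from vision features is consistent with the angular term of such a model.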