Precise knowledge of the robot vehicle's state is essential for all perception, planning and control tasks.
Our ego-motion estimation uses a Bayesian filtering approach to obtain an optimal estimate of the vehicle's motion, position and attitude. We fuse the measurements of different sensors while using dynamic models of the vehicle's motion to predict its state.
Depending on the particular robot vehicle, we use a subset of the following sensors:
- GNSS receiver (GPS, GLONASS)
- inertial measurement unit (IMU) with integrated gyroscopes and accelerometers
- inertial navigation systems (INS), which fuse GNSS receiver and IMU data
- wheel speed sensors
- engine speed sensors
- optical and radar-based speed-over-ground sensors
- steering angle sensors
- additional sensors of the base vehicle
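
To illustrate the principle, the following is a minimal sketch of one predict/update cycle of a linear Kalman filter, a common Bayesian filter: a constant-velocity motion model predicts the state, and a GNSS position fix corrects it. The state layout, timestep, noise parameters and measurement values are illustrative assumptions, not our actual filter configuration.

```python
# Minimal sketch of one predict/update cycle of a linear Kalman filter
# fusing a constant-velocity motion model with a GNSS position fix.
# All numbers (timestep, noise levels, measurements) are illustrative only.
import numpy as np

dt = 0.1                                  # timestep [s] (assumed)

# State: [x, y, vx, vy]; constant-velocity dynamic model
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)
Q = np.diag([0.01, 0.01, 0.1, 0.1])       # process noise (assumed)

# GNSS measures position only
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)
R = np.diag([0.02**2, 0.02**2])           # ~2 cm RTK-GNSS noise (assumed)

x = np.zeros(4)                           # state estimate
P = np.eye(4)                             # estimate covariance

def predict(x, P):
    """Propagate the state with the dynamic model."""
    x = F @ x
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z):
    """Correct the prediction with a GNSS position measurement z = [x, y]."""
    y = z - H @ x                         # innovation
    S = H @ P @ H.T + R                   # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x, P

x, P = predict(x, P)
x, P = update(x, P, np.array([1.23, 4.56]))   # example GNSS fix [m]
```

In the real system, additional measurement models for wheel speeds, steering angle and the other sensors listed above are handled in the same predict/update scheme.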
For evaluation, our institute is equipped with several high-quality inertial navigation systems from Oxford Technical Solutions (OxTS) and iMAR Navigation. Using RTK-GNSS, we achieve an absolute position accuracy of 1-2 cm both at our own testing facilities and within a larger radius around the university.