In contrast to object fusion, which is widespread in industry and combines the tracking results of different sensor types, heterogeneous sensor data fusion processes not only tracks but also raw and feature-level data. Depending on the sensor type, this makes it possible to address the requirements of the application more specifically and, thanks to the greater information content, to achieve a better overall fusion result.
The aim of heterogeneous fusion is to compensate for the weaknesses of certain sensor types with the strengths of others in the overall system, which allows significantly cheaper components to be used. For example, instead of resorting to a very expensive radar with high angular resolution, fusing a low-cost camera with a standard radar can achieve comparably good results, since the radar's accurate distance estimate can be combined with the camera's accurate angle estimate, as illustrated in the sketch below.
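The following is a minimal sketch of this idea, not an implementation from the text: the radar's precise range and the camera's precise azimuth are combined by inverse-variance weighting and converted to a Cartesian position. All noise values and the function name are illustrative assumptions.

```python
import numpy as np

def fuse_range_bearing(radar_range, radar_azimuth, camera_azimuth,
                       sigma_az_radar=0.05, sigma_az_camera=0.005):
    """Fuse a radar and a camera observation of the same target.

    The radar provides an accurate range but a coarse azimuth; the camera
    provides an accurate azimuth but no direct range. The two azimuth
    estimates are combined by inverse-variance weighting, then converted
    to Cartesian coordinates together with the radar range.
    """
    # Inverse-variance (maximum-likelihood) fusion of the two azimuth estimates.
    w_radar = 1.0 / sigma_az_radar**2
    w_camera = 1.0 / sigma_az_camera**2
    azimuth = (w_radar * radar_azimuth + w_camera * camera_azimuth) / (w_radar + w_camera)
    sigma_az_fused = np.sqrt(1.0 / (w_radar + w_camera))

    # Polar -> Cartesian conversion using the fused azimuth and the radar range.
    x = radar_range * np.cos(azimuth)
    y = radar_range * np.sin(azimuth)
    return (x, y), sigma_az_fused

# Example: target at ~40 m; the fused azimuth uncertainty is dominated by the camera.
position, sigma_az = fuse_range_bearing(40.2, 0.12, 0.102)
print(position, sigma_az)
```

The fused azimuth uncertainty is close to that of the camera alone, so the resulting Cartesian position is far more precise than what the radar's coarse angular resolution would permit by itself.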
Typical tasks include:
- Selection and combination of suitable sensor types (ultrasound, radar, lidar, camera, ego-motion)
- Selection and combination of abstraction layers (raw data, features, tracks) suited to the requirements of the application
- Mathematical modeling of sensors and objects
- Development of fusion algorithms that exploit all available sensor data as comprehensively as possible to enable highly accurate object detection (see the sketch after this list)
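As one possible form such a fusion algorithm could take, the sketch below shows sequential Kalman-filter measurement updates of a single 2-D position state, where a radar-like sensor contributes a precise longitudinal observation and a camera-like sensor a precise lateral one. The measurement models and noise matrices are illustrative assumptions, not the project's actual algorithm.

```python
import numpy as np

def kalman_update(x, P, z, H, R):
    """Standard Kalman measurement update for state x with covariance P."""
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x + K @ (z - H @ x)
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# Prior estimate of the object position [x, y] in metres (assumed values).
x_est = np.array([40.0, 5.0])
P_est = np.diag([4.0, 4.0])

# Radar-like sensor: precise longitudinal (x) observation, noisy lateral (y) one.
H_radar = np.eye(2)
R_radar = np.diag([0.25, 9.0])
x_est, P_est = kalman_update(x_est, P_est, np.array([40.3, 6.5]), H_radar, R_radar)

# Camera-like sensor: precise lateral (y) observation only.
H_camera = np.array([[0.0, 1.0]])
R_camera = np.array([[0.04]])
x_est, P_est = kalman_update(x_est, P_est, np.array([5.2]), H_camera, R_camera)

print(x_est, np.diag(P_est))  # fused estimate is tighter in both axes than either sensor alone
```

After both updates, the remaining uncertainty in each axis is governed by the sensor that measures that axis best, which is precisely the complementary behaviour heterogeneous fusion aims for.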