Sensor data fusion for lateral safe applications

This paper describes the algorithms being developed for the perception layer of the PReVENT subproject LATERAL SAFE. These algorithms aim at achieving a reliable representation of the objects, and their kinematics, present in the lateral and rear field of the ego-vehicle. The work presented in this paper falls within the fields of radar tracking, sensor network processing, image and stereo vision processing, and the integration and fusion of sensor-level processed data. The perception layer of LATERAL SAFE is a distributed sensor-level fusion system that processes, at a central level, the tracks of four tracking systems: a rear-looking long-range radar, two lateral short-range radar networks, and a system of lateral and rear-looking cameras.
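To illustrate the central-level processing of sensor tracks described above, the sketch below fuses independent track estimates of the same object delivered by different sensor systems. It uses the standard information-form combination of state estimates and covariances; this is only an illustrative baseline under that assumption, not the specific fusion algorithm of the LATERAL SAFE perception layer, and the fuse_tracks helper and its state layout are hypothetical.

import numpy as np

def fuse_tracks(states, covariances):
    """Fuse independent track estimates of the same object.

    states      : list of (n,) state vectors, e.g. [x, y, vx, vy] in the ego frame
    covariances : list of (n, n) covariance matrices
    Returns the fused state and covariance (information-filter form).
    """
    info = np.zeros_like(covariances[0])      # summed information matrix
    info_state = np.zeros_like(states[0])     # summed information vector
    for x, P in zip(states, covariances):
        P_inv = np.linalg.inv(P)
        info += P_inv
        info_state += P_inv @ x
    P_fused = np.linalg.inv(info)
    x_fused = P_fused @ info_state
    return x_fused, P_fused

# Example: a radar track and a camera track assumed to refer to the same lateral object.
radar_state = np.array([2.0, -1.5, 0.3, 0.0])
radar_cov = np.diag([0.2, 0.5, 0.1, 0.2])
camera_state = np.array([2.2, -1.4, 0.2, 0.1])
camera_cov = np.diag([0.4, 0.1, 0.3, 0.3])

fused_state, fused_cov = fuse_tracks([radar_state, camera_state],
                                     [radar_cov, camera_cov])
print("fused state:", fused_state)

In a real track-to-track fusion system, this combination step would be preceded by track association and would account for cross-correlation between sensor tracks; the paper's own perception layer handles these aspects at the central fusion level.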
TNO Identifier: 221354
Source title: 13th World Congress & Exhibition on Intelligent Transport Systems and Services, 12 October 2006, ExCel London, United Kingdom.
Pages: 1-12