Temporal Depth Sensor Fusion with Multiple Modalities
Contact Persons
HyunJun Jung (hyunjun.jung@tum.de), Benjamin Busam (b.busam@tum.de), Nikolas Brasch (nikolas.brasch@tum.de)
Project Coordination: Nikolas Brasch (nikolas.brasch@tum.de)
Abstract
Time-of-Flight (ToF) sensing is one of the most common ways to estimate the depth of a scene. However, the ToF modality often suffers from artefacts such as 1) shot noise and missing measurements caused by the power limitations of the device (especially on mobile phones) and 2) depth errors caused by scattering and interference of the emitted signal. The goal of this project is to take the corrupted raw signal from the ToF camera and fuse it with other modalities, such as RGB images and IMU data, using neural networks to fill in regions of missing information and correct range errors.
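To make the fusion idea concrete, below is a minimal sketch of a learned two-stream fusion network. It is not the project's actual architecture: the layer sizes, the input layout (1-channel noisy ToF depth plus 3-channel aligned RGB; the IMU stream is omitted), and the class name DepthFusionNet are all illustrative assumptions. It only shows the general pattern of encoding each modality separately and letting a decoder regress a refined dense depth map.

import torch
import torch.nn as nn

class DepthFusionNet(nn.Module):
    """Hypothetical minimal RGB + ToF fusion network (illustrative only)."""

    def __init__(self):
        super().__init__()
        # Shallow per-modality encoders; real systems would use deeper
        # backbones and multi-scale skip connections.
        self.depth_enc = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.rgb_enc = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        # Decoder fuses the concatenated features and regresses depth.
        self.decoder = nn.Sequential(
            nn.Conv2d(64, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1),
        )

    def forward(self, noisy_depth, rgb):
        fused = torch.cat(
            [self.depth_enc(noisy_depth), self.rgb_enc(rgb)], dim=1
        )
        return self.decoder(fused)  # refined dense depth map

# Toy usage: a 240x320 ToF frame with simulated missing pixels, plus RGB.
net = DepthFusionNet()
depth = torch.rand(1, 1, 240, 320)   # corrupted ToF depth
depth[depth < 0.3] = 0.0             # zeros simulate missing measurements
rgb = torch.rand(1, 3, 240, 320)     # pixel-aligned RGB image
refined = net(depth, rgb)
print(refined.shape)                 # torch.Size([1, 1, 240, 320])

In practice the RGB stream supplies edge and texture cues that let the network hallucinate plausible depth where the ToF signal is missing, while the depth stream anchors the absolute scale; a temporal component (e.g. IMU-guided aggregation over frames) would extend this single-frame sketch.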
Keywords: 3D Computer Vision, Depth Estimation, Sensor Fusion
Research Partner
Huawei Technologies Research & Development (UK) Ltd
Location
Campus Garching, Room 03.13.060