Missing Depth Data In-Painting

Authored by: Ju Shen, Sen-ching Samson Cheung, Chen Chen, Ruixu Liu

Encyclopedia of Image Processing

Print publication date: November 2018
Online publication date: November 2018

Print ISBN: 9781482244908
eBook ISBN: 9781351032742





With the remarkable development of active depth-sensing technology, red, green, blue-depth (RGB-D) systems have been widely adopted in fields such as virtual reality and telecommunication on mobile devices. A depth image provides a depth value per pixel, which greatly facilitates three-dimensional (3D) estimation and enables applications ranging from 3D scanning to motion tracking to activity recognition. Despite this potential and popularity, the depth measurements of modern sensors are far from perfect: the acquired depth map often contains many pixels with erroneous or missing values. The uncertainty in these measurements can significantly degrade the performance of any subsequent vision processing. One possible solution is to infer or correct the measured pixel values using image in-painting techniques. However, because RGB and depth acquisition differ in nature, traditional in-painting methods often yield unsatisfactory results on depth maps. In this entry, we take the Kinect (v1) sensor as an example and introduce a probabilistic model that captures the various types of uncertainty in the depth measurement process of structured-light systems. The key idea is to exploit the correlation between the color and depth channels to classify scene objects in the depth image into layers according to their distance from the camera. These layers then guide the inference of the missing or erroneous pixels.
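To make the layer-guided idea concrete, the following is a minimal sketch, not the authors' probabilistic model: valid depths are quantized into a few distance layers, and each missing pixel borrows depth from the layer of its most color-similar valid neighbor. The function name, the equal-frequency binning, and the window-median fill rule are all illustrative assumptions.

```python
import numpy as np

def inpaint_depth(depth, color, n_layers=3, radius=2):
    """Fill missing depth pixels (value 0) using a simple layer-guided scheme.

    Sketch only: quantize valid depths into `n_layers` distance layers,
    then for each hole pick the most color-similar valid neighbor and
    fill with the median depth of that neighbor's layer in the window.
    """
    depth = depth.astype(float)
    valid = depth > 0
    # Quantize valid depths into layers via equal-frequency bin edges.
    edges = np.quantile(depth[valid], np.linspace(0.0, 1.0, n_layers + 1))
    layer = np.digitize(depth, edges[1:-1])  # layer index per pixel

    filled = depth.copy()
    h, w = depth.shape
    for y, x in zip(*np.where(~valid)):
        y0, y1 = max(0, y - radius), min(h, y + radius + 1)
        x0, x1 = max(0, x - radius), min(w, x + radius + 1)
        nb_valid = valid[y0:y1, x0:x1]
        if not nb_valid.any():
            continue  # no valid neighbor in the window; leave the hole
        # Color distance from the hole pixel to every valid neighbor.
        diff = color[y0:y1, x0:x1].astype(float) - color[y, x].astype(float)
        dist = np.linalg.norm(diff, axis=-1)
        dist[~nb_valid] = np.inf
        ny, nx = np.unravel_index(np.argmin(dist), dist.shape)
        # Fill with the median depth of that neighbor's layer in the window.
        target_layer = layer[y0 + ny, x0 + nx]
        same = nb_valid & (layer[y0:y1, x0:x1] == target_layer)
        filled[y, x] = np.median(depth[y0:y1, x0:x1][same])
    return filled
```

On a scene with a near and a far surface separated by a color boundary, this fills a hole from the surface whose color matches, rather than blurring across the depth discontinuity as a color-agnostic filter would.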
