Publication details for Professor Toby Breckon

Atapour-Abarghouei, Amir, Akcay, Samet, de La Garanderie, Grégoire Payen & Breckon, Toby P. (2019). Generative Adversarial Framework for Depth Filling via Wasserstein Metric, Cosine Transform and Domain Transfer. Pattern Recognition 91: 232-244.
- Publication type: Journal Article
- ISSN/ISBN: 0031-3203 (print)
- DOI: 10.1016/j.patcog.2019.02.010
In this work, the issue of depth filling is addressed using a self-supervised feature learning model that predicts missing depth pixel values based on the context and structure of the scene. A fully convolutional generative model is conditioned on the available depth information and the full RGB colour information from the scene, and is trained in an adversarial fashion to complete scene depth. Since ground truth depth is not readily available, synthetic data is used instead, with a separate model developed to predict where holes would appear in a sensed (non-synthetic) depth image based on the contents of the RGB image. The resulting synthetic data with realistic holes is used to train the depth filling model, which makes joint use of a reconstruction loss employing the Discrete Cosine Transform for more realistic outputs, an adversarial loss measuring distribution distances via the Wasserstein metric, and a bottleneck feature loss that aids better contextual feature extraction. Additionally, the model is adversarially adapted to perform well on naturally-obtained data with no available ground truth. Qualitative and quantitative evaluations demonstrate the efficacy of the approach compared to contemporary depth filling techniques. The strength of the feature learning capabilities of the resulting deep network model is also demonstrated by performing monocular depth estimation using the pre-trained depth hole filling model as the initialization for subsequent transfer learning.
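The two loss ingredients named in the abstract can be sketched in a few lines. The following is an illustrative, hedged sketch only, not the paper's implementation: the function names, the L1 distance in DCT space, and the toy data are assumptions; the paper's exact loss formulation, weighting, and network architecture are not reproduced here.

```python
# Illustrative sketch (not the paper's code) of two loss terms described
# in the abstract: a DCT-space reconstruction loss and a Wasserstein-style
# critic loss as used in WGAN training.
import numpy as np
from scipy.fft import dctn

def dct_reconstruction_loss(pred_depth, target_depth):
    """Mean absolute difference between the 2-D DCT coefficients of the
    predicted and ground-truth depth maps; penalising error in the
    frequency domain encourages structurally plausible completions."""
    diff = dctn(pred_depth, norm="ortho") - dctn(target_depth, norm="ortho")
    return float(np.mean(np.abs(diff)))

def wasserstein_critic_loss(critic_real, critic_fake):
    """WGAN critic objective: minimising this widens the gap between the
    critic's mean score on real depth maps and on generated ones,
    approximating the Wasserstein-1 distance between distributions."""
    return float(np.mean(critic_fake) - np.mean(critic_real))

# Toy usage with random stand-ins for depth maps (illustration only).
rng = np.random.default_rng(0)
target = rng.random((64, 64))
pred = target + 0.05 * rng.standard_normal((64, 64))
loss = dct_reconstruction_loss(pred, target)
```

In practice such terms would be combined (with tuned weights) into the joint objective the abstract describes, alongside the bottleneck feature loss.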