Durham University

Computer Science


Publication details for Professor Toby Breckon

Payen de La Garanderie, G., Atapour-Abarghouei, A. & Breckon, T.P. (2018), Eliminating the Dreaded Blind Spot: Adapting 3D Object Detection and Monocular Depth Estimation to 360° Panoramic Imagery, in Proc. European Conference on Computer Vision (ECCV), Lecture Notes in Computer Science (LNCS), Munich, Germany: Springer.

Recent automotive vision work has focused almost exclusively on processing forward-facing cameras. However, future autonomous vehicles will not be viable without more comprehensive surround sensing, akin to that of a human driver, such as can be provided by 360° panoramic cameras. We present an approach to adapt contemporary deep network architectures, developed on conventional rectilinear imagery, to work on equirectangular 360° panoramic imagery. To address the lack of annotated panoramic automotive datasets, we adapt a contemporary automotive dataset, via style and projection transformations, to facilitate the cross-domain retraining of contemporary algorithms for panoramic imagery. Following this approach, we retrain and adapt existing architectures to recover scene depth and the 3D pose of vehicles from monocular panoramic imagery without any panoramic training labels or calibration parameters. Our approach is evaluated qualitatively on crowd-sourced panoramic images and quantitatively using an automotive environment simulator, providing the first benchmark for such techniques within panoramic imagery.
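The projection transformation the abstract refers to, mapping conventional rectilinear (pinhole) imagery onto an equirectangular panorama grid, can be illustrated as follows. This is a minimal sketch of the standard inverse-mapping construction, not the authors' implementation: the function name, focal-length parameter `f`, and nearest-neighbour sampling are illustrative assumptions.

```python
import numpy as np

def rectilinear_to_equirectangular(img, f, out_w=128, out_h=64):
    """Reproject a rectilinear (pinhole) image onto an equirectangular grid.

    Illustrative sketch: for each output pixel (longitude, latitude) we cast
    a unit ray, project it through a pinhole model with focal length `f`
    (in pixels, principal point at the image centre), and sample the source
    image with nearest-neighbour lookup. Pixels whose rays point away from
    the camera or fall outside the source image are left at zero.
    """
    h, w = img.shape[:2]
    # Longitude in [-pi, pi), latitude in (-pi/2, pi/2] across the output grid.
    lon = (np.arange(out_w) / out_w - 0.5) * 2.0 * np.pi
    lat = (0.5 - np.arange(out_h) / out_h) * np.pi
    lon, lat = np.meshgrid(lon, lat)
    # Unit ray per output pixel: x right, y up, z forward.
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)
    # Only rays in front of the pinhole camera project into its image plane.
    valid = z > 1e-6
    safe_z = np.where(valid, z, 1.0)
    u = f * x / safe_z + w / 2.0
    v = -f * y / safe_z + h / 2.0
    ui = np.round(u).astype(int)
    vi = np.round(v).astype(int)
    inside = valid & (ui >= 0) & (ui < w) & (vi >= 0) & (vi < h)
    out = np.zeros((out_h, out_w) + img.shape[2:], dtype=img.dtype)
    out[inside] = img[vi[inside], ui[inside]]
    return out
```

Applying the forward direction of this mapping to a labelled rectilinear dataset (together with a style transformation to bridge the appearance gap) is what allows networks trained only on conventional imagery to be retrained for the panoramic domain.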