Guillaume Girardin, Division Director for photonics, sensing and displays at Yole Développement, presents the “From 2D to 3D: How Depth Sensing Will Shape the Future of Vision” tutorial at the May 2018 Embedded Vision Summit.
For several decades, 3D imaging and sensing technologies have matured, thanks to extensive, successful deployments in high-end applications, mainly in medical and industrial markets. More recently, as 3D sensor costs have dropped, 3D imaging and sensing devices have become a significant business, generating almost $2B in revenue in 2017, with strong growth prospects driven by consumer devices such as the iPhone X with its TrueDepth camera module.
Currently there are three main depth sensing approaches on the market: stereo vision, structured light, and time-of-flight. Each has pros and cons, both technically and with respect to cost. Girardin explains the key strengths and weaknesses of each of these technologies and analyzes the cost of adding them to a system, based on his firm’s recent teardown analyses. He also gives an overview of the depth sensing market and the main factors driving it, and explores how this market could shape the future of vision.
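To give a flavor of how two of these approaches recover depth, here is a minimal back-of-the-envelope sketch (not drawn from the talk itself; parameter values are purely illustrative). Stereo vision triangulates depth from the disparity between two camera views, while direct time-of-flight converts the round-trip travel time of emitted light into distance:

```python
SPEED_OF_LIGHT = 299_792_458.0  # meters per second


def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Stereo vision: depth Z = f * B / d.

    focal_px     -- focal length expressed in pixels
    baseline_m   -- distance between the two cameras, in meters
    disparity_px -- horizontal pixel shift of a point between the views
    """
    return focal_px * baseline_m / disparity_px


def tof_depth(round_trip_s: float) -> float:
    """Direct time-of-flight: distance = c * t / 2.

    The emitted light pulse travels to the target and back, so the
    one-way distance is half the round-trip path.
    """
    return SPEED_OF_LIGHT * round_trip_s / 2.0


# Illustrative values: a 1000-pixel focal length, 10 cm baseline,
# and 50-pixel disparity put the point 2 m away.
print(stereo_depth(focal_px=1000, baseline_m=0.1, disparity_px=50))

# A 20 ns round trip corresponds to roughly 3 m.
print(tof_depth(20e-9))
```

Structured light is omitted here because its depth recovery depends on decoding a projected pattern, which does not reduce to a one-line formula in the same way.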