
Photo – Courtesy of Gagadget website

This market research report was originally published at Yole Développement’s website. It is reprinted here with the permission of Yole Développement.

For a while, no Android player included 3D depth cameras in their flagship phones. However, during Mobile World Congress 2022, Honor unexpectedly released the Magic 4 Pro with a 3D depth camera on the front of the phone. Will 3D depth cameras return to Android phones?

Today, in this article, Yole Développement (Yole) will give you the background to 3D depth cameras in Apple and Android phones, their features, and their applications. This will give you an insight into how mobile 3D sensing will play out and will help you further understand its future developments. The battle has been raging on this front as 3D imaging in smartphones was worth $3.0B in 2021, almost half of the overall $6.7B 3D imaging and sensing market as per Yole’s latest report: 3D imaging and sensing – Technology & Market trends (2022 edition coming soon).

Mobile 3D depth sensing era

Apple started using structured light for facial recognition technology in the iPhone X in 2017, ushering in an era of 3D depth imaging in the mobile field.
This structured light system, called the "TrueDepth camera system", was able to create a specific 3D map of a face, thus enabling the Face ID feature for biometric authentication by just looking at one's iPhone. The camera system consists of an infrared image sensor, a dot projector, and a flood illuminator. So, how does this work? Every time a user glances at an iPhone X, the TrueDepth camera system first detects the face with the flood illuminator, even in the dark. Then the dot projector projects over 30,000 invisible infrared dots, and the infrared image sensor captures the reflected image. The system combines the reflected infrared image with information about the projected dot pattern and puts them through a neural network to create a mathematical model of the face. Once a 3D map of the face is securely enrolled, it serves as the reference for the Face ID feature: only that face is able to unlock the phone. As the system has been biometrically certified, it also enables new applications, like making payments via Apple Pay or other third-party apps.
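At its core, structured-light depth sensing is triangulation: each projected dot shifts sideways between its expected and observed position by an amount (the disparity) inversely proportional to depth. The sketch below illustrates that relationship only; it is not Apple's implementation, and the parameter values in the usage example are made up.

```python
def structured_light_depth(disparity_px, focal_px, baseline_m):
    """Triangulate depth from the observed shift of one projected dot.

    disparity_px: shift of the dot between the reference pattern and
                  the observed image, in pixels
    focal_px:     focal length of the IR camera, in pixels
    baseline_m:   projector-to-camera baseline, in meters
    """
    if disparity_px <= 0:
        raise ValueError("dot not displaced; depth is unresolved")
    # Similar triangles: depth = focal * baseline / disparity
    return focal_px * baseline_m / disparity_px
```

For instance, with a hypothetical 580 px focal length, a 2.5 cm baseline, and a 5 px disparity, the dot would sit 2.9 m away; smaller disparities mean greater depth, which is why accuracy degrades with distance.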

This amazing way of unlocking the phone in a contactless way caused a great sensation in the smartphone market.

Within the next year, in 2018, Android players Oppo, Huawei, and Xiaomi also launched front 3D depth cameras, using structured light technologies very similar to Apple's. However, a poor user experience combined with high cost forced these players to explore other options.

The Android camp attempted to use another 3D imaging technology, indirect Time of Flight (iToF). It was used for rear 3D depth cameras, enabling quick autofocus, image bokeh, and some highly anticipated AR games and other applications.

The hardware for this technique is more compact than structured light, requiring only a ToF sensor chip and a flood illuminator. Distance is computed from the time difference between emission and reception, which iToF infers from the phase shift of a modulated light signal. Compared to structured light, it does not need much computing power, software integration is relatively simple, and overall it has cost advantages.
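The phase-based measurement works as follows: the illuminator emits amplitude-modulated light, and the sensor correlates the return against four copies of the modulation signal shifted by 90 degrees each. The recovered phase is proportional to distance. This is a generic sketch of the standard four-bucket calculation, not any specific vendor's pipeline:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def itof_distance(c0, c1, c2, c3, f_mod_hz):
    """Distance from four correlation samples taken 90 degrees apart."""
    # Four-bucket demodulation recovers the round-trip phase shift
    phase = math.atan2(c1 - c3, c0 - c2) % (2.0 * math.pi)
    # Phase maps to distance via the modulation wavelength
    return C * phase / (4.0 * math.pi * f_mod_hz)

def unambiguous_range(f_mod_hz):
    """Beyond this distance the phase wraps and the reading aliases."""
    return C / (2.0 * f_mod_hz)
```

One design consequence: at a typical 100 MHz modulation frequency the unambiguous range is only about 1.5 m, which is one reason iToF found its niche in short-range bokeh and focus assistance rather than long-range sensing.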

LG, Samsung, and Huawei used this kind of technology in front and/or rear implementations.

Current status of 3D depth cameras in mobile

However, these products did not attract as much consumer attention as the iPhone X's Face ID did at its introduction. Meanwhile, another technology for unlocking mobile phones, the under-display optical fingerprint sensor, made significant breakthroughs. Compared to the Face ID solution it is cheaper and easier to use, and it soon dominated most of the Android market.

The 3D depth camera placed on the back of the phone was expected to be used in AR, but the AR applications were simple and AR games were very few, so it did not help to drive 3D depth camera growth. Even so, Huawei, as a leader in 3D sensing in mobile, spared no effort in promoting the solution, while other OEMs held back, waiting for consumer adoption.

At that time, the market remained optimistic, looking forward to further developments, expecting Apple to join.

Unfortunately, the ban on Huawei resulting from the conflict between China and the US left the company unable to purchase the most advanced chips, and thus unable to keep pushing ahead of its rivals in cameras. As a result, momentum for 3D depth cameras plummeted.

In 2020, Apple launched an innovative consumer LiDAR using a new kind of ToF technology called direct ToF. Compared with the previous time of flight technology (indirect ToF), it promises better performance at long distance while keeping the cost under control. However, Apple has not come up with the expected new use case or killer application alongside the new hardware, which could have driven the market even higher. So far, in mobile 3D depth sensing, Apple is pretty much alone and far ahead of the competition, as Android OEMs seem to have abandoned the space altogether.
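Direct ToF differs from indirect ToF in that it times individual light pulses (typically with single-photon avalanche diodes) rather than inferring a phase shift, so there is no phase-wrapping ambiguity and range is limited mainly by timing resolution and photon budget. A minimal sketch of the underlying arithmetic, purely for illustration:

```python
C = 299_792_458.0  # speed of light, m/s

def dtof_distance(round_trip_s):
    """Distance from a measured pulse round-trip time, in seconds."""
    # The pulse travels to the target and back, so halve the path
    return C * round_trip_s / 2.0
```

A target 5 m away returns its pulse after roughly 33 nanoseconds, which hints at why dToF needs fast, expensive timing electronics: each centimeter of range resolution corresponds to about 67 picoseconds of round-trip time.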

Will 3D depth cameras return to Android phones?

During the MWC this year, Honor unexpectedly released a 3D depth camera on the front of its flagship phone, the Magic 4 Pro.

This has attracted much attention. It is speculated that when Honor separated from Huawei in November 2020, it inherited many things from Huawei, such as technologies, expertise, and materials; it was therefore easy and natural to continue developing 3D sensing to add premium points and differentiated features to its flagship phones. As the Apple approach showed, a dedicated 3D sensing camera is a differentiating piece of hardware in ultra-premium phones. It is highly probable that a whole category of consumers, accustomed to Face ID on Apple phones, will never scale down to fingerprint unlocking. With Apple gaining market share vis-a-vis Android phones, such consumers have all but vanished from the Android camp; the very fact that Honor has started to implement this hardware is a sign it might be targeting these Apple users. Other Android OEMs may go the same route, as key enabling technologies such as under-display 3D sensing are now around the corner.

From a longer-term perspective, with all the Metaverse stories we have been hearing since Q4 2021, we can expect further developments in human-machine interaction, increasing the imaging and sensing capabilities of existing products from both a technology and an application point of view. Hence, implementing such enabling hardware in mature smartphones could be a good way to get ready for the Metaverse. Apple is leading the charge in 3D sensing; the Android OEMs may finally decide to follow again, on both the front and rear side.

Richard Liu
Technology and Market Analyst, Imaging and Display
Photonics, Sensing and Display Division, Yole Développement.
