
The Rise and Fall of the ADAS Promise Now Disrupted by AVs

This market research report was originally published at Yole Développement’s website. It is reprinted here with the permission of Yole Développement.

Advanced driver assistance systems (ADAS) have been developed for more than ten years now, in pursuit of increased safety in the world of automobiles. Combining a set of sensors, mostly radars and cameras, with powerful ECUs, the technology offered promising breakthroughs at the turn of the last decade. By 2012, several academic research papers had shown that Automated Emergency Braking (AEB) was reducing rear-end crashes by 40% and had the ability to reduce associated fatalities by 15%. This was the time the idea of “vision zero” spread globally. High technology could do something about people’s lives, and the myth of ADAS was inflated beyond hype.

Indeed, another story grew behind the ADAS acronym, and two companies epitomize the shift in defining the goals of ADAS: Mobileye and Tesla. At the time, a totally new angle on ADAS was brought to the table: autonomy. This rupture in the approach to ADAS eventually caused the split between the two companies when J. Brown died in 2016 while under “Autopilot” in a Tesla Model S using a Mobileye EyeQ3-based ADAS system. The rest is history. Ten months later, Mobileye was acquired by Intel for $15.3B, and it was then on a trajectory to power the autonomous vehicles of the world with its $50 SoCs. By 2020, for most of the industry, ADAS meant the technology that helps climb the autonomy ladder defined by SAE engineers in 2014 – despite a lack of proof that it was ever able to climb a single step. ADAS was the technology, and it was rewarded with pompous performance labels: “Level 2”, “Level 2+”, and now “Level 2++”. In terms of autonomy, the reality may lie elsewhere.

As said by Yole Développement’s (Yole) analysts in the Sensors for Robotic Mobility 2020 report: In 2020, we are more than ten years down the line in terms of the development of key ADAS features like AEB. The attachment rate has increased significantly in the industry, currently beyond 30%, and AEB will become effectively mandatory by 2022. The safety performance has not changed much, though: the US Insurance Institute for Highway Safety (IIHS), convening in June 2019, confirmed that AEB was reducing rear-end crashes by 50% but reducing associated claim rates for injuries by only 23%… In the same year, an AAA test showed major limitations in pedestrian detection in many situations, such as with children, with groups, in turns, or at night. The safety endeavor seems to have been left aside despite ten years of technology upgrades. The excuse could have been that the technology does better on the autonomy front, but unfortunately it does not. The big step everybody in the industry was waiting for was “Level 3”: the ability to take one’s eyes off the road. This level was defined as the entry into autonomy and, Tesla aside, Audi was the first company to give it a try with the 2018 A8, but gave up in 2020. This was a great lesson for the other autonomous warriors out there. Lexus is currently branding a similarly technology-loaded LS with “Level 2” status.

So how can one explain the gap between ADAS expectations and delivery? Is it the sensor set performance? Do we need more sensors, such as LiDARs and thermal cameras? Do we need more computing power? Or is it a combination of all of these?

The answer is indeed somewhere else. While many in the automotive industry claim autonomous vehicles are not for tomorrow and probably just a fantasy, more than 4,000 autonomous vehicles are currently roaming streets around the world, mostly for research and development purposes. By the end of 2019, robo-taxis had become a reality as Waymo launched its “Waymo One” service. It currently uses about one fourth of its ~600-vehicle fleet to serve real customers with a driverless taxi service, while getting ready to retrofit tens of thousands of vehicles per year once its Michigan factory opens. Waymo is not alone in this story: General Motors-owned Cruise, Lyft-backed Aptiv, Baidu’s Apollo program, Amazon-backed Zoox, Ford- and VW-backed Argo.ai, Alibaba-backed AutoX, Toyota-backed Pony.ai. More than 20 companies around the world have entered the fray and have truly autonomous vehicles on the road. This picture would not be complete without Uber and perhaps a few other minor players. The tragic death of E. Herzberg in March 2018 showed that reckless reliance on human monitoring can be deadly in the AV space. It is surely easy to have a somewhat autonomous vehicle on the road, but few companies have reached the level of confidence required for true autonomy. Waymo is probably the first of its kind, giving us a benchmark of “the right stuff” one needs for autonomy.

It all starts with a sensor set providing on the order of 10 times the data of current ADAS systems, using 2 to 3 times more sensors and 2 to 3 times more resolution. On the computing power side, we are looking at 100 times more power. While typical ADAS systems using Intel-Mobileye chips were making the leap from 0.25 TOPS (10 times a high-end laptop) to 2.5 TOPS with the new EyeQ4 chip, robotic cars were already beyond 250 TOPS. They are worlds apart. While a “Level 2” ADAS system costs in the range of US$400, robotic retrofit equipment costs around US$130,000, more than 300 times the ADAS figure. And still, people question whether such robotic setups can drive better than the average human. ADAS players could eventually climb the autonomy ladder, but it will cost more money, more time… and more innovation. As with every disruptive technology, robo-taxis are overlooked by the incumbents. Only emerging players, such as Tesla, are ready to invest massive amounts of money to develop their own FSD chip, which reaches only 70 TOPS. This is a nice Gemini capsule, but not yet an Eagle lander. Despite all the hype in ADAS, the real AV disruption is looming, and Amazon and Alibaba are probably the biggest threat to the VWs and Toyotas of this world, despite all the Teslas and XPengs bragging about their AV credentials.
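
To make the order-of-magnitude gap concrete, here is a small back-of-the-envelope calculation that simply restates the figures above; the variable names and structure are ours, for illustration only.

```python
# Back-of-the-envelope comparison of a typical "Level 2" ADAS system and a
# robotic (robo-taxi) retrofit, restating the orders of magnitude quoted above.
# All figures are indicative, not measurements.

adas = {"compute_tops": 2.5, "cost_usd": 400}           # EyeQ4-class ADAS system
robotic = {"compute_tops": 250.0, "cost_usd": 130_000}  # robo-taxi retrofit kit

compute_gap = robotic["compute_tops"] / adas["compute_tops"]  # ~100x
cost_gap = robotic["cost_usd"] / adas["cost_usd"]             # ~325x ("more than 300 times")

print(f"Compute gap: ~{compute_gap:.0f}x")
print(f"Cost gap:    ~{cost_gap:.0f}x")
```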

Even if ADAS vehicles are climbing the steps of the SAE levels, the E/E architecture used by most OEMs has not changed since the 1960s and the beginnings of automotive electrical architecture. A distributed E/E architecture using the “One ECU, One Function” principle is still in place. Over time, more sensors were implemented, enabling more functions for safety, sensing, powertrain, infotainment, or comfort, and high-end cars can embed up to 150 ECUs. The result is that adding more ECUs leads to an increasingly complex, heavy, and expensive wiring harness. To climb the steps, OEMs will have to change this architecture by creating domain-specific controllers related to the five domains already mentioned. This is what Audi started to do with the zFAS domain controller, where the forward ADAS camera, the long-range radar, and the LiDAR, among other sensors, are linked. In this controller, all the processing related to these sensors is handled by an EyeQ3 vision processor, SoCs from Altera and Nvidia, and an MCU from Infineon. With its Autopilot hardware, Tesla is doing it at another order of magnitude, with a domain controller composed of eight electronic boards and three times more components than the zFAS. Besides being able to fuse the large amount of raw data coming from the sensors, the Autopilot hardware can control audio, navigation, and RF communication, and on top of that, enable over-the-air updates. It was easier for Tesla to develop such an architecture as it was starting from a blank page, but if traditional OEMs want to enable automated driving features, they will have to move toward a more centralized E/E architecture.
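
The architectural shift can be sketched in simplified terms; the class names and structure below are hypothetical and not taken from any OEM’s software.

```python
# Illustrative sketch of the E/E architecture shift described above.
# Class names and structure are hypothetical, not from any real OEM's software.

# Distributed architecture: "One ECU, One Function".
# Each sensor feeds its own ECU, which makes its own isolated decision.
class FunctionECU:
    def __init__(self, sensor: str, function: str):
        self.sensor = sensor      # e.g. "front radar"
        self.function = function  # e.g. "adaptive cruise control"

    def run(self, raw_data) -> str:
        # Local processing only; no view of the other sensors' data.
        return f"{self.function}: decision based on {self.sensor} alone"

# Centralized architecture: a domain controller (zFAS- or Autopilot-style)
# fuses raw data from several sensors before a single driving decision is made.
class DomainController:
    def __init__(self, sensors: list[str]):
        self.sensors = sensors    # camera, long-range radar, LiDAR, ...

    def run(self, raw_frames: dict) -> str:
        fused = {name: raw_frames[name] for name in self.sensors}  # sensor fusion step
        return f"single driving decision fused from {len(fused)} sensors"

controller = DomainController(["forward camera", "long-range radar", "LiDAR"])
print(controller.run({"forward camera": ..., "long-range radar": ..., "LiDAR": ...}))
```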

According to the Sensing and Computing for ADAS Vehicle 2020 report from Yole Développement: For ADAS vehicles, sensor and computing market revenue is expected to reach US$8.6 billion in 2020. This market is driven by radars and camera modules, with US$3.8 billion and US$3.5 billion respectively, as they are essential for AEB systems. The picture is completed by computing for ADAS systems, with revenue of US$1.3 billion, and by LiDAR, which is insignificant. This market is expected to grow at a CAGR of 21% between 2020 and 2025 to reach more than US$22 billion. As the penetration rate of AEB systems is expected to continue to increase, revenues related to radar and camera modules should reach US$9.1 billion and US$8.1 billion, respectively. Computing for ADAS should reach US$3.5 billion, and revenue related to LiDAR, which will only be implemented in high-end cars, could reach US$1.7 billion.

On the robotic vehicle side, revenues associated with cameras, radars, and LiDAR are expected to reach US$145 million in 2020, with LiDAR alone generating US$107 million, followed by cameras and radars with US$27 million and US$11 million, respectively. In 2025, revenue generated by these sensors is expected to reach US$956 million, a CAGR of almost 46%. LiDAR will generate US$639 million, followed by cameras and radars with US$228 million and US$89 million, respectively. At the same time, the annual production of robotic vehicles will grow from 4,000 units today to 25,000 units, including robo-taxis and shuttles.
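
As a quick sanity check on these growth figures, the standard compound annual growth rate formula applied to the quoted 2020 and 2025 revenues recovers the stated CAGRs:

```python
# Quick check of the compound annual growth rates (CAGR) quoted above,
# using the 2020 and 2025 revenue figures from the report (5-year horizon).

def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate: (end / start) ** (1 / years) - 1."""
    return (end / start) ** (1 / years) - 1

# ADAS sensing and computing: US$8.6B (2020) -> ~US$22B (2025)
print(f"ADAS CAGR:    {cagr(8.6, 22.0, 5):.0%}")    # ~21%

# Robotic-vehicle sensors: US$145M (2020) -> US$956M (2025)
print(f"Robotic CAGR: {cagr(145.0, 956.0, 5):.0%}")  # ~46%
```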

To conclude, a lot of hope has been put on ADAS vehicles to enable automated driving. However, OEMs have now realized that making ADAS vehicles autonomous is much more complex than expected, as different types of sensors have to be integrated, huge amounts of data have to be processed, and crashes are forbidden. In the meantime, robo-taxis have become a reality, powered by tech giants able to process a huge amount of data with chips reaching more than 250 TOPS. Robo-taxis are now disrupting ADAS vehicles, and tech giants will soon enter the ADAS world. Waymo has already partnered with Fiat-Chrysler, Renault-Nissan and, more recently, Volvo, to develop robo-taxis. Other similar partnerships should follow in the near future.

Pierre Cambou
Principal Analyst, Photonics and Sensing Division, Yole Développement

Pierrick Boulay
Market and Technology Analyst, Photonics and Sensing Division, Yole Développement

 
