CES 2026: Physical AI moves from concept to system architecture

This market analysis was originally published at the Yole Group’s website. It is reprinted here with the permission of the Yole Group.

The world’s largest consumer electronics conference demonstrated the technical synergies between automotive and robotics.
At CES 2026, there was a clear cross-sector message: Physical AI is the common language across the automotive, robotaxi and humanoid robotics segments, built on a shared set of technologies, compute platforms, and design assumptions.

The term “physical AI” has emerged to encompass systems that can perceive complex, dynamic environments, reason under uncertainty, make decisions with real-world consequences, and act safely through physical embodiment. It can be applied to a car driving on a highway, a robotaxi operating in dense urban traffic, or a humanoid robot manipulating objects or interacting with humans.

Perception, long viewed as the bottleneck, is now considered broadly “good enough.” Multi-modal sensor fusion combining cameras, radar, LiDAR, and tactile sensors, together with large-scale training data and foundation perception models, enables systems to understand scenes, recognize intent, and track dynamic agents.
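To make "sensor fusion" concrete, here is a minimal sketch of one classic fusion building block: inverse-variance weighting of independent position estimates from two sensors. The sensor names, positions, and variances are illustrative assumptions, not figures from the article.

```python
import numpy as np

def fuse_estimates(measurements):
    """Fuse independent position estimates (e.g., camera and radar)
    by inverse-variance weighting: more precise sensors get more weight,
    and the fused estimate is more precise than any single input."""
    # measurements: list of (position_vector, variance) tuples
    weights = np.array([1.0 / var for _, var in measurements])
    positions = np.array([pos for pos, _ in measurements])
    fused = (weights[:, None] * positions).sum(axis=0) / weights.sum()
    fused_var = 1.0 / weights.sum()
    return fused, fused_var

# Hypothetical readings: camera has lower variance than radar here
camera = (np.array([10.2, 3.1]), 0.25)
radar = (np.array([10.0, 3.4]), 0.5)
pos, var = fuse_estimates([camera, radar])
```

Real perception stacks layer learned models and tracking on top of steps like this, but the principle that combining modalities reduces uncertainty is the same.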


In this article, Pierrick Boulay, Principal Analyst, Automotive Semiconductors at Yole Group, shares his key takeaways from the show and explains why the convergence between cars and robots is accelerating. This analysis is grounded in Yole Group’s long-standing automotive semiconductor expertise and part of a dedicated automotive collection, which delivers clear, technology-driven insight into architectures, supply chains, and competitive dynamics shaping mobility.

It also reflects Yole Group’s expanding focus on robotics: step by step, analysts are developing a core expertise in the robot domain to bridge emerging applications, especially humanoids, with the innovative semiconductor solutions that will enable them.

As the market moves from prototypes to early industrial deployments, Yole Group introduces its new report Humanoid Robots 2026, providing a valuable analysis of this growing business opportunity and the technology roadmap behind it.

Humanoids: From demos to factory floors

Building on the theme of growing AI maturity, humanoid robotics vendors demonstrated that the technology is becoming production-ready for industrial deployment, where ROI is clearer and operating environments are controlled.

Unlike consumer settings, factories offer controlled environments where humanoids can deliver immediate productivity gains, making them the first credible commercial deployment for physical AI beyond vehicles.

A broad lineup of humanoid robots included platforms from Unitree, NEURA, Fourier, LG, and others. Boston Dynamics confirmed that its Atlas is slated for real factory roles, with deployment planned at Hyundai’s EV manufacturing facility in Georgia by 2028.

The market emerging around humanoid robots is dynamic, with Yole projecting a 56% CAGR to reach more than $6 billion by 2030. As adoption accelerates at the start of the next decade, the value could soar to $51 billion by 2035.
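As a quick sanity check on these projections, the implied growth rates can be back-computed. This is a sketch under one assumption the article does not state explicitly: that the 56% CAGR runs from a 2026 base year to 2030.

```python
def cagr(start, end, years):
    """Compound annual growth rate implied by a start value,
    an end value, and the number of years between them."""
    return (end / start) ** (1.0 / years) - 1.0

# Assumed: $6B in 2030 reached at 56% CAGR over 2026-2030 (4 years)
implied_2026_base = 6.0 / (1.56 ** 4)   # implied base-year market, in $B

# Growth rate implied by $6B (2030) -> $51B (2035)
cagr_2030_2035 = cagr(6.0, 51.0, 5)
```

Under that assumption, the figures imply a market of roughly $1 billion today, and the 2030-to-2035 leap corresponds to a CAGR in the low-50% range, i.e., growth would barely decelerate through the first half of the 2030s.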

Meanwhile, consumer humanoids remain constrained by cost, reliability, energy efficiency, and safety, despite early demonstrations from the likes of LG and Hisense.

Humanoids will only reach the consumer market when prices fall as the technology matures and manufacturing scales to mass production.

The convergence of humanoid and automotive technologies is striking. Humanoid robotics is increasingly inheriting the automotive compute stack, benefiting from years of investment into safety-certified AI SoCs originally designed for ADAS and autonomy. Texas Instruments’ latest TDA5 SoC family, like Tesla’s FSD and Xpeng’s Turing AI chips, was originally designed for automotive applications but is also well suited to humanoids.

From a compute standpoint, humanoid robots and autonomous vehicles share similar requirements: multi-sensor ingestion, real-time perception and fusion, low-latency AI inference, deterministic control, and safety-critical behavior around humans. The key difference lies in actuation: steering and braking in vehicles versus the motion control of humanoid limbs and hands.
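That split, a shared ingest-perceive-infer pipeline with an embodiment-specific actuation layer, can be sketched as follows. All class and function names here are hypothetical, for illustration only; real stacks run this loop at fixed rates on safety-certified SoCs.

```python
from abc import ABC, abstractmethod

class Actuator(ABC):
    """Only the actuation layer differs between embodiments."""
    @abstractmethod
    def apply(self, command): ...

class VehicleActuator(Actuator):
    def apply(self, command):
        # A car maps the command vector to steering and braking
        return {"steering": command[0], "braking": command[1]}

class HumanoidActuator(Actuator):
    def apply(self, command):
        # A humanoid maps the same vector to joint torques
        return {"joint_torques": list(command)}

def control_step(sensor_frames, actuator):
    """One tick of the shared pipeline: ingest -> fuse -> infer -> act."""
    # Toy fusion: average corresponding channels across sensors
    fused = [sum(channel) / len(channel) for channel in zip(*sensor_frames)]
    # Toy inference: clamp commands into a safe actuation range
    command = [max(-1.0, min(1.0, x)) for x in fused]
    return actuator.apply(command)
```

The point of the sketch is that everything upstream of `Actuator.apply` is identical for both platforms, which is exactly why automotive AI SoCs transfer so readily to humanoids.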

This overlap is driving cross-sector synergies, from shared silicon platforms to Mobileye’s acquisition of Mentee Robotics announced in the closing keynote.

Automotive architecture is being rebuilt around AI

The innovations on show at CES emphasized that AI and compute platforms are now the primary design axis of vehicles, rather than powertrains, sensors, or even traditional E/E architectures.

NVIDIA’s announcements underscored this shift. The AI leader introduced an open reasoning VLA model designed to tackle long-tail autonomous driving challenges, alongside simulation tools and datasets under its open-source Alpamayo platform. Industry players including Mercedes-Benz, JLR, Lucid, and Uber, as well as AV research groups such as Berkeley DeepDrive, are using these tools to accelerate their roadmaps for reasoning-based autonomy.

Autonomy becomes an ecosystem problem

Importantly, the tone at CES reflected a more mature framing of the expected timelines for fully autonomous driving. Talk of “full autonomy soon” based on promises of universal Level 4/5 functionality has given way to a more realistic focus on Level 2+ commercialization, with carefully framed roadmaps leading to the initial rollout of Level 3 autonomy in the late 2020s.

There are clear reasons for this shift. Level 2 autonomy benefits from regulatory acceptance, immediate customer value, opportunities for monetization through optional features and subscriptions, and the generation of massive real-world datasets.

Level 3 announcements are now emerging from US OEMs indicating timelines around 2028, creating renewed opportunities for non-Chinese LiDAR suppliers, with Valeo expected to feature prominently in design wins.

CES 2026 highlighted a cautious optimism: partnerships, AI innovation, and strategic alliances may finally deliver on the long-promised vision of self-driving vehicles, but widespread adoption remains a work in progress.

Pierrick Boulay, Yole Group

At the same time, vendors normalized the idea that autonomy is too complex for vertical isolation, given the scale of data, validation and regulatory requirements involved.

The ecosystem of mobility platforms, OEMs, chipmakers, and software companies will need to work together to achieve adoption. Collaborations such as Lucid Motors teaming up with Nuro and Uber, VW with MOIA and Mobileye, and multiple partnerships with NVIDIA, reflect this reality.

Yole Group will monitor these emerging trends throughout 2026, as AI becomes scalable and system-defined, built around integrated architectures capable of reasoning and decision-making in the physical world. Follow along as analysts track the technologies shaping automotive, robotics, and Physical AI.

Explore Yole Group’s 2026 program and stay tuned for new insights, articles, and reports.


About the author

Pierrick Boulay is Principal Analyst, Automotive Semiconductors at Yole Group.

He works in the fields of solid-state lighting and lighting systems, carrying out technical, economic, and marketing analyses. In addition, he leads the automotive activities within the company.

Pierrick has authored several reports and custom analyses on topics such as automotive lighting, LiDAR, sensing for ADAS vehicles, and VCSELs.

Prior to joining Yole Group, Pierrick worked at several companies, where he developed his knowledge of lighting and automotive technologies, primarily in R&D departments on LED lighting applications.

Pierrick holds a Master of Science in Electronics from ESEO (Angers, France).

