Robotics Applications for Embedded Vision

Texas Instruments, D3 Embedded, Lattice and NVIDIA Show a Practical Radar-Camera Fusion Stack for Robotics
TI’s new application brief and companion demo outline how mmWave radar, camera input, FPGA-based sensor bridging and NVIDIA Holoscan can be combined into a low-latency perception pipeline for humanoids and other autonomous machines. Texas Instruments, D3 Embedded, Lattice Semiconductor and NVIDIA are outlining a concrete radar-camera fusion stack for robotics rather than just talking

Building Robotics Applications with Ryzen AI and ROS 2
This blog post was originally published at AMD’s website. It is reprinted here with the permission of AMD. This blog showcases how to deploy power-efficient Ryzen AI perception models with ROS 2 – the Robot Operating System. We utilize the Ryzen AI Max+ 395 (Strix Halo) platform, which is equipped with an efficient Ryzen AI NPU and

Humanoid Robots 2026-2036: Technologies, Markets, and Opportunities
Maturity of commercialization of humanoid robotics by application. For full data, refer to IDTechEx’s research on “Humanoid Robots 2026-2036: Technologies, Markets, and Opportunities” This blog post was originally published at IDTechEx’s website. It is reprinted here with the permission of IDTechEx. Automotive industry, logistics, home-use, key players and suppliers, AI chip, battery, actuator, motor, screw, tactile

2026: The Year Intelligence Gets Physical
This article was originally published at Analog Devices’ website. It is reprinted here with the permission of Analog Devices. Artificial intelligence is entering a new phase where models interpret contextual data whilst interacting with the physical world in real time. At Analog Devices, Inc. (ADI), we call this Physical Intelligence: intelligent systems that can perceive, reason

STMicroelectronics and Leopard Imaging Accelerate Robotics Vision with NVIDIA Jetson-ready Multi-sensor Module
Key Takeaways: Multimodal module combining 2D imaging, 3D depth sensing, and human-like motion perception; NVIDIA Holoscan Sensor Bridge ensuring multi-gigabit plug-and-play connectivity with Jetson platforms; fully supported by the NVIDIA Isaac open robot development platform. STMicroelectronics and Leopard Imaging® have introduced an all-in-one multimodal vision module for humanoid and other advanced robotics systems. Combining

Lattice Joins NVIDIA Halos Ecosystem to Advance Safety for Physical AI with Holoscan Sensor Bridge
HILLSBORO, Ore. – Mar. 16, 2026 – Lattice Semiconductor (NASDAQ: LSCC), the low power programmable leader, today announced it has joined the NVIDIA Halos AI Systems Inspection Lab ecosystem, the first ANSI National Accreditation Board (ANAB) accredited inspection lab for AI-driven physical systems. Announced at NVIDIA GTC 2026, Lattice will engage with NVIDIA and other Halos ecosystem members to

NVIDIA and Global Robotics Leaders Take Physical AI to the Real World
News Summary: Physical AI leaders spanning robot brain developers, industrial and surgical robot giants, and humanoid pioneers, including ABB Robotics, AGIBOT, Agility, CMR Surgical, FANUC, Figure, Hexagon Robotics, KUKA, Medtronic, Skild AI, Universal Robots, World Labs and YASKAWA, are building on NVIDIA technology to develop and deploy physical AI at scale. NVIDIA unveils new NVIDIA

NXP Delivers New Innovations for Advanced Physical AI with NVIDIA
Key Takeaways: Secure, reliable real-time data processing and transport solutions for next-generation physical AI applications, developed in collaboration with NVIDIA; NVIDIA humanoid robotics solutions integrated into NXP’s safe, secure edge portfolio to cut development costs and speed time to market; first in a series of NXP’s foundational robotics solutions designed to accelerate physical AI development and

Arduino Announces Arduino VENTUNO Q, Powered by Qualcomm Dragonwing IQ8 Series
The new platform by the leading open-source hardware provider is purpose-built for generative AI, robotics, and actuation — making advanced capabilities accessible to all. Ahead of Embedded World, Arduino announced the upcoming launch of its newest platform to democratize edge AI, Arduino® VENTUNO™ Q. Named after the Italian word for twenty-one, VENTUNO Q builds on the iconic

Conversations at the Edge with NXP
This blog post was originally published at Au-Zone’s website. It is reprinted here with the permission of Au-Zone. Are Single-Sensor Robots Obsolete? We think so, and we’re here to show you why. Au-Zone is proud to be featured in NXP Semiconductors’ Conversations at the Edge video series, a multi-part collaboration exploring innovation at the intersection of

TI Accelerates the Next Generation of Physical AI with NVIDIA
News highlights: TI and NVIDIA are collaborating to accelerate the path from simulation to the safe deployment of humanoid robots in the real world. As part of this collaboration, TI integrated its mmWave radar technology with NVIDIA Jetson Thor and NVIDIA Holoscan to enable low-latency 3D perception and safety awareness for physical AI applications. TI

Which Service Robots Will Dominate the Market in the Next 10 Years?
Logistics robots and cleaning robots both benefit from high market demand and relatively low technical barriers, compared to kitchen and restaurant robots or underwater robots. Source: Service Robots 2026-2036: Technologies, Players and Markets This blog post was originally published at IDTechEx’s website. It is reprinted here with the permission of IDTechEx. The service robotics industry has grown

CES 2026: Physical AI moves from concept to system architecture
This market analysis was originally published at the Yole Group’s website. It is reprinted here with the permission of the Yole Group. The world’s largest consumer electronics conference demonstrated the technical synergies between automotive and robotics. At CES 2026, there was a clear cross-sector message: Physical AI is the common language across the automotive, robotaxi

Into the Omniverse: OpenUSD and NVIDIA Halos Accelerate Safety for Robotaxis, Physical AI Systems
This blog post was originally published at NVIDIA’s website. It is reprinted here with the permission of NVIDIA. Editor’s note: This post is part of Into the Omniverse, a series focused on how developers, 3D practitioners and enterprises can transform their workflows using the latest advancements in OpenUSD and NVIDIA Omniverse. New NVIDIA safety

What Sensor Fusion Architecture Offers for NVIDIA Orin NX-Based Autonomous Vision Systems
This blog post was originally published at e-con Systems’ website. It is reprinted here with the permission of e-con Systems. Key Takeaways: Why multi-sensor timing drift weakens edge AI perception; how GNSS-disciplined clocks align cameras, LiDAR, radar, and IMUs; the role of Orin NX as a central timing authority for sensor fusion; operational gains from unified time-stamping
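The timing idea behind those takeaways can be sketched simply: once each sensor's clock is disciplined to a shared reference (e.g., GNSS time), frames from different sensors can be paired by nearest timestamp within a tolerance. The function names, clock offsets, sample rates, and tolerance below are illustrative assumptions, not details from the e-con Systems post.

```python
# Hypothetical sketch of unified time-stamping for sensor fusion.
# Offsets, rates, and the 4 ms tolerance are made-up illustration values.

def to_common_time(ts, offset_s):
    """Map a sensor-local timestamp to the shared (e.g., GNSS) timebase."""
    return ts + offset_s

def match_frames(cam_ts, lidar_ts, tol_s=0.005):
    """Pair each camera timestamp with the nearest LiDAR timestamp
    within tol_s seconds; camera frames with no close partner are dropped."""
    pairs = []
    for t in cam_ts:
        nearest = min(lidar_ts, key=lambda u: abs(u - t))
        if abs(nearest - t) <= tol_s:
            pairs.append((t, nearest))
    return pairs

# Camera at 30 Hz with a +2 ms clock offset, LiDAR at 10 Hz with -1 ms,
# both corrected onto the common timebase before matching.
cam = [to_common_time(i / 30.0, 0.002) for i in range(30)]
lidar = [to_common_time(i / 10.0, -0.001) for i in range(10)]
print(len(match_frames(cam, lidar, tol_s=0.004)))  # 10 matched pairs
```

Without the offset correction step, the same 4 ms tolerance would reject every pairing, which is the timing-drift failure mode the post describes.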

Production Software Meets Production Hardware: Jetson Provisioning Now Available with Avocado OS
This blog post was originally published at Peridio’s website. It is reprinted here with the permission of Peridio. The gap between robotics prototypes and production deployments has always been an infrastructure problem disguised as a hardware problem. Teams build incredible computer vision models and robotic control systems on NVIDIA Jetson developer kits, only to hit

Robotics Builders Forum offers Hardware, Know-How and Networking to Developers
On February 25, 2026 from 8:30 am to 5:30 pm ET, Advantech, Qualcomm, and Arrow, in partnership with D3 Embedded, Edge Impulse, and the Pittsburgh Robotics Network, will present Robotics Builders Forum, an in-person conference for engineers and product teams. Qualcomm and D3 Embedded are members of the Edge AI and Vision Alliance, while Edge Impulse

Faster Sensor Simulation for Robotics Training with Machine Learning Surrogates
This article was originally published at Analog Devices’ website. It is reprinted here with the permission of Analog Devices. Training robots in the physical world is slow, expensive, and difficult to scale. Roboticists developing AI policies depend on high quality data—especially for complex tasks like picking up flexible objects or navigating cluttered environments. These tasks rely
