APPLICATIONS

“Rapid Development of AI-powered Embedded Vision Solutions—Without a Team of Experts,” a Presentation from Network Optix

Marcel Wouters, Senior Software Engineer at Network Optix, presents the “Rapid Development of AI-powered Embedded Vision Solutions—Without a Team of Experts” tutorial at the May 2025 Embedded Vision Summit. In this presentation, Wouters shows how developers new to AI can quickly and easily create embedded vision solutions that extract valuable information from camera streams. He […]

Italian National Automobile Museum Exhibit Honors Legacy and Future of Autonomous Driving

This blog post was originally published at Ambarella’s website. It is reprinted here with the permission of Ambarella. Fifteen years ago, before we were acquired by Ambarella and became one of the company’s automotive R&D centers, VisLab sent four driverless vehicles from Parma to Shanghai. Traveling over 15,000 kilometers from July to October of 2010,

China’s Pivotal Role in Automotive Semiconductor Innovation

This market research report was originally published at the Yole Group’s website. It is reprinted here with the permission of the Yole Group. From the Yole Group’s Automotive White Paper, Vol. 1. Key takeaways: the formula for automotive success is speed, cost, and semiconductor focus, including semiconductor supply and vertical integration. In almost all

How Embedded Vision Is Shaping the Next Generation of Autonomous Mobile Robots

This blog post was originally published at e-con Systems’ website. It is reprinted here with the permission of e-con Systems. Autonomous Mobile Robots (AMRs) are being deployed across industries, from warehouses and hospitals to logistics and retail, thanks to embedded vision systems. See how cameras are integrated into AMRs so that they can quickly and

NAMUGA Unveils Stella-2: Compact, Solid-state Lidar Powered by Lumotive, at Embedded Vision Summit

REDMOND, Wash., May 21, 2025 /PRNewswire/ — Lumotive, a leader in programmable optical semiconductor technology, today announced that NAMUGA Co., Ltd., a leading manufacturer of advanced camera modules, will debut its first solid-state 3D lidar sensor—Stella-2, powered by Lumotive’s Light Control Metasurface (LCM) technology—at the upcoming Embedded Vision Summit in California. Stella-2 brings software-defined intelligence,

e-con Systems Showcased New Lattice FPGA-based Holoscan Camera Solutions at Computex Taipei and Embedded Vision Summit 2025

California & Chennai (May 22, 2025): e-con Systems, a leading provider of camera solutions for embedded vision applications, successfully showcased its new Holoscan camera solution based on the low power Lattice FPGA technology for NVIDIA® platforms at two major industry events in 2025: Computex Taipei and the Embedded Vision Summit (EVS) in Santa Clara, CA,

Key Drone Terminology: A Quick Guide for Beginners

This blog post was originally published at Namuga Vision Connectivity’s website. It is reprinted here with the permission of Namuga Vision Connectivity. As drone technology becomes more accessible and widespread, it’s important to get familiar with the basic terms that define how drones work and how we control them. Whether you’re a hobbyist, a content

Human-like Robots and Flexible Movement

Humanoid and collaborative robots are two growing robotics technologies explored by IDTechEx in its portfolio of Robotics & Autonomy Research Reports. Senior Technology Analyst Yulin Wang will be speaking at the 2025 Humanoids Summit, covering details explored in depth in the IDTechEx report, “Humanoid Robots 2025-2035: Technologies, Markets and Opportunities”. The energy efficiency of humanoid

In-cabin Sensor Advancements: Radar or 3D Cameras?

A comparison of 3D ToF cameras and radars. In-cabin sensing technologies are reshaping the automotive landscape, driven by the need for enhanced safety, regulatory compliance, and personalized user experiences. In 2025, Tesla, Seeing Machines, and LG Electronics introduced groundbreaking innovations in this space, leveraging radar, 3D cameras, and AI to redefine vehicle interiors. This article provides

R²D²: Unlocking Robotic Assembly and Contact Rich Manipulation with NVIDIA Research

This blog post was originally published at NVIDIA’s website. It is reprinted here with the permission of NVIDIA. This edition of NVIDIA Robotics Research and Development Digest (R2D2) explores several contact-rich manipulation workflows for robotic assembly tasks from NVIDIA Research and how they can address key challenges with fixed automation, such as robustness, adaptability, and

Here you’ll find a wealth of practical technical insights and expert advice to help you bring AI and visual intelligence into your products without flying blind.

Contact

Address

Berkeley Design Technology, Inc.
PO Box #4446
Walnut Creek, CA 94596

Phone

+1 (925) 954-1411