Object Tracking


Microchip Technology Demonstration of Real-time Object and Facial Recognition with Edge AI Platforms

Swapna Guramani, Applications Engineer for Microchip Technology, demonstrates the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Guramani demonstrates her company’s latest AI/ML capabilities in action: real-time object recognition using the SAMA7G54 32-bit MPU running Edge Impulse’s FOMO model, and facial recognition powered by TensorFlow Lite’s Mobile

Read More »

VeriSilicon Demonstration of a Partner Application in the iEVCam

Halim Theny, VP of Product Engineering at VeriSilicon, demonstrates the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Theny demonstrates a customer’s SoC, developed in collaboration with a premier camera vendor, which uses an event-based sensor to detect motion, with processing handled by VeriSilicon’s NPU and vision DSP. This

Read More »

Considerate Cars: Making Calls for Coffee and Keeping Drivers Alert

Coffee ready to collect as the car decides to pull over and charge could become the norm in the future, as software-defined vehicle technology and in-vehicle AI advance. IDTechEx’s Robotics & Autonomy and Semiconductors, Computing & AI research report portfolios cover passenger safety and increased comfort, while the research on Electric

Read More »

How Do Surround-view Cameras Improve Driving and Parking Safety?

This blog post was originally published at e-con Systems’ website. It is reprinted here with the permission of e-con Systems. As vehicles become more complex, their need for accurate imaging has increased. This has driven the adoption of surround-view cameras. They give drivers a complete, real-time, 360-degree view of the vehicle, thereby improving situational awareness

Read More »

Synopsys Demonstration of SiEngine’s AD1000 ADAS Chip, Powered by Synopsys NPX6 NPU IP

Gordon Cooper, Principal Product Manager at Synopsys, demonstrates the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Cooper demonstrates the powerful SiEngine AD1000 NPU, which features Synopsys NPX6 NPU IP, along with the robust toolchain, including a debugger, profiler, and simulator. Learn how the platform supports TensorFlow, ONNX, and

Read More »

Sony Semiconductor Demonstration of On-sensor YOLO Inference with the Sony IMX500 and Raspberry Pi

Amir Servi, Edge AI Product Manager at Sony Semiconductor, demonstrates the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Servi demonstrates the IMX500 — the first vision sensor with integrated edge AI processing capabilities. Using the Raspberry Pi AI Camera and Ultralytics YOLOv11n models, Servi showcases real-time

Read More »

Namuga Vision Connectivity Demonstration of Compact Solid-state LiDAR for Automotive and Robotics Applications

Min Lee, Business Development Team Leader at Namuga Vision Connectivity, demonstrates the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Lee demonstrates a compact solid-state LiDAR solution tailored for the automotive and robotics industries. This solid-state LiDAR features high precision, fast response time, and no moving parts—ideal for

Read More »

Namuga Vision Connectivity Demonstration of an AI-powered Total Camera System for an Automotive Bus Solution

Min Lee, Business Development Team Leader at Namuga Vision Connectivity, demonstrates the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Lee demonstrates his company’s AI-powered total camera system. The system is designed for integration into public transportation, especially buses, enhancing safety and automation. It includes front-view, side-view,

Read More »

Namuga Vision Connectivity Demonstration of a Real-time Eye-tracking Camera Solution with a Glasses-free 3D Display

Min Lee, Business Development Team Leader at Namuga Vision Connectivity, demonstrates the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Lee demonstrates a real-time eye-tracking camera solution that accurately detects the viewer’s eye position and angle. This data enables a glasses-free 3D display experience using an advanced

Read More »

Teledyne FLIR Demonstration of an Advanced Thermal Imaging Camera Enabling Automotive Safety Improvements

Ethan Franz, Senior Software Engineer at Teledyne FLIR, demonstrates the company’s latest edge AI and vision technologies and products in Lattice Semiconductor’s booth at the 2025 Embedded Vision Summit. Specifically, Franz demonstrates a state-of-the-art thermal imaging camera for automotive safety applications, designed using Lattice FPGAs. This next-generation camera, also incorporating Teledyne FLIR’s advanced sensing technology,

Read More »

Lattice Semiconductor Demonstration of Random Bin Picking Based on Structured-Light 3D Scanning

Mark Hoopes, Senior Director of Industrial and Automotive at Lattice Semiconductor, demonstrates the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Hoopes demonstrates how Lattice FPGAs increase performance, reduce latency and jitter, and reduce overall power for a random bin picking robot. The Lattice FPGA offloads the

Read More »

Stereo ace for Precise 3D Images Even with Challenging Surfaces

The new high-resolution Basler Stereo ace complements Basler’s 3D product range with an easy-to-integrate series of active stereo cameras that are particularly suitable for logistics and factory automation. Ahrensburg, July 10, 2025 – Basler AG introduces the new active 3D stereo camera series Basler Stereo ace, consisting of six camera models, thereby strengthening its position as

Read More »

Cadence Demonstration of Waveguide 4D Radar Central Computing on a Tensilica Vision DSP-based Platform

Sriram Kalluri, Product Marketing Manager for Cadence Tensilica DSPs, demonstrates the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Kalluri demonstrates the use of the Tensilica Vision 130 (P6) DSP for advanced 4D radar computing for perception sensing used in ADAS applications. The Vision 130 DSP is

Read More »

Achieving High-speed Automatic Emergency Braking with AI-driven 4D Imaging Radar

This blog post was originally published at Ambarella’s website. It is reprinted here with the permission of Ambarella. Across the globe, regulators are accelerating efforts to make roads safer through the widespread adoption of Automatic Emergency Braking (AEB). In the United States, the National Highway Traffic Safety Administration (NHTSA) implemented a sweeping regulation that requires

Read More »

Network Optix Demonstration of How the Company is Powering Scalable Data-driven Video Infrastructure

Tagir Gadelshin, Director of Product at Network Optix, demonstrates the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Gadelshin demonstrates how the company’s latest release, Gen 6 Enterprise, is enabling cloud-powered, event-driven video infrastructure for enterprise organizations at scale. Built on Nx EVOS, Gen 6 Enterprise supports

Read More »

Here you’ll find a wealth of practical technical insights and expert advice to help you bring AI and visual intelligence into your products without flying blind.

Contact

Address

Berkeley Design Technology, Inc.
PO Box #4446
Walnut Creek, CA 94596

Phone
+1 (925) 954-1411