Processors

Nextchip Demonstration of Various Computing Applications Using the Company’s ADAS SoC

Jonathan Lee, Manager of the Global Strategy Team at Nextchip, demonstrates the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Lee demonstrates various computing applications using his company’s ADAS SoC. Lee showcases how Nextchip’s ADAS SoC can be used for applications such as: Sensor fusion of iToF […]

Microchip Technology Demonstration of Real-time Object and Facial Recognition with Edge AI Platforms

Swapna Guramani, Applications Engineer for Microchip Technology, demonstrates the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Guramani demonstrates her company’s latest AI/ML capabilities in action: real-time object recognition using the SAMA7G54 32-bit MPU running Edge Impulse’s FOMO model, and facial recognition powered by TensorFlow Lite’s Mobile […]

Microchip Technology Demonstration of AI-powered Face ID on the PolarFire SoC FPGA Using the VectorBlox SDK

Avery Williams, Channel Marketing Manager for Microchip Technology, demonstrates the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Williams demonstrates ultra-efficient AI-powered facial recognition on Microchip’s PolarFire SoC FPGA using the VectorBlox Accelerator SDK. Pre-trained neural networks are quantized to INT8 and compiled to run directly on […]
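The INT8 step mentioned above can be sketched in a few lines. This is a generic symmetric per-tensor quantization scheme for illustration only; the function names are hypothetical, and the actual VectorBlox SDK flow and quantization details are not shown here.

```python
import numpy as np

def quantize_int8(weights):
    """Symmetric per-tensor quantization: map float weights to int8
    using a single scale factor derived from the largest magnitude."""
    scale = float(np.max(np.abs(weights))) / 127.0
    q = np.clip(np.round(weights / scale), -128, 127).astype(np.int8)
    return q, scale

def dequantize_int8(q, scale):
    """Recover approximate float weights from the int8 tensor."""
    return q.astype(np.float32) * scale

w = np.array([0.5, -1.27, 0.02, 1.0], dtype=np.float32)
q, s = quantize_int8(w)        # q is int8, s is the float scale
w_hat = dequantize_int8(q, s)  # close to w, within one quantum
```

Running a network in INT8 trades a small amount of accuracy (at most half a quantization step per weight here) for the much cheaper integer multiply-accumulates that FPGA and NPU fabrics provide.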

Chips&Media Demonstration of Its WAVE-N NPU in High-quality, High-resolution Imaging Applications

Andy Lee, Vice President of U.S. Marketing at Chips&Media, demonstrates the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Lee demonstrates example high-quality, high-resolution imaging applications in edge devices implemented using WAVE-N, his company’s custom NPU. Key notes: Extreme efficiency, up to 90% of MAC utilization, for […]

How to Think About Large Language Models on the Edge

This blog post was originally published at BrainChip’s website. It is reprinted here with the permission of BrainChip. ChatGPT was released to the public on November 30th, 2022, and the world – at least, the connected world – has not been the same since. Surprisingly, almost three years later, despite massive adoption, we do not […]

Chips&Media Introduction to Its WAVE-N Specialized Video Processing NPU IP

Andy Lee, Vice President of U.S. Marketing at Chips&Media, demonstrates the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Lee introduces WAVE-N, his company’s specialized video processing NPU IP. Key notes: Extreme efficiency, up to 90% of MAC utilization, for modern CNN computation Highly optimized for real-time […]
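To make the utilization figure concrete: MAC utilization is the ratio of the multiply-accumulates a layer actually requires to the number the MAC array could have executed in the cycles the layer took. The numbers below are hypothetical and only illustrate the metric, not WAVE-N’s internals.

```python
def mac_utilization(required_macs, cycles_taken, peak_macs_per_cycle):
    """Fraction of the array's peak MAC throughput actually used."""
    return required_macs / (cycles_taken * peak_macs_per_cycle)

# Hypothetical example: a conv layer needing 1,843,200 MACs finishes
# in 2,000 cycles on an array that can issue 1,024 MACs per cycle.
u = mac_utilization(1_843_200, 2_000, 1_024)  # -> 0.9, i.e. 90%
```

Higher utilization means fewer idle MAC lanes per cycle, which translates directly into lower energy and latency per inference.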

One Year of Qualcomm AI Hub: Enabling Developers and Driving the Future of AI

This blog post was originally published at Qualcomm’s website. It is reprinted here with the permission of Qualcomm. The past year has been an incredible journey for Qualcomm AI Hub. We’ve seen remarkable growth, innovation and momentum — and we’re only getting started. Qualcomm AI Hub has become a key resource for developers looking to […]

VeriSilicon Demonstration of a Partner Application in the iEVCam

Halim Theny, VP of Product Engineering at VeriSilicon, demonstrates the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Theny demonstrates a customer’s SoC, featuring a collaboration with a premier camera vendor using an event-based sensor to detect motion, and processed by VeriSilicon’s NPU and vision DSP. This […]

Embedded Vision Kit from Vision Components and Phytec: Plug-and-play with i.MX 8M Plus / 8M Mini and 50+ MIPI Cameras

Mainz / Ettlingen, July 23, 2025 – The phyBOARD for VC MIPI Development Kit from Phytec, in collaboration with Vision Components, is now available in two versions with NXP i.MX 8M Plus and i.MX 8M Mini processor boards. Both versions provide plug-and-play support for Vision Components’ more than 50 VC MIPI Cameras. The two embedded vision […]

VeriSilicon Demonstration of the Open Se Cura Project

Chris Wang, VP of Multimedia Technologies and a member of the CTO office at VeriSilicon, demonstrates the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Wang demonstrates examples from the Open Se Cura Project, a joint effort between VeriSilicon and Google. The project showcases a scalable, power-efficient, and […]

Here you’ll find a wealth of practical technical insights and expert advice to help you bring AI and visual intelligence into your products without flying blind.
