2025 Embedded Vision Summit

VeriSilicon Demonstration of a Partner Application in the iEVCam

Halim Theny, VP of Product Engineering at VeriSilicon, demonstrates the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Theny demonstrates a customer’s SoC, developed in collaboration with a premier camera vendor, that uses an event-based sensor to detect motion, with the event stream processed by VeriSilicon’s NPU and vision DSP. […]
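Event-based sensors report sparse per-pixel brightness changes rather than full frames, so motion appears as clusters of events. As a generic illustration of that idea only, and not VeriSilicon’s NPU/DSP pipeline, the sketch below bins a hypothetical event batch into a count map and thresholds it; the event layout, sensor resolution, and threshold are all assumptions.

```python
import numpy as np

# Hypothetical event batch: one row per event, columns = (x, y, polarity, timestamp_us).
# Real event cameras deliver a similar stream, but the exact layout varies by vendor.
events = np.array([
    [10, 12, 1, 1000],
    [11, 12, -1, 1010],
    [10, 13, 1, 1020],
    [10, 12, 1, 1030],
], dtype=np.int64)

WIDTH, HEIGHT = 64, 48          # assumed sensor resolution
ACTIVITY_THRESHOLD = 2          # events per pixel before we call it "motion"

def accumulate_events(events: np.ndarray, width: int, height: int) -> np.ndarray:
    """Bin events into a per-pixel count map (polarity is ignored here)."""
    frame = np.zeros((height, width), dtype=np.int32)
    np.add.at(frame, (events[:, 1], events[:, 0]), 1)
    return frame

def detect_motion(frame: np.ndarray, threshold: int) -> np.ndarray:
    """Return a boolean mask of pixels whose event count reaches the threshold."""
    return frame >= threshold

frame = accumulate_events(events, WIDTH, HEIGHT)
mask = detect_motion(frame, ACTIVITY_THRESHOLD)
print(f"motion pixels: {int(mask.sum())}")
```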

VeriSilicon Demonstration of the Open Se Cura Project

Chris Wang, VP of Multimedia Technologies and a member of the CTO office at VeriSilicon, demonstrates the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Wang demonstrates examples from the Open Se Cura Project, a joint effort between VeriSilicon and Google. The project showcases a scalable, power-efficient, and […]

VeriSilicon Demonstration of Advanced AR/VR Glasses Solutions

Dr. Mahadev Kolluru, Senior VP of North America and India Business at VeriSilicon, demonstrates the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Dr. Kolluru demonstrates products developed using the company’s IP licensing and turnkey silicon design services. Dr. Kolluru highlights how VeriSilicon supports cutting-edge AI and […]

Synopsys Demonstration of SiEngine’s AD1000 ADAS Chip, Powered by Synopsys NPX6 NPU IP

Gordon Cooper, Principal Product Manager at Synopsys, demonstrates the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Cooper demonstrates the powerful SiEngine AD1000 ADAS chip, which features Synopsys NPX6 NPU IP, along with its robust toolchain, including a debugger, profiler, and simulator. Learn how the platform supports TensorFlow, ONNX, and […]
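As a point of reference for the framework support mentioned above, here is a minimal sketch of the kind of ONNX artifact an NPU compiler toolchain typically ingests. It uses a placeholder PyTorch model and the standard torch.onnx exporter; it is not Synopsys’ own tool flow, and the model and file names are made up.

```python
import torch
import torch.nn as nn

# Placeholder network standing in for a real vision model.
model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(8, 10),
).eval()

dummy_input = torch.randn(1, 3, 224, 224)

# Export to ONNX; an NPU toolchain would then compile this graph for its target.
torch.onnx.export(
    model,
    dummy_input,
    "toy_model.onnx",          # placeholder output path
    input_names=["input"],
    output_names=["logits"],
    opset_version=17,
)
```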

Synopsys and Visionary.ai Demonstration of a Low-light Real-time AI Video Denoiser Tailored for NPX6 NPU IP

Gordon Cooper, Principal Product Manager at Synopsys, and David Jarmon, Senior VP of Worldwide Sales at Visionary.ai, demonstrate the companies’ latest edge AI and vision technologies and products in Synopsys’ booth at the 2025 Embedded Vision Summit. Specifically, Cooper and Jarmon demonstrate the future of low-light imaging with Visionary.ai’s cutting-edge real-time AI video denoiser. […]

Synopsys Demonstration of Smart Architectural Exploration for AI SoCs

Guy Ben Haim, Senior Product Manager, and Gururaj Rao, Field Applications Engineer, both of Synopsys, demonstrate the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Ben Haim and Rao demonstrate how to optimize neural network performance with the Synopsys ARC MetaWare MX Development Toolkit. […]

SqueezeBits Demonstration of On-device LLM Inference, Running a 2.4B Parameter Model on the iPhone 14 Pro

Taesu Kim, CTO of SqueezeBits, demonstrates the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Kim demonstrates a 2.4-billion-parameter large language model (LLM) running entirely on an iPhone 14 Pro without server connectivity. The device operates in airplane mode, highlighting on-device inference using a hybrid approach that […]
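Some back-of-the-envelope arithmetic helps explain why low-bit quantization is central to fitting a model of this size on a phone. The sketch below gives weight-only estimates at assumed precisions; these are not SqueezeBits’ reported figures, and activations and KV cache are ignored.

```python
# Weight-only memory estimates for a 2.4B-parameter LLM at common precisions.
PARAMS = 2.4e9

for name, bits in [("FP16", 16), ("INT8", 8), ("INT4", 4)]:
    gib = PARAMS * bits / 8 / 2**30
    print(f"{name}: {gib:.2f} GiB")

# FP16: ~4.47 GiB, INT8: ~2.24 GiB, INT4: ~1.12 GiB -- only the low-bit
# variants leave headroom within the iPhone 14 Pro's 6 GB of RAM.
```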

Sony Semiconductor Demonstration of AI Vision Devices and Tools for Industrial Use Cases

Zachary Li, Product and Business Development Manager at Sony America, demonstrates the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Li demonstrates his company’s AITRIOS products and ecosystem. Powered by the IMX500 intelligent vision sensor, Sony AITRIOS collaborates with Raspberry Pi for development kits and with leading […]

Sony Semiconductor Demonstration of Its Open-source Edge AI Stack with the IMX500 Intelligent Sensor

JF Joly, Product Manager for the AITRIOS platform at Sony Semiconductor, demonstrates the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Joly demonstrates Sony’s fully open-source software stack that enables the creation of AI-powered cameras using the IMX500 intelligent vision sensor. […]
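One open-source route to an IMX500-based camera is the IMX500 helper bundled with Raspberry Pi’s picamera2 library. The minimal sketch below follows the pattern of the public picamera2 IMX500 examples; the model path is a placeholder and method names may vary across library versions, so treat it as an assumption-laden outline rather than the exact stack shown in this demo.

```python
from picamera2 import Picamera2
from picamera2.devices import IMX500  # IMX500 helper bundled with picamera2

# Pre-compiled network package for the IMX500 (placeholder path).
imx500 = IMX500("/usr/share/imx500-models/imx500_network_ssd_mobilenetv2.rpk")

# Open the camera attached to the IMX500 and start streaming.
picam2 = Picamera2(imx500.camera_num)
picam2.start(picam2.create_preview_configuration())

# Inference runs on the sensor itself; results come back as frame metadata.
metadata = picam2.capture_metadata()
outputs = imx500.get_outputs(metadata)
print(outputs)
```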

Sony Semiconductor Demonstration of On-sensor YOLO Inference with the Sony IMX500 and Raspberry Pi

Amir Servi, Edge AI Product Manager at Sony Semiconductor, demonstrates the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Servi demonstrates the IMX500, the first vision sensor with integrated edge AI processing capabilities. Using the Raspberry Pi AI Camera and Ultralytics YOLOv11n models, Servi showcases real-time […]
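For a feel for the model side of this demo, here is a minimal Ultralytics sketch that loads a small pretrained YOLO checkpoint and runs host-side inference on a placeholder image. Deploying the same network on-sensor additionally requires exporting and compiling it for the IMX500 with Sony’s tooling, which is not shown here.

```python
from ultralytics import YOLO

# Load a small pretrained detection model (checkpoint name per Ultralytics releases).
model = YOLO("yolo11n.pt")

# Run inference on a placeholder image; on the Raspberry Pi AI Camera the same
# network would instead be compiled to run on the IMX500 itself.
results = model("bus.jpg")

# Print detected classes and confidences.
for result in results:
    for box in result.boxes:
        cls_name = model.names[int(box.cls)]
        conf = float(box.conf)
        print(f"{cls_name}: {conf:.2f}")
```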
