Synopsys and Visionary.ai Demonstration of a Low-light Real-time AI Video Denoiser Tailored for NPX6 NPU IP

Gordon Cooper, Principal Product Manager at Synopsys, and David Jarmon, Senior VP of Worldwide Sales at Visionary.ai, demonstrate the companies’ latest edge AI and vision technologies and products in Synopsys’ booth at the 2025 Embedded Vision Summit. Specifically, Cooper and Jarmon demonstrate the future of low-light imaging with Visionary.ai’s cutting-edge real-time AI video denoiser. This […]

Generative AI at the Core of a $372 Billion Data Center Processor Revolution

This market research report was originally published at the Yole Group’s website. It is reprinted here with the permission of the Yole Group. In its 2025 report on Generative AI – Computing & AI for Data Centers, Yole Group’s analysts offer deep insight into the strategic shift toward specialized AI infrastructure. Key takeaways: global data

Synopsys Demonstration of Smart Architectural Exploration for AI SoCs

Guy Ben Haim, Senior Product Manager, and Gururaj Rao, Field Applications Engineer, both of Synopsys, demonstrate the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Ben Haim and Rao demonstrate how to optimize neural network performance with the Synopsys ARC MetaWare MX Development Toolkit. Ben Haim and

SqueezeBits Demonstration of On-device LLM Inference, Running a 2.4B Parameter Model on the iPhone 14 Pro

Taesu Kim, CTO of SqueezeBits, demonstrates the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Kim demonstrates a 2.4-billion-parameter large language model (LLM) running entirely on an iPhone 14 Pro without server connectivity. The device operates in airplane mode, highlighting on-device inference using a hybrid approach that

Synthetic Data for Computer Vision

This article was originally published at Synetic AI’s website. It is reprinted here with the permission of Synetic AI. Synthetic data is changing how computer vision models are being trained. This page will explain synthetic data and how it compares to traditional approaches. After exploring the main methods of creating synthetic data, we’ll help you

Sony Semiconductor Demonstration of AI Vision Devices and Tools for Industrial Use Cases

Zachary Li, Product and Business Development Manager at Sony America, demonstrates the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Li demonstrates his company’s AITRIOS products and ecosystem. Powered by the IMX500 intelligent vision sensor, Sony AITRIOS collaborates with Raspberry Pi for development kits and with leading

Sony Semiconductor Demonstration of Its Open-source Edge AI Stack with the IMX500 Intelligent Sensor

JF Joly, Product Manager for the AITRIOS platform at Sony Semiconductor, demonstrates the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Joly demonstrates Sony’s fully open-source software stack that enables the creation of AI-powered cameras using the IMX500 intelligent vision sensor. In this demo, Joly illustrates how

First Sensor Module for High-resolution Image Sensor IMX811 (247 Mpixel) Available Soon

Embedded vision engineers and tech leads can start preparing for high-resolution applications. Munich – July 17, 2025 – FRAMOS will soon release the first image sensor module and development kit on the market for Sony’s new high-resolution IMX811 image

Edge AI Today: Real-world Use Cases for Developers

This blog post was originally published at Qualcomm’s website. It is reprinted here with the permission of Qualcomm. Developers today face increasing pressure to deliver intelligent features with tighter timelines, constrained resources, and heightened expectations for privacy, performance, and accuracy. This article highlights real-world Edge AI applications already in production and mainstream use—providing actionable inspiration

Sony Semiconductor Demonstration of On-sensor YOLO Inference with the Sony IMX500 and Raspberry Pi

Amir Servi, Edge AI Product Manager at Sony Semiconductors, demonstrates the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Servi demonstrates the IMX500 — the first vision sensor with integrated edge AI processing capabilities. Using the Raspberry Pi AI Camera and Ultralytics YOLOv11n models, Servi showcases real-time

Here you’ll find a wealth of practical technical insights and expert advice to help you bring AI and visual intelligence into your products without flying blind.

Contact

Address

Berkeley Design Technology, Inc.
PO Box #4446
Walnut Creek, CA 94596

Phone
+1 (925) 954-1411