Brian Dipert

Synopsys Demonstration of Smart Architectural Exploration for AI SoCs

Guy Ben Haim, Senior Product Manager, and Gururaj Rao, Field Applications Engineer, both of Synopsys, demonstrate the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Ben Haim and Rao demonstrate how to optimize neural network performance with the Synopsys ARC MetaWare MX Development Toolkit. Ben Haim and […]

SqueezeBits Demonstration of On-device LLM Inference, Running a 2.4B Parameter Model on the iPhone 14 Pro

Taesu Kim, CTO of SqueezeBits, demonstrates the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Kim demonstrates a 2.4-billion-parameter large language model (LLM) running entirely on an iPhone 14 Pro without server connectivity. The device operates in airplane mode, highlighting on-device inference using a hybrid approach that
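The feasibility of the demo above hinges on fitting 2.4 billion parameters into a phone's memory budget. As a rough back-of-envelope illustration (ours, not SqueezeBits' published figures), weight quantization is what makes this plausible:

```python
# Back-of-envelope memory footprint for a 2.4B-parameter LLM at several
# weight precisions. Figures are illustrative only; activation memory,
# KV cache, and runtime overhead are ignored.

PARAMS = 2.4e9  # parameter count of the demo model

def weights_gib(bits_per_weight: float) -> float:
    """Approximate weight storage in GiB at a given precision."""
    return PARAMS * bits_per_weight / 8 / 2**30

for label, bits in [("FP16", 16), ("INT8", 8), ("INT4", 4)]:
    print(f"{label}: ~{weights_gib(bits):.1f} GiB")
```

At FP16 the weights alone approach 4.5 GiB, which is uncomfortable on a phone; at 4-bit precision they drop to roughly 1.1 GiB, well within an iPhone 14 Pro's RAM.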

Synthetic Data for Computer Vision

This article was originally published at Synetic AI’s website. It is reprinted here with the permission of Synetic AI. Synthetic data is changing how computer vision models are being trained. This page will explain synthetic data and how it compares to traditional approaches. After exploring the main methods of creating synthetic data, we’ll help you
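The core appeal of synthetic data is that labels come for free: because the generator places each object, it knows the ground truth exactly. The toy sketch below (our illustration, not Synetic AI's pipeline) renders a trivial "image" with one bright rectangle and records its bounding box at creation time:

```python
import random

def make_sample(size=64, seed=None):
    """Render one synthetic 'image' (a 2D grid of pixel values) containing
    a single bright rectangle, and return it with its ground-truth box.
    No human annotation needed: we placed the object, so we know where it is."""
    rng = random.Random(seed)
    img = [[0] * size for _ in range(size)]
    w, h = rng.randint(8, 20), rng.randint(8, 20)   # object dimensions
    x, y = rng.randint(0, size - w), rng.randint(0, size - h)  # placement
    for r in range(y, y + h):
        for c in range(x, x + w):
            img[r][c] = 255
    return img, (x, y, w, h)  # image, bounding box (x, y, width, height)

# A dataset is just many such samples, each with a perfect label.
dataset = [make_sample(seed=i) for i in range(100)]
```

Real pipelines replace the rectangle with a rendered 3D scene, but the principle is the same: varying pose, lighting, and background programmatically yields unlimited perfectly-labeled training data.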

Sony Semiconductor Demonstration of AI Vision Devices and Tools for Industrial Use Cases

Zachary Li, Product and Business Development Manager at Sony America, demonstrates the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Li demonstrates his company’s AITRIOS products and ecosystem. Powered by the IMX500 intelligent vision sensor, Sony AITRIOS collaborates with Raspberry Pi for development kits and with leading

Sony Semiconductor Demonstration of Its Open-source Edge AI Stack with the IMX500 Intelligent Sensor

JF Joly, Product Manager for the AITRIOS platform at Sony Semiconductor, demonstrates the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Joly demonstrates Sony’s fully open-source software stack that enables the creation of AI-powered cameras using the IMX500 intelligent vision sensor. In this demo, Joly illustrates how

First Sensor Module for High-resolution Image Sensor IMX811 (247 Mpixel) Available Soon

Embedded vision engineers and tech leads can start preparing for high-resolution applications. Munich – July 17th 2025 – FRAMOS will soon release the first image sensor module and development kit on the market for Sony’s new high-resolution IMX811 image

Edge AI Today: Real-world Use Cases for Developers

This blog post was originally published at Qualcomm’s website. It is reprinted here with the permission of Qualcomm. Developers today face increasing pressure to deliver intelligent features with tighter timelines, constrained resources, and heightened expectations for privacy, performance, and accuracy. This article highlights real-world Edge AI applications already in production and mainstream use—providing actionable inspiration

Sony Semiconductor Demonstration of On-sensor YOLO Inference with the Sony IMX500 and Raspberry Pi

Amir Servi, Edge AI Product Manager at Sony Semiconductor, demonstrates the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Servi demonstrates the IMX500 — the first vision sensor with integrated edge AI processing capabilities. Using the Raspberry Pi AI Camera and Ultralytics YOLOv11n models, Servi showcases real-time

Namuga Vision Connectivity Demonstration of Compact Solid-state LiDAR for Automotive and Robotics Applications

Min Lee, Business Development Team Leader at Namuga Vision Connectivity, demonstrates the company’s latest edge AI and vision technologies and products at the 2025 Embedded Vision Summit. Specifically, Lee demonstrates a compact solid-state LiDAR solution tailored for automotive and robotics industries. This solid-state LiDAR features high precision, fast response time, and no moving parts—ideal for

SiMa.ai to Accelerate Edge AI Adoption with Cisco for Industry 4.0

SiMa.ai’s Modalix AI Platform Now Compatible With Cisco’s Industrial Ethernet Switches, Delivers Real-Time AI at the Industrial Edge San Jose, CA – July 16, 2025 – SiMa.ai, a leading provider of Machine Learning System on a Chip™ (MLSoC™) silicon and the Palette™ software platform, today announced a go-to-market collaboration with Cisco to bring artificial intelligence

Here you’ll find a wealth of practical technical insights and expert advice to help you bring AI and visual intelligence into your products without flying blind.

Contact

Address

Berkeley Design Technology, Inc.
PO Box #4446
Walnut Creek, CA 94596

Phone
+1 (925) 954-1411