APPLICATIONS

Untether AI Demonstration of Video Analysis Using the runAI Family of Inference Accelerators

Max Sbabo, Senior Application Engineer at Untether AI, demonstrates the company’s latest edge AI and vision technologies and products at the 2024 Embedded Vision Summit. Specifically, Sbabo demonstrates his company’s AI inference technology with AI accelerator cards that leverage the capabilities of the runAI family of ICs in a PCI-Express form factor. This demonstration […]

The Role of AI-driven Embedded Vision Cameras in Self-checkout Loss Prevention

This blog post was originally published at e-con Systems’ website. It is reprinted here with the permission of e-con Systems. Self-checkout usage is rapidly growing and redefining retail experiences. This shift has led to retail losses that can only be overcome by AI-based embedded vision. Explore the types of retail shrinkage, how AI helps, and

Inuitive Demonstration of the M4.51 Depth and AI Sensor Module Based on the NU4100 Vision Processor

Shay Harel, field application engineer at Inuitive, demonstrates the company’s latest edge AI and vision technologies and products at the 2024 Embedded Vision Summit. Specifically, Harel demonstrates the capabilities of his company’s M4.51 sensor module using a simple Python script that leverages Inuitive’s API for real-time object detection. The M4.51 sensor module, based on the
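
For readers curious what such a "simple Python script" might look like, here is a hedged sketch of the typical grab-frame/infer/filter loop. The function and class names below (`run_detector`, `main_loop`, `Detection`) are illustrative placeholders, not Inuitive's actual SDK API, which is documented separately.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    label: str
    confidence: float
    box: tuple  # (x, y, w, h) in pixels

def run_detector(frame) -> List[Detection]:
    """Placeholder for the inference call; with a real depth/AI sensor
    module this would be offloaded to the on-board vision processor,
    which returns decoded detections over the SDK."""
    return [Detection("person", 0.91, (10, 20, 64, 128))]  # stubbed result

def main_loop(frames, min_conf=0.5):
    """Typical real-time structure: grab frame -> infer -> filter -> act."""
    results = []
    for frame in frames:
        dets = [d for d in run_detector(frame) if d.confidence >= min_conf]
        results.append(dets)
    return results

# With a real sensor, `frames` would come from the module's camera stream.
frames = [object(), object()]  # stand-ins for captured frames
out = main_loop(frames)
```

The point of the structure is that confidence filtering and any application logic stay in lightweight host-side Python, while the heavy neural-network work runs on the sensor module itself.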

Avnet Demonstration of an AI-driven Smart Parking Lot Monitoring System Using the RZBoard V2L

Monica Houston, AI Manager of the Advanced Applications Group at Avnet, demonstrates the company’s latest edge AI and vision technologies and products at the 2024 Embedded Vision Summit. Specifically, Houston demonstrates a smart city application based on her company’s RZBoard single-board computer. Using embedded vision and a combination of edge AI and cloud connectivity, the demo

Advantech Demonstration of AI Vision with an Edge AI Camera and Deep Learning Software

Brian Lin, Field Sales Engineer at Advantech, demonstrates the company’s latest edge AI and vision technologies and products at the 2024 Embedded Vision Summit. Specifically, Lin demonstrates his company’s edge AI vision solution embedded with NVIDIA Jetson platforms. Lin demonstrates how Advantech’s industrial cameras, equipped with Overview’s deep-learning software, effortlessly capture even the tiniest defects

Analog Devices Demonstration of the MAX78000 AI Microcontroller Performing Action Recognition

Navdeep Dhanjal, Executive Business and Product Manager for AI microcontrollers at Analog Devices, demonstrates the company’s latest edge AI and vision technologies and products at the 2024 Embedded Vision Summit. Specifically, Dhanjal demonstrates the MAX78000 AI microcontroller performing action recognition using a temporal convolutional network (TCN). Using a TCN-based model, the MAX78000 accurately recognizes a
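
The core building block of a TCN is a dilated *causal* 1-D convolution, where each output depends only on current and past samples. The sketch below illustrates that general technique with a toy NumPy implementation; it is not Analog Devices' model, and all names are illustrative.

```python
import numpy as np

def causal_dilated_conv1d(x, w, dilation=1):
    """Causal dilated 1-D convolution.

    x: input sequence, shape (T,)
    w: kernel weights, shape (K,)
    Output y[t] depends only on x[t], x[t-d], ..., x[t-(K-1)*d], so no
    future samples leak into the present -- the causality that lets a
    TCN run on a streaming sensor feed.
    """
    T, K = len(x), len(w)
    y = np.zeros(T)
    for t in range(T):
        for k in range(K):
            idx = t - k * dilation
            if idx >= 0:  # indices before the start of the stream are skipped
                y[t] += w[k] * x[idx]
    return y

# Example: a differencing kernel [1, -1] with dilation 2 computes
# y[t] = x[t] - x[t-2] once enough history is available.
x = np.arange(8, dtype=float)
y = causal_dilated_conv1d(x, np.array([1.0, -1.0]), dilation=2)
```

Stacking such layers with dilations 1, 2, 4, ... grows the receptive field exponentially, which is why a small TCN can recognize actions spanning long windows of frames on a constrained microcontroller.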

Analog Devices Demonstration of the MAX78000 Microcontroller Enabling Edge AI in a Robotic Arm

Navdeep Dhanjal, Executive Business and Product Manager for AI microcontrollers at Analog Devices, demonstrates the company’s latest edge AI and vision technologies and products at the 2024 Embedded Vision Summit. Specifically, Dhanjal demonstrates visual servoing in a robotic arm enabled by the MAX78000 AI microcontroller. The MAX78000 is an Arm Cortex-M4F microcontroller with a hardware-based convolutional

Inuitive Demonstration of an RGBD Sensor Using a Synopsys ARC-based NU4100 AI and Vision Processor

Dor Zepeniuk, CTO at Inuitive, demonstrates the company’s latest edge AI and vision technologies and products at the 2024 Embedded Vision Summit. Specifically, Zepeniuk demonstrates his company’s latest RGBD sensor, which integrates an RGB color sensor and a depth sensor into a single device. The Inuitive NU4100 is an all-in-one vision processor that supports simultaneous AI-powered

Accelerating Transformer Neural Networks for Autonomous Driving

This blog post was originally published at Ambarella’s website. It is reprinted here with the permission of Ambarella. Autonomous driving (AD) and advanced driver assistance system (ADAS) providers are deploying more and more AI neural networks (NNs) to offer a human-like driving experience. Several of the leading AD innovators have either deployed, or have a roadmap

Sensor Cortek Demonstration of SmarterRoad Running on Synopsys ARC NPX6 NPU IP

Fahed Hassanhat, head of engineering at Sensor Cortek, demonstrates the company’s latest edge AI and vision technologies and products in Synopsys’ booth at the 2024 Embedded Vision Summit. Specifically, Hassanhat demonstrates his company’s latest ADAS neural network (NN) model, SmarterRoad, combining lane detection and open space detection. SmarterRoad is a lightweight integrated convolutional network that

Here you’ll find a wealth of practical technical insights and expert advice to help you bring AI and visual intelligence into your products without flying blind.

Contact

Address

Berkeley Design Technology, Inc.
PO Box #4446
Walnut Creek, CA 94596

Phone
+1 (925) 954-1411