Videos on Edge AI and Visual Intelligence
We hope that the compelling AI and visual intelligence case studies that follow will both entertain and inspire you, and that you’ll regularly revisit this page as new material is added. For more, monitor the News page, where you’ll frequently find video content embedded within the daily writeups.
Alliance Website Videos

Inuitive Demonstration of the M4.51 Depth and AI Sensor Module Based on the NU4100 Vision Processor
Shay Harel, Field Application Engineer at Inuitive, demonstrates the company’s latest edge AI and vision technologies and products at the 2024 Embedded Vision Summit. Specifically, Harel demonstrates the capabilities of his company’s M4.51 sensor module using a simple Python script that leverages Inuitive’s API for real-time object detection. The M4.51 sensor module, based on the
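Inuitive’s Python API is not shown in detail in the video, so the sketch below is only a generic stand-in for that kind of real-time detection loop: it uses OpenCV’s DNN module and a local webcam in place of Inuitive’s SDK and the M4.51 module, and the model path is an assumption.

```python
# Generic real-time object detection loop (not Inuitive's API): grab frames
# from a camera, run an ONNX detector, and display the results.
# "detector.onnx" and camera index 0 are assumptions for illustration.
import cv2

net = cv2.dnn.readNetFromONNX("detector.onnx")   # assumed ONNX detection model
cap = cv2.VideoCapture(0)                        # assumed local camera

while True:
    ok, frame = cap.read()
    if not ok:
        break
    blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (640, 640), swapRB=True)
    net.setInput(blob)
    detections = net.forward()                   # raw detector output
    # ... post-process (confidence threshold, NMS) and draw boxes on frame ...
    cv2.imshow("detections", frame)
    if cv2.waitKey(1) == 27:                     # press Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```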

Gigantor Technologies Demonstration of Removing Resource Contention for Real-time Object Detection
Jessica Jones, Vice President and Chief Marketing Officer at Gigantor Technologies, demonstrates the company’s latest edge AI and vision technologies and products at the 2024 Embedded Vision Summit. Specifically, Jones demonstrates her company’s GigaMAACS Synthetic Scaler with a live facial tracking demo that enables real-time, unlimited object detection at all ranges while only requiring training

Avnet Demonstration of High-performance, Power-efficient Vision AI Applications with the Qualcomm QCS6490 SoC
Peter Fenn, Director of the Advanced Applications Group at Avnet, demonstrates the company’s latest edge AI and vision technologies and products at the 2024 Embedded Vision Summit. Specifically, Fenn demonstrates the new QCS6490 SMARC SOM and Vision AI Development Kit from Avnet Embedded. An example vision AI application, including two cameras and four simultaneous vision

Avnet Demonstration of an AI-driven Smart Parking Lot Monitoring System Using the RZBoard V2L
Monica Houston, AI Manager of the Advanced Applications Group at Avnet, demonstrates the company’s latest edge AI and vision technologies and products at the 2024 Embedded Vision Summit. Specifically, Houston demonstrates a smart city application based on her company’s RZBoard single-board computer. Using embedded vision and a combination of edge AI and cloud connectivity, the demo
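The demo pairs on-device inference with a cloud back end; the sketch below shows roughly what the edge-to-cloud reporting half of such a system might look like. The endpoint URL, reporting interval, and detect_occupied_spaces() helper are assumptions for illustration, not details from Avnet’s implementation.

```python
# Minimal edge-to-cloud reporting loop for a parking monitor: post the current
# occupancy count to a cloud endpoint at a fixed interval.
# The URL and detect_occupied_spaces() are placeholders, not Avnet's code.
import time
import requests

def detect_occupied_spaces() -> int:
    """Placeholder for the on-device vision model's occupancy count."""
    return 0  # replace with real inference on camera frames

while True:
    payload = {"occupied": detect_occupied_spaces(), "ts": time.time()}
    requests.post("https://example.com/api/parking/lot1", json=payload, timeout=5)
    time.sleep(10)   # report every 10 seconds
```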

Advantech Demonstration of AI Vision with an Edge AI Camera and Deep Learning Software
Brian Lin, Field Sales Engineer at Advantech, demonstrates the company’s latest edge AI and vision technologies and products at the 2024 Embedded Vision Summit. Specifically, Lin demonstrates his company’s edge AI vision solution built on NVIDIA Jetson platforms. He shows how Advantech’s industrial cameras, equipped with Overview’s deep-learning software, effortlessly capture even the tiniest defects

MediaTek Demonstrations of Its Genio IoT Development Platform and Partners’ Products
Sunil Chhugani, Director of Business Development at MediaTek, demonstrates the company’s latest edge AI and vision technologies and products at the 2024 Embedded Vision Summit. Specifically, Chhugani demonstrates his company’s Genio IoT Development Platform, as well as partners’ products based on MediaTek SoCs. Specific demos include: Dual Display on Ezurio Tungsten700 SMARC platform: Qt application

Analog Devices Demonstration of the MAX78000 AI Microcontroller Performing Action Recognition
Navdeep Dhanjal, Executive Business and Product Manager for AI microcontrollers at Analog Devices, demonstrates the company’s latest edge AI and vision technologies and products at the 2024 Embedded Vision Summit. Specifically, Dhanjal demonstrates the MAX78000 AI microcontroller performing action recognition using a temporal convolutional network (TCN). Using a TCN-based model, the MAX78000 accurately recognizes a
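A temporal convolutional network classifies actions by stacking dilated, causal 1-D convolutions over a sequence of per-frame features, so each prediction can see a long window of past frames without recurrence. The PyTorch sketch below illustrates that structure; the layer sizes and class count are illustrative, not the architecture Analog Devices deploys on the MAX78000.

```python
# Illustrative TCN block: dilated causal 1-D convolutions plus a residual
# connection. Channel counts, depth, and class count are placeholders.
import torch
import torch.nn as nn

class TCNBlock(nn.Module):
    def __init__(self, channels: int, kernel_size: int = 3, dilation: int = 1):
        super().__init__()
        # Left-pad so convolutions are causal (no peeking at future frames).
        self.pad = (kernel_size - 1) * dilation
        self.conv1 = nn.Conv1d(channels, channels, kernel_size, dilation=dilation)
        self.conv2 = nn.Conv1d(channels, channels, kernel_size, dilation=dilation)
        self.relu = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x shape: (batch, channels, time)
        out = self.relu(self.conv1(nn.functional.pad(x, (self.pad, 0))))
        out = self.relu(self.conv2(nn.functional.pad(out, (self.pad, 0))))
        return self.relu(out + x)   # residual connection

# Tiny action classifier: per-frame feature vectors in, action logits out.
model = nn.Sequential(
    TCNBlock(64, dilation=1),
    TCNBlock(64, dilation=2),
    TCNBlock(64, dilation=4),      # growing dilation widens the receptive field
    nn.AdaptiveAvgPool1d(1),
    nn.Flatten(),
    nn.Linear(64, 5),              # assume 5 action classes
)
logits = model(torch.randn(1, 64, 128))   # e.g. 128 frames of 64-dim features
```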

Analog Devices Demonstration of the MAX78000 Microcontroller Enabling Edge AI in a Robotic Arm
Navdeep Dhanjal, Executive Business and Product Manager for AI microcontrollers at Analog Devices, demonstrates the company’s latest edge AI and vision technologies and products at the 2024 Embedded Vision Summit. Specifically, Dhanjal demonstrates visual servoing in a robotic arm enabled by the MAX78000 AI microcontroller. The MAX78000 is an Arm Cortex-M4F microcontroller with a hardware-based convolutional
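Visual servoing closes the control loop around the camera: the detected target’s offset from the image center drives the arm’s next incremental move. A minimal proportional-control sketch of that idea follows; detect_target() and move_arm() are hypothetical stand-ins for the MAX78000’s CNN output and the arm’s motor interface, and the gain and resolution values are assumptions.

```python
# Minimal image-based visual servoing loop (proportional control): steer the
# arm so the detected target drifts toward the image center.
# detect_target() and move_arm() are hypothetical placeholders.
import time

IMAGE_W, IMAGE_H = 320, 240      # assumed camera resolution
GAIN = 0.002                     # proportional gain, tuned per arm

def detect_target():
    """Return (x, y) pixel location of the target, or None if not found."""
    return (160, 120)            # placeholder for the CNN's detection output

def move_arm(d_pan: float, d_tilt: float):
    """Placeholder for incremental joint commands sent to the arm."""
    print(f"move pan {d_pan:+.3f}, tilt {d_tilt:+.3f}")

while True:
    target = detect_target()
    if target is not None:
        err_x = target[0] - IMAGE_W / 2     # pixels right of center
        err_y = target[1] - IMAGE_H / 2     # pixels below center
        move_arm(-GAIN * err_x, -GAIN * err_y)
    time.sleep(0.05)                        # ~20 Hz control loop
```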

Inuitive Demonstration of an RGBD Sensor Using a Synopsys ARC-based NU4100 AI and Vision Processor
Dor Zepeniuk, CTO at Inuitive, demonstrates the company’s latest edge AI and vision technologies and products at the 2024 Embedded Vision Summit. Specifically, Zepeniuk demonstrates his company’s latest RGBD sensor, which integrates an RGB color sensor and a depth sensor into a single device. The Inuitive NU4100 is an all-in-one vision processor that supports simultaneous AI-powered

Sensor Cortek Demonstration of SmarterRoad Running on Synopsys ARC NPX6 NPU IP
Fahed Hassanhat, Head of Engineering at Sensor Cortek, demonstrates the company’s latest edge AI and vision technologies and products in Synopsys’ booth at the 2024 Embedded Vision Summit. Specifically, Hassanhat demonstrates his company’s latest ADAS neural network (NN) model, SmarterRoad, combining lane detection and open space detection. SmarterRoad is a lightweight, integrated convolutional network that