“How to Enhance Edge AI Vision with the Katana SoC Using Multi-Modal Sensing,” a Presentation from Synaptics

Shay Kamin Braun, Director of Low-power AI Marketing at Synaptics, presents the “How to Enhance Edge AI Vision with the Katana SoC Using Multi-Modal Sensing” tutorial at the May 2022 Embedded Vision Summit.

Machine learning (ML)-based vision edge AI has wide applicability across a variety of segments, including consumer electronics, home security, smart buildings, smart cities and factory automation. To date, most vision edge AI implementations have focused solely on vision: detecting people, objects and activities. Moreover, these implementations have suffered from high power consumption, typically requiring AC power. These two factors have limited the penetration of vision edge AI.

In this presentation, Braun describes a modern approach based around the Katana low-power edge AI SoC that improves performance and optimizes power consumption by fusing together inputs from a variety of sensors (including vision, sound and environmental, among others) into an AI processor running multiple ML models in parallel. He shows how this approach enables the design of more intelligent, context-aware, battery-powered edge AI inference devices, significantly broadening the usefulness and penetration of vision edge AI across multiple markets and new applications.
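The fusion idea described above can be illustrated with a minimal sketch. The function below is purely hypothetical (it is not a Katana API): it assumes each modality's ML model has already produced a detection confidence, and combines them into a single wake decision. Requiring cross-modality agreement, or very strong evidence from one modality, is one simple way such a device could reduce false wakes and stay in a low-power state most of the time.

```python
# Hypothetical sketch of multi-modal fusion on a low-power edge AI device.
# Function name, thresholds, and logic are illustrative assumptions,
# not the Katana SoC's actual interface.

def fuse_detections(vision_conf, sound_conf,
                    vision_thresh=0.6, sound_thresh=0.5):
    """Combine per-modality detection confidences into one wake decision.

    A detection in a single modality may be noise; requiring agreement
    between modalities (or a very strong single-modality score) cuts
    false wakes, which is what keeps a battery-powered device asleep.
    """
    # Very strong evidence from either modality is accepted on its own.
    if vision_conf >= 0.9 or sound_conf >= 0.9:
        return True
    # Otherwise, both modalities must agree above their thresholds.
    return vision_conf >= vision_thresh and sound_conf >= sound_thresh

print(fuse_detections(0.7, 0.6))   # both modalities agree -> True
print(fuse_detections(0.7, 0.2))   # vision alone, weak sound -> False
print(fuse_detections(0.95, 0.0))  # very strong vision alone -> True
```

In a real system the same pattern generalizes to more modalities (e.g. environmental sensors as a context gate), with the fusion logic or a small fusion model running alongside the per-modality models.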

See here for a PDF of the slides.

