“FOMO: Real-Time Object Detection on Microcontrollers,” a Presentation from Edge Impulse

Jan Jongboom, Co-founder and CTO of Edge Impulse, delivers the “FOMO: Real-Time Object Detection on Microcontrollers” tutorial at the May 2022 Embedded Vision Summit.

Object detection models are vital for many computer vision applications. They can show where an object is in a video stream, or how many objects are present. But they are also very resource-intensive: models like MobileNet SSD can analyze only a few frames per second on a Raspberry Pi 4 while using a significant amount of RAM. This has put object detection out of reach for the most interesting devices: microcontrollers.

Microcontrollers are cheap, small, ubiquitous and energy efficient—and are thus attractive for adding computer vision to everyday devices. But microcontrollers are also very resource-constrained, with clock speeds of up to 200 MHz and less than 256 Kbytes of RAM—far too little to run complex object detection models. But… that’s about to change! In this talk, Jongboom outlines his company’s work on FOMO (“faster objects, more objects”), a novel DNN architecture for object detection, designed from the ground up to run on microcontrollers.
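
The central idea behind FOMO, as presented in the talk, is to trade bounding boxes for a coarse grid of per-cell class predictions, so objects are reported as centroids and the network stays small enough for microcontroller RAM budgets. The snippet below is a minimal sketch of that style of model in Keras, not Edge Impulse's exact implementation; the backbone cut point, input size, and class count are illustrative assumptions.

```python
# Sketch of a FOMO-style detector: a MobileNetV2 backbone truncated early,
# followed by a 1x1 convolution that classifies each cell of a coarse output
# grid. Objects are then reported as centroids of activated cells rather than
# as bounding boxes, which keeps compute and RAM very low.
import tensorflow as tf

NUM_CLASSES = 2            # assumption: two object classes (plus background)
INPUT_SHAPE = (96, 96, 3)  # assumption: small grayscale/RGB input typical for MCUs

def build_fomo_style_model(alpha=0.35):
    backbone = tf.keras.applications.MobileNetV2(
        input_shape=INPUT_SHAPE, alpha=alpha, include_top=False, weights=None)
    # Cut the backbone early so the output grid is 1/8 of the input resolution
    # (12x12 cells for a 96x96 input); the exact cut layer is an assumption.
    cut = backbone.get_layer("block_6_expand_relu").output
    x = tf.keras.layers.Conv2D(32, 1, activation="relu")(cut)
    # Per-cell class logits: one channel per object class plus one for background.
    logits = tf.keras.layers.Conv2D(NUM_CLASSES + 1, 1)(x)
    return tf.keras.Model(backbone.input, logits)

model = build_fomo_style_model()
model.summary()  # output shape: (None, 12, 12, NUM_CLASSES + 1)
```

Because the head is fully convolutional and the backbone is truncated, a model along these lines can run within the flash and RAM limits Jongboom describes, at the cost of losing object size information from the bounding box.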

See here for a PDF of the slides.

