
“Ergo: Perceive’s Chip – Data Center-Class Inference in Edge Devices at Ultra-Low Power,” a Presentation from Perceive

Steve Teig, CEO of Perceive, presents the “Ergo: Perceive’s Chip – Data Center-Class Inference in Edge Devices at Ultra-Low Power” tutorial at the September 2020 Embedded Vision Summit.

To date, people seeking to deploy machine learning-based inference in consumer electronics have had only two choices, both unattractive. The first option entails transmitting voluminous raw data, such as video, to the cloud, potentially violating customers’ privacy, tempting hackers, and incurring substantial energy, cost, and latency. The second option runs inference at the edge, but on severely limited hardware that can implement only tiny, inaccurate neural networks (e.g., MobileNet) and runs even those tiny networks at low frame rates.

Solving this dilemma, Perceive’s new chip, Ergo, runs large, advanced neural networks at high speed for imaging, audio, language, and other applications inside edge devices without any off-chip RAM. Even large networks, such as YOLOv3 with more than 64 million weights, can run at ~250 fps (with batch size 1). Moreover, Ergo can run YOLOv3 at 30 fps while consuming only about 20 mW (i.e., more than 50x more power-efficiently than competing devices).
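To put the power-efficiency claim in perspective, here is a minimal back-of-the-envelope calculation using only the figures quoted above (30 fps at about 20 mW). The “competing device” number is derived purely from the stated >50x claim and is illustrative, not a measurement of any specific product.

```python
# Rough energy-per-inference estimate from the figures quoted above:
# YOLOv3 on Ergo at 30 fps while drawing ~20 mW.
power_w = 0.020          # ~20 mW
frame_rate_fps = 30      # inferences per second

energy_per_inference_j = power_w / frame_rate_fps
print(f"Ergo: ~{energy_per_inference_j * 1e3:.2f} mJ per inference")   # ~0.67 mJ

# A hypothetical device that is 50x less power-efficient at the same frame
# rate would need roughly 50x the energy per inference (~33 mJ, i.e. about
# 1 W of power at 30 fps). This is an illustration of the claim, not data.
competitor_energy_j = energy_per_inference_j * 50
print(f"50x less efficient device: ~{competitor_energy_j * 1e3:.0f} mJ per inference")
```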

See here for a PDF of the slides.

