“An Ultra-low-power Multi-core Engine for Inference on Encrypted DNNs,” a Presentation from Xperi

Petronel Bigioi, CTO for Imaging at Xperi, presents the "An Ultra-low-power Multi-core Engine for Inference on Encrypted DNNs" tutorial at the May 2019 Embedded Vision Summit.

Neural network encryption is a useful method for securing a company’s IP. This presentation focuses on the design details of an ultra-low-power, scalable neural network core capable of performing inference on encrypted neural networks. Decryption of the neural network weights and topology takes place inside the core, so a decrypted network never needs to be present in main memory. Bigioi also discusses solutions for clustering multiple neural network cores together to meet the neural inference processing requirements of a target SoC platform.
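The key idea — decrypting weights only inside the inference core so that plaintext weights never sit in main memory — can be sketched in a few lines. The following is a minimal, hypothetical illustration, not Xperi's actual design: it uses a SHA-256-based XOR keystream as a stand-in for a real stream cipher (e.g. AES-CTR), and a toy `EncryptedNNCore` class to model the hardware boundary. The host only ever holds encrypted weight blobs; the core decrypts each layer just-in-time, uses it, and discards it.

```python
import hashlib
import numpy as np

def keystream(key: bytes, n: int) -> bytes:
    """Deterministic keystream derived from the key.
    Stand-in for a real stream cipher such as AES-CTR (illustrative only)."""
    out = bytearray()
    ctr = 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(8, "little")).digest()
        ctr += 1
    return bytes(out[:n])

def xor_crypt(buf: bytes, key: bytes) -> bytes:
    """Symmetric XOR encryption/decryption against the keystream."""
    return bytes(a ^ b for a, b in zip(buf, keystream(key, len(buf))))

class EncryptedNNCore:
    """Toy model of the hardware boundary: the key and any plaintext
    weights exist only inside this object. Callers pass encrypted blobs,
    so main memory never holds a decrypted network."""
    def __init__(self, key: bytes):
        self._key = key  # provisioned into the core, never exported

    def infer(self, encrypted_layers, x):
        for blob, shape in encrypted_layers:
            # Just-in-time decryption of this layer's weights.
            w = np.frombuffer(xor_crypt(blob, self._key),
                              dtype=np.float32).reshape(shape)
            x = np.maximum(w @ x, 0.0)  # fully connected layer + ReLU
            del w  # plaintext weights discarded immediately after use
        return x

# Package a toy two-layer network in a hypothetical encrypted format.
rng = np.random.default_rng(0)
key = b"device-unique-key"
layers = [rng.standard_normal((4, 3)).astype(np.float32),
          rng.standard_normal((2, 4)).astype(np.float32)]
enc = [(xor_crypt(w.tobytes(), key), w.shape) for w in layers]

core = EncryptedNNCore(key)
y = core.infer(enc, np.ones(3, dtype=np.float32))
```

In a real core the key would come from device-unique hardware (e.g. a fuse bank or PUF) and decryption would run in dedicated logic alongside the MAC arrays, but the data-flow boundary is the same: encrypted weights in, activations out.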

