“From Feature Engineering to Network Engineering,” a Presentation from ShatterLine Labs and AMD

Auro Tripathy, Founding Principal at ShatterLine Labs (representing AMD), presents the “From Feature Engineering to Network Engineering” tutorial at the May 2018 Embedded Vision Summit.

The availability of large labeled image datasets is tilting the balance in favor of “network engineering” instead of “feature engineering”. Hand-designed features dominated recognition tasks in the past, but now features can be learned automatically by back-propagating errors through the layers of a hierarchical “network” of feature maps. As a result, we’re seeing a plethora of network topologies that satisfy design objectives such as reduced parameter count, lower compute complexity, and faster learning. Certain core network building blocks have emerged, such as split-transform-merge (as in the Inception module), skip connections (as in ResNet, DenseNet, and their variants), weight sharing across twin branches for similarity learning (as in a Siamese network), and encoder/decoder topologies for segmentation (as in U-Net/LinkNet).
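These building blocks are compact to express in modern frameworks. Below is a minimal sketch of two of them, assuming the Keras functional API (tensorflow.keras); the filter counts, kernel sizes, and helper names (inception_block, residual_block) are illustrative assumptions, not the code shown in the talk.

```python
# Minimal sketch of a split-transform-merge block and a skip (residual) block
# using the Keras functional API. Widths and kernel sizes are illustrative only.
from tensorflow.keras import Input, Model, layers

def inception_block(x, filters=64):
    """Split-transform-merge: parallel branches whose outputs are concatenated."""
    b1 = layers.Conv2D(filters, 1, padding='same', activation='relu')(x)   # 1x1 branch
    b3 = layers.Conv2D(filters, 1, padding='same', activation='relu')(x)   # 1x1 reduce
    b3 = layers.Conv2D(filters, 3, padding='same', activation='relu')(b3)  # 3x3 branch
    b5 = layers.Conv2D(filters, 1, padding='same', activation='relu')(x)   # 1x1 reduce
    b5 = layers.Conv2D(filters, 5, padding='same', activation='relu')(b5)  # 5x5 branch
    bp = layers.MaxPooling2D(3, strides=1, padding='same')(x)              # pooling branch
    bp = layers.Conv2D(filters, 1, padding='same', activation='relu')(bp)
    return layers.Concatenate()([b1, b3, b5, bp])                          # merge along channels

def residual_block(x, filters):
    """Skip connection: the block learns a residual that is added back to its input."""
    y = layers.Conv2D(filters, 3, padding='same', activation='relu')(x)
    y = layers.Conv2D(filters, 3, padding='same')(y)
    return layers.Activation('relu')(layers.Add()([x, y]))

inputs = Input(shape=(224, 224, 3))
x = layers.Conv2D(64, 7, strides=2, padding='same', activation='relu')(inputs)
x = inception_block(x, filters=64)   # four branches of 64 filters -> 256 output channels
x = residual_block(x, filters=256)   # the skip-add requires matching channel counts
model = Model(inputs, x)
model.summary()                      # parameter counts help compare model sizes
```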

In this session, Tripathy examines these topologies from both a feature-representation and a feature-learning perspective, and shows how succinctly they can be implemented in the Keras high-level deep learning framework, with Python code snippets of just 7 to 10 lines. He also pays close attention to the model size and compute complexity of each topology. To keep the subject matter focused, however, he does not explore the temporal aspects of visual understanding, such as recurrent network topologies.
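As an example of how briefly the weight-sharing idea reads in Keras, here is a hedged sketch of a Siamese pair; the embedding trunk, input shape, and absolute-difference distance are assumptions for illustration, not the code from the session.

```python
# Minimal sketch of weight sharing across two branches for similarity learning.
# The embedding architecture and distance function are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import Input, Model, layers

def build_embedding(input_shape=(105, 105, 1)):
    """One trunk network; instantiating it once and calling it twice shares weights."""
    inp = Input(shape=input_shape)
    x = layers.Conv2D(64, 10, activation='relu')(inp)
    x = layers.MaxPooling2D()(x)
    x = layers.Conv2D(128, 7, activation='relu')(x)
    x = layers.GlobalAveragePooling2D()(x)
    return Model(inp, layers.Dense(128)(x), name='shared_embedding')

embedding = build_embedding()
left = Input(shape=(105, 105, 1))
right = Input(shape=(105, 105, 1))

# The same Model object processes both inputs, so both branches use identical weights.
distance = layers.Lambda(lambda t: tf.abs(t[0] - t[1]))([embedding(left), embedding(right)])
similarity = layers.Dense(1, activation='sigmoid')(distance)  # 1 = same class, 0 = different
siamese = Model([left, right], similarity)
siamese.compile(optimizer='adam', loss='binary_crossentropy')
```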

Here you’ll find a wealth of practical technical insights and expert advice to help you bring AI and visual intelligence into your products without flying blind.
