“Efficient Neuromorphic Computing with Dynamic Vision Sensor, Spiking Neural Network Accelerator and Hardware-aware Algorithms,” a Presentation from Arizona State University

Jae-sun Seo, Associate Professor at Arizona State University, presents the “Efficient Neuromorphic Computing with Dynamic Vision Sensor, Spiking Neural Network Accelerator and Hardware-aware Algorithms” tutorial at the May 2023 Embedded Vision Summit.

Spiking neural networks (SNNs) mimic biological nervous systems. Using event-driven computation and communication, SNNs achieve very low power consumption. However, two important issues have persisted. First, directly training SNNs has not yielded competitive inference accuracy. Second, non-spike inputs must be converted to spike trains, resulting in long latency. Recently, SNN algorithm accuracy has improved significantly, aided by new training techniques, and commercial event-based dynamic vision sensors (DVSs) have emerged.
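To make these two ideas concrete, here is a minimal sketch (not from the talk; parameter values like the leak factor and threshold are illustrative assumptions) of rate-coding a static intensity into a spike train and feeding it to a leaky integrate-and-fire (LIF) neuron:

```python
import numpy as np

def poisson_encode(intensity, n_steps, rng):
    """Convert a static intensity in [0, 1] into a binary spike train.

    Each timestep fires with probability equal to the intensity. This is
    the conversion step that introduces latency: many timesteps are
    needed to represent one analog value accurately.
    """
    return (rng.random(n_steps) < intensity).astype(np.float32)

def lif_neuron(spikes, weight=0.5, leak=0.9, threshold=1.0):
    """Leaky integrate-and-fire neuron (illustrative parameters).

    The membrane potential integrates weighted input spikes, decays by
    the leak factor each step, and emits an output spike (then resets)
    when it crosses the threshold. Work is only done when events
    arrive, which is the source of SNNs' low power consumption.
    """
    v = 0.0
    out = []
    for s in spikes:
        v = leak * v + weight * s
        if v >= threshold:
            out.append(1)
            v = 0.0  # reset after firing
        else:
            out.append(0)
    return out

rng = np.random.default_rng(0)
train = poisson_encode(0.8, n_steps=100, rng=rng)
out = lif_neuron(train)
# Output spikes are sparser than input spikes, since several weighted
# inputs must accumulate before the threshold is reached.
print(int(train.sum()), sum(out))
```

A dynamic vision sensor sidesteps the encoding step entirely: its pixels already emit events on brightness changes, so spikes can drive the network directly.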

Integrating a spike-based DVS with an SNN accelerator is a promising approach for end-to-end, event-driven operations. We also need accurate and hardware-aware SNN algorithms that can be directly trained with input spikes from a DVS while reducing storage and compute requirements. In this talk, Seo introduces the characteristics, opportunities and challenges of SNNs, and presents results from projects utilizing neuromorphic algorithms and custom hardware.

See here for a PDF of the slides.

