Processors

“Hardware-aware Deep Neural Network Design,” a Presentation from Facebook

Peter Vajda, Research Manager at Facebook, presents the “Hardware-aware Deep Neural Network Design” tutorial at the May 2019 Embedded Vision Summit. A central problem in the deployment of deep neural networks is maximizing accuracy within the compute performance constraints of embedded devices. In this talk, Vajda discusses approaches to addressing this challenge based on automated […]


“Pioneering Analog Compute for Edge AI to Overcome the End of Digital Scaling,” a Presentation from Mythic

Mike Henry, CEO and Founder of Mythic, presents the “Pioneering Analog Compute for Edge AI to Overcome the End of Digital Scaling” tutorial at the May 2019 Embedded Vision Summit. AI inference at the edge will continue to create insatiable demand for compute performance in power- and cost-constrained form factors. Taking into account past trends, […]


“The Xilinx AI Engine: High Performance with Future-proof Architecture Adaptability,” a Presentation from Xilinx

Nick Ni, Director of Product Marketing at Xilinx, presents the “Xilinx AI Engine: High Performance with Future-proof Architecture Adaptability” tutorial at the May 2019 Embedded Vision Summit. AI inference demands orders-of-magnitude more compute capacity than today’s SoCs offer. At the same time, neural network topologies are changing too quickly to be addressed by […]


“Designing Your Next Vision Product Using a Systems Approach,” a Presentation from Teknique

Ben Bodley, CEO of Teknique, presents the “Designing Your Next Vision Product Using a Systems Approach” tutorial at the May 2019 Embedded Vision Summit. Today it’s easier than ever to create a credible demo of a new smart camera product for a specific application. But the distance from a demo to a robust product is […]


“Efficient Deployment of Quantized ML Models at the Edge Using Snapdragon SoCs,” a Presentation from Qualcomm

Felix Baum, Director of Product Management for AI Software at Qualcomm, presents the “Efficient Deployment of Quantized ML Models at the Edge Using Snapdragon SoCs” tutorial at the May 2019 Embedded Vision Summit. Increasingly, machine learning models are being deployed at the edge, and these models are getting bigger. As a result, we are hitting […]
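The quantization this talk concerns can be illustrated with a minimal, framework-free sketch. This is a generic affine int8 scheme, not Qualcomm’s Snapdragon tooling; real toolchains add per-channel scales and calibration passes on top of this idea.

```python
import numpy as np

def quantize_int8(x: np.ndarray):
    """Affine (asymmetric) post-training quantization of a float tensor to int8.

    Returns the quantized tensor plus the (scale, zero_point) needed to
    dequantize. Zero is kept exactly representable, which matters for
    zero-padding and ReLU outputs.
    """
    x_min = min(float(x.min()), 0.0)
    x_max = max(float(x.max()), 0.0)
    scale = (x_max - x_min) / 255.0 or 1.0        # avoid div-by-zero on all-zero input
    zero_point = int(round(-x_min / scale)) - 128  # int8 value that maps to 0.0
    q = np.clip(np.round(x / scale) + zero_point, -128, 127).astype(np.int8)
    return q, scale, zero_point

def dequantize_int8(q, scale, zero_point):
    return (q.astype(np.float32) - zero_point) * scale

# round-trip a small weight tensor; error is bounded by one quantization step
weights = np.array([-0.5, 0.0, 0.25, 1.0], dtype=np.float32)
q, s, z = quantize_int8(weights)
recovered = dequantize_int8(q, s, z)
```

Storing weights as int8 with one (scale, zero_point) pair per tensor is what cuts model size roughly 4x versus float32 and lets fixed-point accelerators run the arithmetic.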


“Using Blockchain to Create Trusted Embedded Vision Systems,” a Presentation from Basler

Thies Möller, Technical Architect at Basler, presents the “Using Blockchain to Create Trusted Embedded Vision Systems” tutorial at the May 2019 Embedded Vision Summit. In many IoT architectures, sensor data must be passed to cloud services for further processing. Traditionally, “trusted third parties” have been used to secure this data. In this talk, Möller explores […]


“Tools and Techniques for Optimizing DNNs on Arm-based Processors with Au-Zone’s DeepView ML Toolkit,” a Presentation from Au-Zone Technologies

Sébastien Taylor, Vision Technology Architect at Au-Zone Technologies, presents the “Tools and Techniques for Optimizing DNNs on Arm-based Processors with Au-Zone’s DeepView ML Toolkit” tutorial at the May 2019 Embedded Vision Summit. In this presentation, Taylor describes methods and tools for developing, profiling and optimizing neural network solutions for deployment on Arm MCUs, CPUs and […]


“Object Detection for Embedded Markets,” a Presentation from Imagination Technologies

Paul Brasnett, PowerVR Business Development Director for Vision and AI at Imagination Technologies, presents the “Object Detection for Embedded Markets” tutorial at the May 2019 Embedded Vision Summit. While image classification was the breakthrough use case for deep learning-based computer vision, today it has a limited number of real-world applications. In contrast, object detection is […]


“Sensory Fusion for Scalable Indoor Navigation,” a Presentation from Brain Corp

Oleg Sinyavskiy, Director of Research and Development at Brain Corp, presents the “Sensory Fusion for Scalable Indoor Navigation” tutorial at the May 2019 Embedded Vision Summit. Indoor autonomous navigation requires using a variety of sensors in different modalities. Merging together RGB, depth, lidar and odometry data streams to achieve autonomous operation requires a fusion of […]
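As a toy illustration of the kind of multi-sensor fusion this talk addresses (not Brain Corp’s actual navigation stack), independent estimates of the same quantity, say the range to an obstacle from lidar and from a depth camera, can be combined by inverse-variance weighting, which is the scalar core of Kalman-style fusion:

```python
def fuse(estimates):
    """Inverse-variance weighted fusion of independent sensor estimates.

    Each estimate is a (value, variance) pair. More confident sensors
    (lower variance) pull the fused value toward themselves, and the
    fused variance is smaller than any single sensor's, which is why
    multi-modal stacks outperform any one sensor alone.
    """
    weights = [1.0 / var for _, var in estimates]
    fused = sum(v * w for (v, _), w in zip(estimates, weights)) / sum(weights)
    fused_var = 1.0 / sum(weights)
    return fused, fused_var

# range to an obstacle: lidar (precise) vs. depth camera (noisier)
d, var = fuse([(2.00, 0.01), (2.30, 0.09)])
```

Here the fused range lands much closer to the low-variance lidar reading, and the fused variance drops below the lidar’s own, as expected.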


“Enabling the Next Kitchen Experience Through Embedded Vision,” a Presentation from Whirlpool

Sugosh Venkataraman, Vice President of Technology at Whirlpool, presents the “Enabling the Next Kitchen Experience Through Embedded Vision” tutorial at the May 2019 Embedded Vision Summit. Our kitchens are the hubs where we spend quality time with family and friends, preparing and eating meals. Today, instructions for cooking a particular meal are just a few […]

