Videos

“How to Choose a 3D Vision Technology,” a Presentation from Carnegie Robotics

Chris Osterwood, Chief Technical Officer at Carnegie Robotics, presents the "How to Choose a 3D Vision Technology" tutorial at the May 2017 Embedded Vision Summit. Designers of autonomous vehicles, robots, and many other systems are faced with a critical challenge: Which 3D perception technology to use? There are a wide variety of sensors on the…


ARM Demonstration of the Spirit Object Detection and Localization Accelerator

Tim Hartley, Senior Product Manager at ARM, demonstrates the company's latest embedded vision technologies and products at the May 2017 Embedded Vision Summit. Specifically, Hartley demonstrates the company's Spirit computer vision accelerator, acquired when ARM bought Apical in mid-2016. The demo uses two cameras feeding into individual Spirit object detection and localization hardware accelerators. Each…


ARM Demonstration of Collaboratively Optimizing Deep Learning Applications

Gian Marco Iodice, Software Engineer at ARM, demonstrates the company's latest embedded vision technologies and products at the May 2017 Embedded Vision Summit. Specifically, Iodice presents a demonstration of Collective Knowledge, developed by partner company Dividiti. The demo supports several popular models for object classification and detection, implemented in Caffe or TensorFlow, and running on…


“Ultra-Efficient VPU: Low-power Deep Learning, Computer Vision and Computational Photography,” a Presentation from FotoNation

Petronel Bigioi, General Manager at FotoNation, presents the "Ultra-Efficient VPU: Low-power Deep Learning, Computer Vision and Computational Photography" tutorial at the May 2017 Embedded Vision Summit. This talk focuses on bringing intelligence to the edge to enable local devices to see and hear. It explores the power-consumption-vs.-flexibility dilemma by examining hard-coded and programmable architectures. It…


“Automakers at a Crossroads: How Embedded Vision and Autonomy Will Reshape the Industry,” a Presentation from Lux Research

Mark Bünger, VP of Research at Lux Research, presents the "Automakers at a Crossroads: How Embedded Vision and Autonomy Will Reshape the Industry" tutorial at the May 2017 Embedded Vision Summit. The auto and telecom industries have been dreaming of connected cars for twenty years, but their results have been mediocre and mixed. Now, just…


“Introduction to Optics for Embedded Vision,” a Presentation from Edmund Optics

Jessica Gehlhar, Vision Solutions Engineer at Edmund Optics, presents the “Introduction to Optics for Embedded Vision” tutorial at the May 2017 Embedded Vision Summit. This talk provides an introduction to optics for embedded vision system and algorithm developers. Gehlhar begins by presenting fundamental imaging lens specifications and quality metrics. She explains key parameters and concepts…


ARM Demonstration of Luxoft Photographic Occlusion Removal

Gian Marco Iodice, Software Engineer at ARM, demonstrates the company's latest embedded vision technologies and products at the May 2017 Embedded Vision Summit. Specifically, Iodice presents a demonstration developed by partner company Luxoft. The demo showcases Luxoft’s new computational photography technique called Occlusion (obstruction) Removal, running in near-real time on an ARM-based smartphone platform.


ARM Demonstration of Au-Zone Distracted Driving

Gian Marco Iodice, Software Engineer at ARM, demonstrates the company's latest embedded vision technologies and products at the May 2017 Embedded Vision Summit. Specifically, Iodice presents a demonstration developed by partner company Au-Zone Technologies. Using a deep neural network running on a standard embedded ARM processor (the Rockchip RK-3288), the demonstration classifies driver activity in…


“What’s Hot? The M&A and Funding Landscape for Computer Vision Companies,” a Presentation from Woodside Capital Partners

Rudy Burger, Managing Partner at Woodside Capital Partners, presents the "What’s Hot? The M&A and Funding Landscape for Computer Vision Companies" tutorial at the May 2017 Embedded Vision Summit. The six primary markets driving computer vision are automotive, sports and entertainment, consumer and mobile, robotics and machine vision, medical, and security and surveillance. This presentation…


Contact

Address

Berkeley Design Technology, Inc.
PO Box #4446
Walnut Creek, CA 94596

Phone
+1 (925) 954-1411