Automotive

May 2014 Embedded Vision Alliance Member Meeting Presentation: “Cameras In, On, and Around the Car: Understanding and Sizing Marketing Opportunities,” Roger Lanctot, Strategy Analytics

Roger Lanctot, Associate Director of the Automotive Practice at Strategy Analytics, delivers the presentation "Cameras In, On, and Around the Car: Understanding and Sizing Marketing Opportunities" at the May 2014 Embedded Vision Alliance Member Meeting.

“Self-Driving Cars,” an Embedded Vision Summit Keynote Presentation from Google

Nathaniel Fairfield, Technical Lead at Google, presents the "Self-Driving Cars" keynote at the May 2014 Embedded Vision Summit. Self-driving cars have the potential to transform how we move: they promise to make us safer, give freedom to millions of people who can't drive, and give people back their time. The Google Self-Driving Car project was

Real-Time Traffic Sign Recognition on Mobile Processors

There is a growing need for fast and power-efficient computer vision on embedded devices. This session will focus on computer vision capabilities on embedded platforms available to ADAS developers, covering the OpenCV CUDA implementation and the new computer vision standard, OpenVX. In addition, Itseez traffic sign detection will be showcased. The algorithm is capable of

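The Itseez detector itself is not published here. As a rough illustration of the kind of per-frame work such a pipeline performs, the sketch below (plain OpenCV 4.x in Python; the helper name find_red_sign_candidates and the threshold values are illustrative assumptions, not the presented algorithm) isolates red-rimmed sign candidates by color segmentation, the sort of first stage that a classifier would then refine.

```python
# Illustrative sketch only -- not the Itseez algorithm from the talk.
# Assumes OpenCV 4.x; thresholds are rough, untuned values.
import cv2
import numpy as np

def find_red_sign_candidates(bgr_frame, min_area=400):
    """Return bounding boxes of red-dominant regions that might be signs."""
    hsv = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2HSV)
    # Red wraps around the hue axis, so combine two hue ranges.
    mask = cv2.inRange(hsv, (0, 80, 60), (10, 255, 255)) | \
           cv2.inRange(hsv, (170, 80, 60), (180, 255, 255))
    # Close small gaps before extracting connected regions.
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    # Each surviving box would then be passed to a sign classifier stage.
    return [cv2.boundingRect(c) for c in contours
            if cv2.contourArea(c) >= min_area]
```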

Improved Vision Processors, Sensors Enable Proliferation of New and Enhanced ADAS Functions

This article was originally published at John Day's Automotive Electronics News. It is reprinted here with the permission of JHDay Communications. Thanks to the emergence of increasingly capable and cost-effective processors, image sensors, memories and other semiconductor devices, along with robust algorithms, it's now practical to incorporate computer vision into a wide range of embedded

October 2013 Embedded Vision Summit Technical Presentation: “Vision-Based Gesture User Interfaces,” Francis MacDougall, Qualcomm

Francis MacDougall, Senior Director of Technology at Qualcomm, presents the "Vision-Based Gesture User Interfaces" tutorial within the "Vision Applications" technical session at the October 2013 Embedded Vision Summit East. MacDougall explains how gestures fit into the spectrum of advanced user interface options, compares and contrasts the various 2-D and 3-D technologies (vision and other) available

October 2013 Embedded Vision Summit Technical Presentation: “Better Image Understanding Through Better Sensor Understanding,” Michael Tusch, Apical

Michael Tusch, Founder and CEO of Apical Imaging, presents the "Better Image Understanding Through Better Sensor Understanding" tutorial within the "Front-End Image Processing for Vision Applications" technical session at the October 2013 Embedded Vision Summit East. One of the main barriers to widespread use of embedded vision is its reliability. For example, systems which detect

October 2013 Embedded Vision Summit Technical Presentation: “Designing a Multi-Core Architecture Tailored for Pedestrian Detection Algorithms,” Tom Michiels, Synopsys

Tom Michiels, R&D Manager at Synopsys, presents the "Designing a Multi-Core Architecture Tailored for Pedestrian Detection Algorithms" tutorial within the "Algorithms and Implementations" technical session at the October 2013 Embedded Vision Summit East. Pedestrian detection is an important function in a wide range of applications, including automotive safety systems, mobile applications, and industrial automation. A

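For readers who want a concrete software baseline for the workload discussed, the sketch below uses OpenCV's stock HOG-plus-linear-SVM people detector in Python. It is not the Synopsys architecture or algorithm from the presentation, just a widely used reference implementation of pedestrian detection; the parameter values shown are common defaults, not tuned settings.

```python
# Baseline pedestrian detection with OpenCV's built-in HOG + linear SVM.
# Illustrative only -- unrelated to the multi-core design in the talk.
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def detect_pedestrians(bgr_frame):
    """Return bounding boxes (x, y, w, h) of detected pedestrians."""
    rects, weights = hog.detectMultiScale(
        bgr_frame, winStride=(8, 8), padding=(8, 8), scale=1.05)
    return rects
```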

September 2013 Qualcomm UPLINQ Conference Presentation: “Accelerating Computer Vision Applications with the Hexagon DSP,” Eric Gregori, BDTI

Eric Gregori, Senior Software Engineer at BDTI, presents the "Accelerating Computer Vision Applications with the Hexagon DSP" tutorial at the September 2013 Qualcomm UPLINQ Conference. Smartphones, tablets and embedded systems increasingly use sophisticated vision algorithms to deliver capabilities like augmented reality and gesture user interfaces. Since vision algorithms are computationally demanding, a key challenge when

October 2013 Embedded Vision Summit Technical Presentation: “Embedded Lucas-Kanade Tracking: How it Works, How to Implement It, and How to Use It,” Goksel Dedeoglu, Texas Instruments

Goksel Dedeoglu, Embedded Vision R&D Manager at Texas Instruments, presents the "Embedded Lucas-Kanade Tracking: How it Works, How to Implement It, and How to Use It" tutorial within the "Algorithms and Implementations" technical session at the October 2013 Embedded Vision Summit East. This tutorial is intended for technical audiences interested in learning about the Lucas-Kanade

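As a concrete companion to the tutorial topic, the sketch below shows pyramidal Lucas-Kanade point tracking using OpenCV's calcOpticalFlowPyrLK in Python. The window size, pyramid depth, and termination criteria are illustrative defaults, not TI's embedded implementation or tuning.

```python
# Pyramidal Lucas-Kanade point tracking with OpenCV -- illustrative sketch,
# not the TI embedded implementation described in the talk.
import cv2
import numpy as np

def track_points(prev_gray, next_gray):
    """Detect corners in prev_gray and track them into next_gray."""
    p0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                 qualityLevel=0.01, minDistance=7)
    if p0 is None:
        return np.empty((0, 2)), np.empty((0, 2))
    p1, status, err = cv2.calcOpticalFlowPyrLK(
        prev_gray, next_gray, p0, None,
        winSize=(21, 21), maxLevel=3,
        criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01))
    good = status.ravel() == 1
    # Return matched point pairs: positions in the previous and next frames.
    return p0[good].reshape(-1, 2), p1[good].reshape(-1, 2)
```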

October 2013 Embedded Vision Summit Technical Presentation: “Feature Detection: How It Works, When to Use It, and a Sample Implementation,” Marco Jacobs, videantis

Marco Jacobs, Technical Marketing Director at videantis, presents the "Feature Detection: How It Works, When to Use It, and a Sample Implementation" tutorial within the "Object and Feature Detection" technical session at the October 2013 Embedded Vision Summit East. Feature detection and tracking are key components of many computer vision applications. In this talk, Jacobs

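As one concrete instance of the detect-describe-match pipeline the talk surveys, the sketch below uses ORB features in OpenCV (Python). The talk itself is not tied to ORB, and the match_features helper and its parameters are assumptions made for illustration.

```python
# Feature detection and matching with ORB in OpenCV -- one example of the
# generic detect/describe/match pipeline; illustrative only.
import cv2

orb = cv2.ORB_create(nfeatures=500)

def match_features(gray_a, gray_b, max_matches=50):
    """Detect ORB keypoints in two grayscale images and match them."""
    kp_a, des_a = orb.detectAndCompute(gray_a, None)
    kp_b, des_b = orb.detectAndCompute(gray_b, None)
    if des_a is None or des_b is None:
        return []
    # Hamming distance suits ORB's binary descriptors; cross-check for symmetry.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_a, des_b), key=lambda m: m.distance)
    return matches[:max_matches]
```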

Here you’ll find a wealth of practical technical insights and expert advice to help you bring AI and visual intelligence into your products without flying blind.

Contact

Address

Berkeley Design Technology, Inc.
PO Box #4446
Walnut Creek, CA 94596

Phone
+1 (925) 954-1411