APPLICATIONS

“Time of Flight Sensors: How Do I Choose Them and How Do I Integrate Them?,” a Presentation from Basler

Mark Hebbel, Head of New Business Development at Basler, presents the "Time of Flight Sensors: How Do I Choose Them and How Do I Integrate Them?" tutorial at the May 2017 Embedded Vision Summit. 3D digitization of the world is becoming more important. This additional dimension of information allows more real-world perception challenges to be […]
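For context on the sensing principle named in the title (general background, not material from the talk itself): a pulsed time-of-flight sensor emits light, measures the round-trip time of the reflection, and converts that time to depth. A minimal sketch:

```python
# Pulsed time-of-flight: distance = (speed of light * round-trip time) / 2.
# The division by 2 accounts for the light traveling out to the target and back.
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_distance_m(round_trip_s: float) -> float:
    """Convert a measured round-trip time (seconds) to distance (meters)."""
    return SPEED_OF_LIGHT_M_S * round_trip_s / 2.0

# A 10 ns round trip corresponds to roughly 1.5 m of depth.
depth = tof_distance_m(10e-9)
```

Real ToF cameras typically estimate this time indirectly (e.g., from the phase shift of modulated light) per pixel, but the distance conversion above is the common core.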

“Computer Vision and Machine Learning at the Edge,” a Presentation from Qualcomm Technologies

Michael Mangan, a member of the product management staff at Qualcomm Technologies, presents the "Computer Vision and Machine Learning at the Edge" tutorial at the May 2017 Embedded Vision Summit. Computer vision and machine learning techniques are applied to myriad use cases in smartphones today. As mobile technology expands beyond the smartphone vertical, both technologies […]

“Choosing the Optimum Mix of Sensors for Driver Assistance and Autonomous Vehicles,” a Presentation from NXP Semiconductors

Ali Osman Ors, Director of Automotive Microcontrollers and Processors at NXP Semiconductors, presents the "Choosing the Optimum Mix of Sensors for Driver Assistance and Autonomous Vehicles" tutorial at the May 2017 Embedded Vision Summit. A diverse set of sensor technologies is available and emerging to provide vehicle autonomy or driver assistance. These sensor technologies often […]

“Implementing an Optimized CNN Traffic Sign Recognition Solution,” a Presentation from NXP Semiconductors and Au-Zone Technologies

Rafal Malewski, Head of the Graphics Technology Engineering Center at NXP Semiconductors, and Sébastien Taylor, Vision Technology Architect at Au-Zone Technologies, present the "Implementing an Optimized CNN Traffic Sign Recognition Solution" tutorial at the May 2017 Embedded Vision Summit. Now that the benefits of using deep neural networks for image classification are well known, the […]
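The excerpt ends before describing the solution itself; as generic background on what a CNN classifier computes (a toy illustration, not the optimized design from the talk), the core operation is a sliding-window correlation over the image, followed by a nonlinearity and normalized class scores:

```python
import math

def conv2d_valid(image, kernel):
    """Naive 'valid' 2-D cross-correlation (the conv-layer primitive) on nested lists."""
    kh, kw = len(kernel), len(kernel[0])
    oh, ow = len(image) - kh + 1, len(image[0]) - kw + 1
    return [[sum(image[i + di][j + dj] * kernel[di][dj]
                 for di in range(kh) for dj in range(kw))
             for j in range(ow)]
            for i in range(oh)]

def softmax(scores):
    """Normalize raw class scores into probabilities that sum to 1."""
    m = max(scores)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Toy feature extraction: a 1x2 gradient kernel responds to the vertical edge.
image = [[0, 0, 1],
         [0, 0, 1],
         [0, 0, 1]]
feature_map = conv2d_valid(image, [[-1, 1]])  # [[0, 1], [0, 1], [0, 1]]
```

A real traffic-sign recognizer stacks many such learned kernels with pooling and fully connected layers; optimizing that stack for embedded targets is the subject of the talk.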

“Developing Real-time Video Applications with CoaXPress,” a Presentation from Euresys

Jean-Michel Wintgens, Vice President of Engineering at Euresys, presents the "Developing Real-time Video Applications with CoaXPress" tutorial at the May 2017 Embedded Vision Summit. CoaXPress is a modern, high-performance video transport interface. Using a standard coaxial cable, it provides a point-to-point connection that is reliable, scalable and versatile. Wintgens shows, using real application cases and […]

“Moving CNNs from Academic Theory to Embedded Reality,” a Presentation from Synopsys

Tom Michiels, System Architect for Embedded Vision Processors at Synopsys, presents the "Moving CNNs from Academic Theory to Embedded Reality" tutorial at the May 2017 Embedded Vision Summit. In this presentation, you will learn to recognize and avoid the pitfalls of moving from an academic CNN/deep learning graph to a commercial embedded vision design. You […]
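The specific pitfalls are not spelled out in this excerpt; one widely known example when moving from academic float models to embedded hardware is precision loss from fixed-point conversion. A generic sketch of symmetric int8 weight quantization (an assumption for illustration, not Synopsys's method):

```python
def quantize_int8(weights):
    """Symmetric linear quantization: map floats to int8 with scale = max|w| / 127."""
    scale = max(abs(w) for w in weights) / 127.0 or 1.0  # guard against all-zero weights
    q = [max(-128, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights; the difference is the quantization error."""
    return [v * scale for v in q]

weights = [1.0, -0.5, 0.25]
q, scale = quantize_int8(weights)
recovered = dequantize(q, scale)  # close to, but not exactly, the originals
```

Per-layer (or per-channel) scales and a check of accumulated error against a validation set are the usual next steps before committing a graph to an embedded vision processor.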

“A Multi-purpose Vision Processor for Embedded Systems,” a Presentation from Allied Vision

Michael Melle, Sales Development Manager at Allied Vision, and Felix Nikolaus, Firmware Designer at Allied Vision, present the "A Multi-purpose Vision Processor for Embedded Systems" tutorial at the May 2017 Embedded Vision Summit. This presentation gives an overview of an innovative vision processor that delivers the superior image quality of industrial cameras while enabling the […]

Here you’ll find a wealth of practical technical insights and expert advice to help you bring AI and visual intelligence into your products without flying blind.

Contact

Address

Berkeley Design Technology, Inc.
PO Box #4446
Walnut Creek, CA 94596

Phone
+1 (925) 954-1411