FUNCTIONS

“How to Test and Validate an Automated Driving System,” a Presentation from MathWorks

Avinash Nehemiah, Product Marketing Manager for Computer Vision at MathWorks, presents the "How to Test and Validate an Automated Driving System" tutorial at the May 2017 Embedded Vision Summit. Have you ever wondered how ADAS and autonomous driving systems are tested? Automated driving systems combine a diverse set of technologies and engineering skill sets from […]
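
The excerpt stops before the testing workflow itself, and MathWorks' own simulation and verification tooling is not reproduced here. Purely as an illustration of one step such a workflow typically includes, the Python sketch below scores a perception module's detections against simulated ground truth using intersection over union; the function names, the 0.5 threshold and the example boxes are all hypothetical.

```python
# Illustrative only: scoring 2-D bounding-box detections against simulated
# ground truth with intersection-over-union (IoU). Not MathWorks' tooling.

def iou(a, b):
    """IoU of two boxes given as (x_min, y_min, x_max, y_max)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def score_frame(detections, ground_truth, threshold=0.5):
    """Greedy one-to-one matching; returns (true_pos, false_pos, misses)."""
    unmatched = list(ground_truth)
    tp = 0
    for det in detections:
        best = max(unmatched, key=lambda gt: iou(det, gt), default=None)
        if best is not None and iou(det, best) >= threshold:
            unmatched.remove(best)
            tp += 1
    return tp, len(detections) - tp, len(unmatched)

# Hypothetical frame: one correct detection, one false alarm, one miss.
print(score_frame(detections=[(10, 10, 50, 50), (200, 80, 240, 120)],
                  ground_truth=[(12, 11, 52, 49), (300, 300, 340, 340)]))
```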

“Approaches for Vision-based Driver Monitoring,” a Presentation from PathPartner Technology

Jayachandra Dakala, Technical Architect at PathPartner Technology, presents the "Approaches for Vision-based Driver Monitoring" tutorial at the May 2017 Embedded Vision Summit. Since many road accidents are caused by driver inattention, assessing driver attention is important for preventing accidents. Distraction caused by other activities and sleepiness due to fatigue are the main causes of driver […]
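
PathPartner's specific approach is not detailed in this excerpt. As one widely used vision cue for the sleepiness half of the problem, the sketch below computes the eye aspect ratio (EAR) over six eye landmarks and raises an alert when the eyes stay closed for several consecutive frames; the landmark detector is assumed to run upstream, and the threshold and frame-count values are illustrative, not PathPartner's.

```python
# Illustrative drowsiness cue: eye aspect ratio (EAR) over six eye landmarks
# p1..p6. Landmarks are assumed to come from an upstream face/landmark
# detector; threshold and frame-count values are illustrative.
import numpy as np

def eye_aspect_ratio(landmarks):
    """landmarks: (6, 2) array ordered p1..p6 around one eye."""
    p1, p2, p3, p4, p5, p6 = np.asarray(landmarks, dtype=float)
    vertical = np.linalg.norm(p2 - p6) + np.linalg.norm(p3 - p5)
    horizontal = np.linalg.norm(p1 - p4)
    return vertical / (2.0 * horizontal)

class DrowsinessMonitor:
    """Raises an alert when EAR stays below a threshold for N frames."""
    def __init__(self, ear_threshold=0.21, closed_frames=15):
        self.ear_threshold = ear_threshold
        self.closed_frames = closed_frames
        self.counter = 0

    def update(self, left_eye, right_eye):
        ear = (eye_aspect_ratio(left_eye) + eye_aspect_ratio(right_eye)) / 2.0
        self.counter = self.counter + 1 if ear < self.ear_threshold else 0
        return self.counter >= self.closed_frames  # True -> raise an alert
```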

“PCI Express – A High-bandwidth Interface for Multi-camera Embedded Systems,” a Presentation from XIMEA

Max Larin, CEO of XIMEA, presents the "PCI Express – A High-bandwidth Interface for Multi-camera Embedded Systems" tutorial at the May 2017 Embedded Vision Summit. In this presentation, Larin provides an overview of existing camera interfaces for embedded systems and explores their strengths and weaknesses. He also examines the differences between integration of a sensor […]
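
As a back-of-the-envelope companion to the interface comparison, the sketch below estimates the raw data rate a given sensor configuration produces and how many PCIe lanes of each generation would cover it. The per-lane figures are approximate theoretical link rates after line coding and before protocol overhead, and the 12-megapixel example sensor is made up.

```python
# Rough sizing: raw sensor throughput vs. approximate PCIe per-lane rates.
# Per-lane figures are theoretical link rates after line coding, before
# protocol overhead; real sustained rates are lower.
PCIE_LANE_MB_S = {"gen1": 250, "gen2": 500, "gen3": 985}

def sensor_throughput_mb_s(width, height, bits_per_pixel, fps):
    """Raw image-data rate in megabytes per second (1 MB = 1e6 bytes)."""
    return width * height * bits_per_pixel * fps / 8 / 1e6

def lanes_needed(throughput_mb_s, generation):
    lane = PCIE_LANE_MB_S[generation]
    return -(-throughput_mb_s // lane)  # ceiling division

# Hypothetical 12 MP sensor, 10-bit pixels, 60 fps:
rate = sensor_throughput_mb_s(4096, 3000, 10, 60)
print(f"{rate:.0f} MB/s -> {lanes_needed(rate, 'gen2'):.0f}x Gen2 lanes, "
      f"{lanes_needed(rate, 'gen3'):.0f}x Gen3 lanes")
```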

“Embedded Vision Made Smart: Introduction to the HALCON Embedded Machine Vision Library,” a Presentation from MVTec

Olaf Munkelt, Co-founder and Managing Director at MVTec Software GmbH, presents the "Embedded Vision Made Smart: Introduction to the HALCON Embedded Machine Vision Library" tutorial at the May 2017 Embedded Vision Summit. In this presentation, Munkelt demonstrates how easy it is to develop an embedded vision (identification) application based on the HALCON Embedded standard software […]
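
HALCON's own operator set is not reproduced here. Purely to make the idea of an "identification" application concrete for readers without HALCON, the sketch below performs the same class of task, locating and decoding a code in an image, using OpenCV's QR-code detector as a stand-in; it does not reflect the HALCON Embedded API.

```python
# Stand-in identification example using OpenCV's QR-code detector; this is
# NOT the HALCON Embedded API, just the same class of task made concrete.
import cv2

def decode_qr(image_path):
    image = cv2.imread(image_path)
    if image is None:
        raise FileNotFoundError(image_path)
    detector = cv2.QRCodeDetector()
    text, corners, _ = detector.detectAndDecode(image)
    return (text, corners) if text else (None, None)

# Hypothetical usage:
# text, corners = decode_qr("part_label.png")
# print("decoded:", text)
```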

“Adventures in DIY Embedded Vision: The Can’t-miss Dartboard,” a Presentation from Mark Rober

Engineer, inventor and YouTube personality Mark Rober presents the "Adventures in DIY Embedded Vision: The Can’t-miss Dartboard" tutorial at the May 2017 Embedded Vision Summit. Can a mechanical engineer with no background in computer vision build a complex, robust, real-time computer vision system? Yes, with a little help from his friends. Rober fulfilled a three-year […]

“Performing Multiple Perceptual Tasks With a Single Deep Neural Network,” a Presentation from Magic Leap

Andrew Rabinovich, Director of Deep Learning at Magic Leap, presents the "Performing Multiple Perceptual Tasks With a Single Deep Neural Network" tutorial at the May 2017 Embedded Vision Summit. As more system developers consider incorporating visual perception into smart devices such as self-driving cars, drones and wearable computers, attention is shifting toward practical formulation and […]
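
The excerpt cuts off before the formulation itself, and Magic Leap's actual network is not shown here. The sketch below is only a minimal PyTorch illustration of the general idea of a single network serving several perceptual tasks: one shared backbone feeds task-specific heads, and training minimizes a weighted sum of per-task losses. The layer sizes, the two example tasks and the loss weights are placeholders.

```python
# Minimal multi-task illustration (not Magic Leap's model): a shared
# backbone with per-task heads, trained on a weighted sum of task losses.
import torch
import torch.nn as nn

class MultiTaskNet(nn.Module):
    def __init__(self, num_classes=10, num_keypoints=8):
        super().__init__()
        self.backbone = nn.Sequential(           # shared feature extractor
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())
        self.cls_head = nn.Linear(32, num_classes)        # task 1: classification
        self.kpt_head = nn.Linear(32, num_keypoints * 2)  # task 2: keypoint regression

    def forward(self, x):
        feats = self.backbone(x)
        return self.cls_head(feats), self.kpt_head(feats)

model = MultiTaskNet()
cls_loss, kpt_loss = nn.CrossEntropyLoss(), nn.SmoothL1Loss()
images = torch.randn(4, 3, 64, 64)          # placeholder batch
labels = torch.randint(0, 10, (4,))
keypoints = torch.randn(4, 16)

logits, kpts = model(images)
loss = 1.0 * cls_loss(logits, labels) + 0.5 * kpt_loss(kpts, keypoints)  # weighted sum
loss.backward()
```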

“Using Satellites to Extract Insights on the Ground,” a Presentation from Orbital Insight

Boris Babenko, Senior Software Engineer at Orbital Insight, presents the "Using Satellites to Extract Insights on the Ground" tutorial at the May 2017 Embedded Vision Summit. Satellites are great for seeing the world at scale, but analyzing petabytes of images can be extremely time-consuming for humans alone. This is why machine vision is a perfect […]
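
Orbital Insight's pipeline is not described in this excerpt, but a common pattern when applying detectors to very large scenes is to slice them into fixed-size, overlapping tiles and aggregate per-tile results (in practice, detections falling in the overlap regions also need de-duplication). The sketch below is a generic tiling helper; the tile size, overlap and dummy detector are arbitrary.

```python
# Generic tiling helper for running a detector over a very large image in
# fixed-size, overlapping chips; tile size and overlap here are arbitrary.
import numpy as np

def iter_tiles(image, tile=512, overlap=64):
    """Yield (x, y, chip) windows covering the whole image."""
    h, w = image.shape[:2]
    step = tile - overlap
    for y in range(0, max(h - overlap, 1), step):
        for x in range(0, max(w - overlap, 1), step):
            yield x, y, image[y:y + tile, x:x + tile]

def count_objects(image, detect_fn):
    """Sum per-tile counts from a user-supplied detector (hypothetical)."""
    return sum(len(detect_fn(chip)) for _, _, chip in iter_tiles(image))

# Hypothetical scene and dummy detector that finds nothing:
scene = np.zeros((4000, 6000, 3), dtype=np.uint8)
print(count_objects(scene, detect_fn=lambda chip: []))  # -> 0
```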

“Designing Vision Systems for Human Operators and Workflows: A Case Study,” a Presentation from 8tree

Arun Chhabra, CEO of 8tree, presents the "Designing Vision Systems for Human Operators and Workflows: A Case Study" tutorial at the May 2017 Embedded Vision Summit. During the past several decades, manual measurement methods – using rulers and dial gauges – have been the status quo for inspecting dents, bumps, lightning strikes and corrosion blend-out […]

“Time of Flight Sensors: How Do I Choose Them and How Do I Integrate Them?,” a Presentation from Basler

Mark Hebbel, Head of New Business Development at Basler, presents the "Time of Flight Sensors: How Do I Choose Them and How Do I Integrate Them?" tutorial at the May 2017 Embedded Vision Summit. 3D digitalization of the world is becoming more important. This additional dimension of information allows more real-world perception challenges to be […]
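
As background for the selection questions in the title, two standard relationships for continuous-wave time-of-flight sensors follow directly from the modulation frequency: distance is recovered from the measured phase shift, and the same frequency sets the unambiguous range. The sketch below simply evaluates both; the 20 MHz example frequency is arbitrary, not a Basler specification.

```python
# Continuous-wave ToF relationships: distance from measured phase shift,
# and the unambiguous range set by the modulation frequency.
import math

C = 299_792_458.0  # speed of light, m/s

def distance_from_phase(phase_rad, mod_freq_hz):
    """d = c * phi / (4 * pi * f_mod)."""
    return C * phase_rad / (4 * math.pi * mod_freq_hz)

def unambiguous_range(mod_freq_hz):
    """Maximum distance before the phase wraps: c / (2 * f_mod)."""
    return C / (2 * mod_freq_hz)

# Example with a hypothetical 20 MHz modulation frequency.
f = 20e6
print(f"unambiguous range: {unambiguous_range(f):.2f} m")             # ~7.49 m
print(f"phase pi/2 -> {distance_from_phase(math.pi / 2, f):.2f} m")   # ~1.87 m
```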

“Always-on Vision Becomes a Reality,” a Presentation from Qualcomm Research

Evgeni Gousev, Senior Director at Qualcomm Research, presents the "Always-on Vision Becomes a Reality" tutorial at the May 2017 Embedded Vision Summit. Intelligent devices equipped with human-like senses such as always-on touch, audio and motion detection have enabled a variety of new use cases and applications, transforming the way we interact with each other and […]
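
The low-power hardware that makes always-on vision practical cannot be captured in a snippet, but the system-level pattern (keep a very cheap check running continuously and wake the heavier vision pipeline only when something changes) can be sketched. Below is a generic frame-differencing gate with arbitrary thresholds; it is not Qualcomm's design.

```python
# Generic "always-on" gate (not Qualcomm's design): a cheap frame-difference
# check decides when to wake a heavier vision pipeline. Thresholds arbitrary.
import numpy as np

class MotionGate:
    def __init__(self, pixel_delta=15, active_fraction=0.02):
        self.pixel_delta = pixel_delta          # per-pixel change threshold
        self.active_fraction = active_fraction  # fraction of changed pixels
        self.previous = None

    def should_wake(self, gray_frame):
        """Return True when enough pixels changed since the last frame."""
        frame = gray_frame.astype(np.int16)
        if self.previous is None:
            self.previous = frame
            return False
        changed = np.abs(frame - self.previous) > self.pixel_delta
        self.previous = frame
        return changed.mean() > self.active_fraction

# Hypothetical use: feed low-resolution grayscale frames from a sensor.
gate = MotionGate()
frame_a = np.zeros((120, 160), dtype=np.uint8)
frame_b = frame_a.copy(); frame_b[40:80, 60:100] = 200        # simulated motion
print(gate.should_wake(frame_a), gate.should_wake(frame_b))   # False True
```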
