“Automakers at a Crossroads: How Embedded Vision and Autonomy Will Reshape the Industry,” a Presentation from Lux Research

Mark Bünger, VP of Research at Lux Research, presents the "Automakers at a Crossroads: How Embedded Vision and Autonomy Will Reshape the Industry" tutorial at the May 2017 Embedded Vision Summit. The auto and telecom industries have been dreaming of connected cars for twenty years, but their results have been mediocre and mixed. Now, just […]

Embedded Vision Insights: August 18, 2017 Edition

LETTER FROM THE EDITOR Dear Colleague, TensorFlow has become a popular framework for creating machine learning-based computer vision applications, especially for the development of deep neural networks. If you’re planning to develop computer vision applications using deep learning and want to understand how to use TensorFlow to do it, then don’t miss an upcoming full-day, […]
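The newsletter excerpt above centers on using TensorFlow to build deep-learning vision applications. As a minimal, framework-free sketch of the core operation such networks are built on, here is a naive 2-D convolution in plain NumPy (illustrative only; TensorFlow's `tf.nn.conv2d` performs the batched, hardware-accelerated equivalent, using cross-correlation rather than a flipped kernel):

```python
import numpy as np

def conv2d(image, kernel):
    """Naive 2-D 'valid' convolution -- the core op behind a CNN layer.
    Illustrative sketch only, not how a framework would implement it."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            # Sum of elementwise products over the kernel-sized window
            out[y, x] = np.sum(image[y:y+kh, x:x+kw] * kernel)
    return out

# A 3x3 edge-detection kernel applied to a flat test image
edge = np.array([[-1, -1, -1], [-1, 8, -1], [-1, -1, -1]], dtype=float)
img = np.ones((5, 5))
print(conv2d(img, edge))  # all zeros: a flat image has no edges
```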

The Evolution of Deep Learning for ADAS Applications

This technical article was originally published at Synopsys' website. It is reprinted here with the permission of Synopsys. Embedded vision solutions will be a key enabler for making automobiles fully autonomous. Giving an automobile a set of eyes – in the form of multiple cameras and image sensors – is a first step, but it […]

“Introduction to Optics for Embedded Vision,” a Presentation from Edmund Optics

Jessica Gehlhar, Vision Solutions Engineer at Edmund Optics, presents the “Introduction to Optics for Embedded Vision” tutorial at the May 2017 Embedded Vision Summit. This talk provides an introduction to optics for embedded vision system and algorithm developers. Gehlhar begins by presenting fundamental imaging lens specifications and quality metrics. She explains key parameters and concepts […]
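One of the fundamental lens specifications covered in talks like this is angular field of view, which follows directly from sensor size and focal length. A minimal sketch of that relationship, under a thin-lens, rectilinear assumption (the sensor width and focal length below are made-up example values; real lens selection also weighs distortion, MTF, and working distance):

```python
import math

def horizontal_fov_deg(sensor_width_mm, focal_length_mm):
    """Horizontal angular field of view of an idealized rectilinear camera:
    FOV = 2 * atan(sensor_width / (2 * f)). Thin-lens approximation only."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Example: a sensor ~5.76 mm wide behind a 6 mm lens
print(round(horizontal_fov_deg(5.76, 6.0), 1))  # ~51.3 degrees
```

Note how halving the focal length roughly doubles the tangent of the half-angle, which is why short lenses give wide-angle views.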

“What’s Hot? The M&A and Funding Landscape for Computer Vision Companies,” a Presentation from Woodside Capital Partners

Rudy Burger, Managing Partner at Woodside Capital Partners, presents the "What’s Hot? The M&A and Funding Landscape for Computer Vision Companies" tutorial at the May 2017 Embedded Vision Summit. The six primary markets driving computer vision are automotive, sports and entertainment, consumer and mobile, robotics and machine vision, medical, and security and surveillance. This presentation […]

“Computer-vision-based 360-degree Video Systems: Architectures, Algorithms and Trade-offs,” a Presentation from videantis

Marco Jacobs, VP of Marketing at videantis, presents the "Computer-vision-based 360-degree Video Systems: Architectures, Algorithms and Trade-offs" tutorial at the May 2017 Embedded Vision Summit. 360-degree video systems use multiple cameras to capture a complete view of their surroundings. These systems are being adopted in cars, drones, virtual reality, and online streaming systems. At first […]

Intel Kick-Starts Mobileye Integration with Plans to Build Fleet of 100 L4 Autonomous Test Cars

With the completion of the tender offer for Mobileye, Intel is poised to accelerate its autonomous driving business from car-to-cloud. Mobileye, an Intel Company, will start building a fleet of fully autonomous (SAE Level 4) vehicles for testing in the United States, Israel and Europe. The first vehicles will be deployed later this year, and […]

“The Rapid Evolution and Future of Machine Perception,” a Presentation from Google

Jay Yagnik, Head of Machine Perception Research at Google, presents the "Rapid Evolution and Future of Machine Perception" tutorial at the May 2017 Embedded Vision Summit. With the advent of deep learning, our ability to build systems that derive insights from perceptual data has increased dramatically. Perceptual data dwarfs almost all other data sources in […]

“Time of Flight Sensors: How Do I Choose Them and How Do I Integrate Them?,” a Presentation from Basler

Mark Hebbel, Head of New Business Development at Basler, presents the "Time of Flight Sensors: How Do I Choose Them and How Do I Integrate Them?" tutorial at the May 2017 Embedded Vision Summit. 3D digitalization of the world is becoming more important. This additional dimension of information allows more real-world perception challenges to be […]
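The ranging principle behind the time-of-flight sensors discussed here reduces to one formula: a light pulse travels out and back, so distance = c·t/2. A minimal sketch of that arithmetic (principle only; many commercial ToF cameras actually measure the phase shift of modulated light rather than timing single pulses):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance_m(round_trip_time_s):
    """Direct time-of-flight ranging: the sensor times a light pulse's
    round trip, so the one-way distance is c * t / 2."""
    return C * round_trip_time_s / 2.0

# A 10 ns round trip corresponds to roughly 1.5 m
print(tof_distance_m(10e-9))  # ~1.499 m
```

The tiny time scales involved (nanoseconds per meter) are why ToF sensor choice hinges on timing resolution and modulation scheme.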

“Always-on Vision Becomes a Reality,” a Presentation from Qualcomm Research

Evgeni Gousev, Senior Director at Qualcomm Research, presents the "Always-On Vision Becomes a Reality" tutorial at the May 2017 Embedded Vision Summit. Intelligent devices equipped with human-like senses such as always-on touch, audio and motion detection have enabled a variety of new use cases and applications, transforming the way we interact with each other and […]

Here you’ll find a wealth of practical technical insights and expert advice to help you bring AI and visual intelligence into your products without flying blind.

Contact

Address

Berkeley Design Technology, Inc.
PO Box #4446
Walnut Creek, CA 94596

Phone
+1 (925) 954-1411