Automotive Applications for Embedded Vision
Vision products in automotive applications can make us better and safer drivers
Vision products in automotive applications can serve to enhance the driving experience by making us better and safer drivers through both driver and road monitoring.
Driver monitoring applications use computer vision to ensure that the driver remains alert and awake while operating the vehicle. These systems can monitor head movement and body language for signs that the driver is drowsy and therefore a threat to others on the road. They can also detect distracted behaviors such as texting or eating, responding with a friendly reminder that encourages the driver to refocus on the road.
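The eye-closure side of such monitoring is often built on simple geometric cues from facial landmarks. As an illustrative sketch (the common eye-aspect-ratio heuristic, not any particular vendor's method), the code below flags prolonged eye closure from six eye-landmark coordinates; the sample landmarks, the 0.2 threshold, and the frame count are all assumptions chosen for illustration.

```python
import math

def eye_aspect_ratio(eye):
    """Eye aspect ratio (EAR) from six (x, y) landmarks ordered
    p1..p6 around the eye; EAR falls toward 0 as the eye closes."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    p1, p2, p3, p4, p5, p6 = eye
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

def is_drowsy(ear_history, threshold=0.2, min_frames=15):
    """Flag drowsiness when EAR stays below the threshold for
    min_frames consecutive frames (both values are assumptions)."""
    run = 0
    for ear in ear_history:
        run = run + 1 if ear < threshold else 0
        if run >= min_frames:
            return True
    return False

# Hypothetical landmark sets: an open eye is tall relative to its
# width (high EAR); a nearly closed eye is flat (low EAR).
open_eye = [(0, 0), (2, -2), (4, -2), (6, 0), (4, 2), (2, 2)]
closed_eye = [(0, 0), (2, -0.3), (4, -0.3), (6, 0), (4, 0.3), (2, 0.3)]

print(round(eye_aspect_ratio(open_eye), 2))    # → 0.67
print(round(eye_aspect_ratio(closed_eye), 2))  # → 0.1
```

In a real system the landmarks would come from a face-landmark detector running on each video frame; the EAR heuristic simply turns those points into a one-number signal that is cheap to threshold over time.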
In addition to monitoring activities occurring inside the vehicle, exterior applications such as lane departure warning systems can apply lane detection algorithms to video to recognize lane markings and road edges and estimate the car's position within the lane. The driver can then be warned of unintentional lane departure. Solutions also exist to read roadside warning signs and alert the driver if they are not heeded, as well as for collision mitigation, blind spot detection, park and reverse assistance, self-parking vehicles, and event data recording.
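The position estimate at the heart of a lane departure warning can be illustrated with a small sketch. Assuming a lane detector has already produced the image-plane x-coordinates of the left and right lane markings at the bottom of the frame (the pixel values and warning threshold below are hypothetical), the vehicle's offset from the lane center follows directly:

```python
def lane_offset(left_x, right_x, image_width):
    """Normalized lateral offset of the camera (assumed centered on
    the vehicle) from the lane center: 0.0 = centered,
    +/-1.0 = directly over a lane marking."""
    lane_center = (left_x + right_x) / 2.0
    half_lane = (right_x - left_x) / 2.0
    camera_x = image_width / 2.0
    return (camera_x - lane_center) / half_lane

def departure_warning(offset, threshold=0.8):
    """Warn when the vehicle drifts past a chosen fraction of the
    half-lane width (the threshold is an illustrative assumption)."""
    if offset > threshold:
        return "drifting right"
    if offset < -threshold:
        return "drifting left"
    return "in lane"

# Hypothetical detections in a 1280-pixel-wide frame
print(departure_warning(lane_offset(400, 880, 1280)))  # → in lane
print(departure_warning(lane_offset(150, 630, 1280)))  # → drifting right
```

A production system would additionally check the turn-signal state and steering input so that intentional lane changes do not trigger the warning.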
Eventually, this technology will lead to cars with self-driving capability; Google, for example, is already testing prototypes. However, many automotive industry experts believe that the goal of vision in vehicles is not so much to eliminate the driving experience as to make it safer, at least in the near term.
What’s New: The Institute of Electrical and Electronics Engineers (IEEE) has approved a proposal to develop a standard for safety considerations in automated vehicle (AV) decision-making and named Intel Senior Principal Engineer Jack Weast to lead the workgroup. Participation in the workgroup is open to companies across the AV industry, and Weast hopes for broad participation.
“Improving the Safety and Performance of Automated Vehicles Through Precision Localization,” a Presentation from VSI Labs
Phil Magney, founder of VSI Labs, presents the “Improving the Safety and Performance of Automated Vehicles Through Precision Localization” tutorial at the May 2019 Embedded Vision Summit. How does a self-driving car know where it is? Magney explains how autonomous vehicles localize themselves against their surroundings through the use of a variety of sensors.
Gergely Debreczeni, Chief Scientist at AImotive, presents the “Distance Estimation Solutions for ADAS and Automated Driving” tutorial at the May 2019 Embedded Vision Summit. Distance estimation is at the heart of automotive driver assistance systems (ADAS) and automated driving (AD). Simply stated, safe operation of vehicles requires robust distance estimation. Many different types of sensors
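One common monocular approach to the distance estimation discussed above is the pinhole camera model: if an object's real-world height and the camera's focal length are known, its distance follows from its apparent height in pixels via similar triangles. This is a generic textbook technique, not a method from the talk, and the focal length and vehicle height below are illustrative assumptions.

```python
def estimate_distance(focal_length_px, real_height_m, pixel_height):
    """Pinhole-model distance: similar triangles give
    distance = f * H / h, with f and h in pixels and H in meters."""
    return focal_length_px * real_height_m / pixel_height

# Hypothetical setup: 1000 px focal length, a 1.5 m tall car
# appearing 60 px tall in the image.
print(estimate_distance(1000, 1.5, 60))  # → 25.0 meters
```

Robust ADAS systems rarely rely on this single cue; they typically fuse it with stereo disparity, radar, or lidar measurements, which is exactly why sensor diversity matters for safe operation.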
“Can We Have Both Safety and Performance in AI for Autonomous Vehicles?,” a Presentation from Codeplay Software
Andrew Richards, CEO and Co-founder of Codeplay Software, presents the “Can We Have Both Safety and Performance in AI for Autonomous Vehicles?” tutorial at the May 2019 Embedded Vision Summit. The need for ensuring safety in AI subsystems within autonomous vehicles is obvious. How to achieve it is not.
Tom Wilson, Vice President of Automotive at Graphcore, presents the “DNN Challenges and Approaches for L4/L5 Autonomous Vehicles” tutorial at the May 2019 Embedded Vision Summit. The industry has made great strides in development of L4/L5 autonomous vehicles, but what’s available today falls far short of expectations set as recently as two to three years ago.
David Julian, CTO and Founder of Netradyne, presents the “Addressing Corner Cases in Embedded Computer Vision Applications” tutorial at the May 2019 Embedded Vision Summit. Many embedded vision applications require solutions that are robust in the face of very diverse real-world inputs. For example, in automotive applications, vision-based safety systems may encounter unusual object configurations.
“What’s Changing in Autonomous Vehicle Investments Worldwide — and Why?,” a Presentation from Woodside Capital Partners
Rudy Burger, Managing Partner at Woodside Capital Partners, presents the "What’s Changing in Autonomous Vehicle Investments Worldwide—and Why?" tutorial at the May 2019 Embedded Vision Summit. So far, over $100B has been invested by industry into the development of autonomous vehicles (AVs), and the pace of investment has recently accelerated.
Burkhard Huhnke, Vice President of Automotive Strategy for Synopsys, presents the "Making Cars That See—Failure is Not an Option" tutorial at the May 2019 Embedded Vision Summit. Drivers are the biggest source of uncertainty in the operation of cars. Computer vision is helping to eliminate human error and make the roads safer.
Ian Riches, Executive Director for Global Automotive Practice at Strategy Analytics, presents the "Automotive Vision Systems—Seeing the Way Forward" tutorial at the May 2019 Embedded Vision Summit. It was not long ago that cameras were a rarity on all but luxury cars. In 2018, as many automotive cameras were shipped as were vehicles!
László Kishonti, CEO of AImotive, presents the "Shifts in the Automated Driving Industry" tutorial at the May 2019 Embedded Vision Summit. 2018 will have a lasting effect on the self-driving industry, as key stakeholders have turned from the unattainable goal of full autonomy by 2021 to more realistic development and productization roadmaps.