Automotive


Mercedes-Benz, NVIDIA to Create New AI Architecture for Mercedes Vehicles

NVIDIA’s Jensen Huang and Mercedes’ Sajjad Khan unveil a vision for software-defined AI cars that integrate self-driving and intelligent cockpit capabilities. Mercedes-Benz announced today it has selected NVIDIA to help realize its vision for next-generation vehicles. Speaking to a packed crowd at the Mercedes-Benz booth on the first day of CES 2019, Mercedes-Benz Executive Vice President Sajjad Khan and NVIDIA […]


“Understanding Automotive Radar: Present and Future,” a Presentation from NXP Semiconductors

Arunesh Roy, Radar Algorithms Architect at NXP Semiconductors, presents the “Understanding Automotive Radar: Present and Future” tutorial at the May 2018 Embedded Vision Summit. Thanks to its proven, all-weather range detection capability, radar is increasingly used for driver assistance functions such as automatic emergency braking and adaptive cruise control. Radar is considered a crucial sensing […]
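As background to the radar range detection mentioned in this teaser, here is a minimal sketch (not taken from the presentation) of how a linear FMCW automotive radar converts a measured beat frequency into target range; the chirp parameters below are illustrative assumptions, and Doppler shift is ignored for simplicity.

```python
# Sketch: FMCW radar range from beat frequency, ignoring Doppler.
# For a linear chirp: R = c * f_beat * T_chirp / (2 * B)

C = 299_792_458.0  # speed of light, m/s


def fmcw_range(f_beat_hz: float, chirp_time_s: float, bandwidth_hz: float) -> float:
    """Target range (m) implied by a beat frequency for one linear chirp."""
    return C * f_beat_hz * chirp_time_s / (2.0 * bandwidth_hz)


# Illustrative 77 GHz-band radar: 4 GHz sweep over a 40 us chirp.
# A 2 MHz beat frequency then corresponds to a target at roughly 3 m.
r = fmcw_range(2e6, 40e-6, 4e9)
```

Wider sweep bandwidth shortens the range resolution cell, which is one reason recent automotive radars favor the 76–81 GHz band.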


“The Role of the Cloud in Autonomous Vehicle Vision Processing: A View from the Edge,” a Presentation from NXP Semiconductors

Ali Osman Ors, Director of Automotive Microcontrollers and Processors at NXP Semiconductors, presents the “Role of the Cloud in Autonomous Vehicle Vision Processing: A View from the Edge” tutorial at the May 2018 Embedded Vision Summit. Regardless of the processing topology—distributed, centralized or hybrid—sensor processing in automotive is an edge compute problem. However, with […]


“Understanding Real-World Imaging Challenges for ADAS and Autonomous Vision Systems – IEEE P2020,” a Presentation from Algolux

Felix Heide, CTO and Co-founder of Algolux, presents the “Understanding Real-World Imaging Challenges for ADAS and Autonomous Vision Systems – IEEE P2020” tutorial at the May 2018 Embedded Vision Summit. ADAS and autonomous driving systems rely on sophisticated sensor, image processing and neural-network based perception technologies. This has resulted in effective driver assistance capabilities and […]


“Computer Vision Hardware Acceleration for Driver Assistance,” a Presentation from Bosch

Markus Tremmel, Chief Expert for ADAS at Bosch, presents the “Computer Vision Hardware Acceleration for Driver Assistance” tutorial at the May 2018 Embedded Vision Summit. With highly automated and fully automated driver assistance systems just around the corner, next-generation ADAS sensors and central ECUs will have much higher safety and functional requirements to cope […]


“How Simulation Accelerates Development of Self-Driving Technology,” a Presentation from AImotive

László Kishonti, founder and CEO of AImotive, presents the “How Simulation Accelerates Development of Self-Driving Technology” tutorial at the May 2018 Embedded Vision Summit. Virtual testing, as discussed by Kishonti in this presentation, is the only solution that scales to address the billions of miles of testing required to make autonomous vehicles robust. However, integrating […]


“Designing Smarter, Safer Cars with Embedded Vision Using EV Processor Cores,” a Presentation from Synopsys

Fergus Casey, R&D Director for ARC Processors at Synopsys, presents the “Designing Smarter, Safer Cars with Embedded Vision Using Synopsys EV Processor Cores” tutorial at the May 2018 Embedded Vision Summit. Consumers, the automotive industry and government regulators are requiring greater levels of automotive functional safety with each new generation of cars. Embedded vision, using […]


May 2018 Embedded Vision Summit Slides

The Embedded Vision Summit was held on May 21-24, 2018 in Santa Clara, California, as an educational forum for product creators interested in incorporating visual intelligence into electronic systems and software. The presentations delivered at the Summit are listed below. All of the slides from these presentations are included in […]


Computer Vision in Surround View Applications

The ability to "stitch" together (offline or in real time) multiple images taken simultaneously by multiple cameras and/or sequentially by a single camera, in both cases capturing varying viewpoints of a scene, is becoming an increasingly appealing (if not necessary) capability in an expanding variety of applications. High quality of results is a critical requirement, one […]
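The geometric core of the stitching described above is aligning overlapping views, commonly by warping pixels through a 3x3 planar homography. The following is a minimal sketch (not from the article; the matrix values are illustrative) of mapping one pixel through such a homography in homogeneous coordinates.

```python
# Sketch: mapping a pixel (x, y) through a 3x3 planar homography H.
# In a real stitching pipeline, H is estimated from matched features
# between overlapping camera views, then used to warp one image onto
# the other's coordinate frame.

from typing import Sequence, Tuple


def apply_homography(H: Sequence[Sequence[float]], x: float, y: float) -> Tuple[float, float]:
    """Map pixel (x, y) through row-major 3x3 homography H."""
    xh = H[0][0] * x + H[0][1] * y + H[0][2]
    yh = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return xh / w, yh / w  # divide out the homogeneous scale


# Illustrative H: a pure translation by (10, 5) pixels,
# so pixel (0, 0) lands at (10, 5) in the target frame.
H_shift = [[1.0, 0.0, 10.0], [0.0, 1.0, 5.0], [0.0, 0.0, 1.0]]
pt = apply_homography(H_shift, 0.0, 0.0)
```

Surround-view systems for vehicles additionally undistort fisheye inputs and blend seams photometrically, but the per-pixel warp above is the step that places each camera's view into a common ground-plane frame.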



Stereo Vision: Facing the Challenges and Seeing the Opportunities for ADAS Applications

This technical article was originally published on Texas Instruments' website (PDF). It is reprinted here with the permission of Texas Instruments. Introduction: Cameras are the most precise mechanisms used to capture accurate data at high resolution. Like human eyes, cameras capture the resolution, minutiae and vividness of a scene with such beautiful detail that no […]
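The stereo vision the article refers to recovers depth by triangulation: a scene point appears at slightly different pixel columns in the left and right images, and that disparity determines distance. Here is a minimal sketch (not from the TI article; the camera parameters are illustrative assumptions) of the standard rectified-stereo relation Z = f · B / d.

```python
# Sketch: depth from disparity for a rectified stereo camera pair.
#   Z = focal_length_px * baseline_m / disparity_px


def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth (m) of a point given its left/right pixel disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px


# Illustrative ADAS-style setup: 800 px focal length, 12 cm baseline.
# An 8 px disparity then corresponds to a point 12 m away.
z = stereo_depth(800.0, 0.12, 8.0)
```

The inverse relationship between disparity and depth is why stereo range accuracy degrades with distance, one of the challenges for ADAS applications that the article's title alludes to.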


Here you’ll find a wealth of practical technical insights and expert advice to help you bring AI and visual intelligence into your products without flying blind.

Contact

Address

Berkeley Design Technology, Inc.
PO Box #4446
Walnut Creek, CA 94596

Phone
+1 (925) 954-1411