On December 15, 2020 at 9 am PT (noon ET), Pierrick Boulay, Market and Technology Analyst at Yole Développement, will present the free one-hour webinar “Sensor Fusion for Autonomous Vehicles,” organized by the Edge AI and Vision Alliance. Here’s the description, from the event registration page:
Advanced Driving Assistance Systems (ADAS) are based on a combination of sensors and electronic control units (ECUs). These systems have been shown to reduce road fatalities by alerting the driver to potential problems and helping avoid collisions. ADAS relies primarily on radar and cameras, along with the computing resources necessary to process the data these sensors generate.
The recent availability of more powerful computing chips and sensors has enabled the development of increasingly advanced functions, expanding beyond safety assistance to incorporate automated driving capabilities. At the sensor level, for example, some OEMs are incorporating LiDAR in addition to radar and cameras. The implementation of these autonomous features requires more sensors, more computing power and a more complex electrical/electronic (E/E) system architecture.
Traditional vehicles (cars, trucks, etc.) are no longer the only platforms becoming autonomous; a growing number of autonomous industrial devices are also being developed. These include guided vehicles in warehouses and ports, forklifts, trucks, cranes, ships, last-mile delivery robots and delivery drones.
In this presentation, leading market research firm Yole Développement will describe the increasing need for sensors, and for the “fusion” coordination of their data, in autonomous devices for both automotive and industrial applications. The presentation will cover topics such as cameras, radar, LiDAR, E/E architectures and domain controllers. A question-and-answer session will follow the presentation.