Automotive Applications for Embedded Vision
Vision products in automotive applications can make us better and safer drivers
Vision products in automotive applications can serve to enhance the driving experience by making us better and safer drivers through both driver and road monitoring.
Driver monitoring applications use computer vision to ensure that the driver remains alert and awake while operating the vehicle. These systems can monitor head movement and body language for indications that the driver is drowsy and thus poses a threat to others on the road. They can also watch for distracted-driving behaviors such as texting or eating, responding with a friendly reminder that encourages the driver to focus on the road instead.
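As a concrete illustration of how such a system might flag drowsiness, the sketch below tracks the eye aspect ratio (EAR), a widely used per-frame cue for eye closure. It assumes that eye landmarks are supplied by a separate face-landmark detector, and the threshold and frame-count values are illustrative assumptions rather than values from any particular product.

```python
# Minimal sketch of one common drowsiness cue: the eye aspect ratio (EAR).
# Landmark coordinates are assumed to come from any face-landmark detector;
# the 6-point eye layout and the thresholds below are illustrative assumptions.

import numpy as np

def eye_aspect_ratio(eye: np.ndarray) -> float:
    """eye: 6x2 array of (x, y) landmarks ordered around the eye contour."""
    vertical_1 = np.linalg.norm(eye[1] - eye[5])
    vertical_2 = np.linalg.norm(eye[2] - eye[4])
    horizontal = np.linalg.norm(eye[0] - eye[3])
    return (vertical_1 + vertical_2) / (2.0 * horizontal)

class DrowsinessMonitor:
    """Flags drowsiness when the EAR stays below a threshold for many consecutive frames."""

    def __init__(self, ear_threshold: float = 0.21, frames_to_alert: int = 48):
        self.ear_threshold = ear_threshold      # eyes considered closed below this value
        self.frames_to_alert = frames_to_alert  # roughly 2 s of closed eyes at 24 fps
        self.closed_frames = 0

    def update(self, left_eye: np.ndarray, right_eye: np.ndarray) -> bool:
        ear = (eye_aspect_ratio(left_eye) + eye_aspect_ratio(right_eye)) / 2.0
        self.closed_frames = self.closed_frames + 1 if ear < self.ear_threshold else 0
        return self.closed_frames >= self.frames_to_alert  # True -> raise a drowsiness alert
```

In practice the same landmark stream can also feed head-pose and gaze checks, with the per-frame results smoothed over time before any reminder is issued.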
In addition to monitoring activities occurring inside the vehicle, exterior applications such as lane departure warning systems can use video with lane detection algorithms to recognize the lane markings and road edges and estimate the position of the car within the lane. The driver can then be warned in cases of unintentional lane departure. Solutions exist to read roadside warning signs and to alert the driver if they are not heeded, as well as for collision mitigation, blind spot detection, park and reverse assist, self-parking vehicles and event-data recording.
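To make the lane-position idea concrete, the sketch below uses a classical Canny-plus-Hough pipeline in OpenCV to estimate where the lane center sits relative to the image center and to flag an excessive drift. The region of interest, thresholds, and departure margin are illustrative assumptions, not values from any production lane departure warning system.

```python
# Minimal sketch of a lane-departure check using classical computer vision
# (Canny edges + probabilistic Hough transform via OpenCV). All geometry and
# threshold choices here are illustrative assumptions.

import cv2
import numpy as np

def estimate_lane_offset(frame: np.ndarray) -> float | None:
    """Return the normalized offset of the image center from the lane center
    (-1 .. 1), or None if no lane markings were found."""
    h, w = frame.shape[:2]
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(cv2.GaussianBlur(gray, (5, 5), 0), 50, 150)

    # Keep only the lower half of the image, where lane markings typically appear.
    mask = np.zeros_like(edges)
    mask[h // 2:, :] = 255
    edges = cv2.bitwise_and(edges, mask)

    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=40,
                            minLineLength=40, maxLineGap=100)
    if lines is None:
        return None

    left_x, right_x = [], []
    for x1, y1, x2, y2 in lines[:, 0]:
        if x2 == x1:
            continue
        slope = (y2 - y1) / (x2 - x1)
        if slope < -0.3:        # left lane marking slants this way in image space
            left_x.extend([x1, x2])
        elif slope > 0.3:       # right lane marking
            right_x.extend([x1, x2])
    if not left_x or not right_x:
        return None

    lane_center = (np.mean(left_x) + np.mean(right_x)) / 2.0
    return (w / 2.0 - lane_center) / (w / 2.0)

def departure_warning(frame: np.ndarray, margin: float = 0.25) -> bool:
    """True when the vehicle has drifted more than `margin` from the lane center."""
    offset = estimate_lane_offset(frame)
    return offset is not None and abs(offset) > margin
```

Production systems typically replace this simple pipeline with learned lane models and fuse the result with turn-signal state so that intentional lane changes do not trigger a warning.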
Eventually, this technology will lead to cars with self-driving capability; Google, for example, is already testing prototypes. However, many automotive industry experts believe that the goal of vision in vehicles is not so much to eliminate the driving experience as to make it safer, at least in the near term.

Unleashing LiDAR’s Potential: A Conversation with Innovusion
This market research report was originally published at the Yole Group’s website. It is reprinted here with the permission of the Yole Group. The market for LiDAR in automotive applications is expected to reach US$3.9 billion in 2028 from US$169 million in 2022, representing a 69% Compound Annual Growth Rate (CAGR). According to Yole Intelligence’s

“Reinventing Smart Cities with Computer Vision,” a Presentation from Hayden AI
Vaibhav Ghadiok, Co-founder and CTO of Hayden AI, presents the “Reinventing Smart Cities with Computer Vision” tutorial at the May 2023 Embedded Vision Summit. Hayden AI has developed the first AI-powered data platform for smart and safe city applications such as traffic enforcement, parking and asset management. In this talk,…

AI and the Road to Full Autonomy in Autonomous Vehicles
The road to fully autonomous vehicles is, by necessity, a long and winding one; systems that implement new technologies that increase the driving level of vehicles (driving levels being discussed further below) must be rigorously tested for safety and longevity before they can make it to vehicles that are bound for public streets. The network

What is the Role of Multi-camera Solutions in Surround-view Systems?
This blog post was originally published at e-con Systems’ website. It is reprinted here with the permission of e-con Systems. The rise of multi-camera systems has helped traditional automobiles and autonomous vehicles to eliminate blind spots with a comprehensive view of their surroundings. Discover how surround-view systems work and unearth insights on all the considerations

The Global Market for Lidar in Autonomous Vehicles Will Grow to US$8.4 Billion by 2033
The demand for lidars to be adopted in the automotive industry drives huge investment and rapid progress, with innovations in beam steering technologies, performance improvements, and cost reductions in lidar transceiver components. These efforts can enable lidars to be deployed in a wider range of application scenarios beyond conventional uses and automobiles. However, the rapidly evolving lidar

Autonomous Vehicles Will Soon Be Safer Than Humans, and Some Already Are
The promise of autonomous vehicles has been a long time coming. While many are still waiting to see the fruits of all this work, there are some places, such as Arizona and San Francisco, where autonomous cars are starting to become a reality. Furthermore, IDTechEx’s new industry report “Autonomous Cars, Robotaxis and Sensors 2024-2044” predicts a

Mark AB Capital Partners with Blaize In an Exclusive Relationship to Set Up a State-of-the-art Facility in Abu Dhabi to Provide Sustainable Edge AI Solutions for UAE
Initial contracts to deliver sustainable Smart Cities and Airport solutions projected to generate a minimum of $50m in orders annually EL DORADO HILLS, CA — September 5, 2023 — Mark AB Capital today announced a multi-year Memorandum of Understanding with Blaize – the leader of new-generation supercomputing. Blaize will offer a comprehensive AI edge hardware

LiDAR Systems for the Automotive Industry: TRIOPTICS’ Measurement Technology Enables Large-scale Production
This market research report was originally published at the Yole Group’s website. It is reprinted here with the permission of the Yole Group. Alongside camera and radar, LiDAR sensors are among the key technologies for highly automated, fully automated, and autonomous driving. Together with camera and radar sensors, the LiDAR sensors perceive the surroundings, detect

Edge AI: The Wait is (Almost) Over
Since the introduction of Artificial Intelligence to the data center, AI has been loath to leave it. With large tracts of floorspace dedicated to servers comprising leading-edge chips that can handle the computational demands for training the latest in AI models, as well as inference via end-user connections to the cloud, data centers are the

Everything You Need to Know About Split-pixel HDR Technology
This blog post was originally published at e-con Systems’ website. It is reprinted here with the permission of e-con Systems. Split-pixel HDR technology is a game-changer in embedded vision, allowing camera systems to capture a broader range of brightness levels for more vibrant and true-to-life images. Explore the two HDR modes, see how split-pixel HDR