APPLICATIONS

“Next-generation Computer Vision Methods for Automated Navigation of Unmanned Aircraft,” a Presentation from Immervision

Julie Buquet, Applied Researcher for Imaging and AI at Immervision, presents the “Next-generation Computer Vision Methods for Automated Navigation of Unmanned Aircraft” tutorial at the May 2023 Embedded Vision Summit. Unmanned aircraft systems (UASs) need to perform accurate autonomous navigation using sense-and-avoid algorithms under varying illumination conditions. This requires robust algorithms able to perform consistently, […]

LiDAR Systems for the Automotive Industry: TRIOPTICS’ Measurement Technology Enables Large-scale Production

This market research report was originally published at the Yole Group’s website. It is reprinted here with the permission of the Yole Group. Alongside camera and radar, LiDAR sensors are among the key technologies for highly automated, fully automated, and autonomous driving. Together with camera and radar sensors, LiDAR sensors perceive the surroundings, detect

How are Multi-camera Systems Used in Delivery Robots, and What Makes Them So Effective?

This blog post was originally published at e-con Systems’ website. It is reprinted here with the permission of e-con Systems. Delivery robots offer future-proofed solutions for last-mile deliveries. They enable businesses to streamline operations and ensure fast, reliable deliveries – leading to happy customers. Explore why these robots need a synchronized multi-camera system and how

Edge AI: The Wait is (Almost) Over

Since the introduction of Artificial Intelligence to the data center, AI has been loath to leave it. With large tracts of floorspace dedicated to servers comprising leading-edge chips that can handle the computational demands for training the latest in AI models, as well as inference via end-user connections to the cloud, data centers are the

Renesas Demonstration of Its High Performance, Fast Boot, Low Power AI Trail Camera Winning Combo

Michael Kosinski, Senior Embedded Software Engineer at Renesas Electronics, demonstrates the company’s latest edge AI and vision technologies and products at the 2023 Embedded Vision Summit. Specifically, Kosinski demonstrates Renesas’ Winning Combination PoC (Proof of Concept), designed to deliver a low-power AI solution for battery-powered trail camera, video doorbell and video security applications. This PoC

Everything You Need to Know About Split-pixel HDR Technology

This blog post was originally published at e-con Systems’ website. It is reprinted here with the permission of e-con Systems. Split-pixel HDR technology is a game-changer in embedded vision, allowing camera systems to capture a broader range of brightness levels for more vibrant and true-to-life images. Explore the two HDR modes, see how split-pixel HDR

Enhancing Object Detection: The Impact of Visidon CNN-based Noise Reduction

This blog post was originally published at Visidon’s website. It is reprinted here with the permission of Visidon. In the realm of computer vision, object detection plays a vital role in various applications, including surveillance systems, autonomous driving, and image recognition. However, accurate object detection can be challenging in real-world scenarios due to the presence

NXP Demonstration of a Driver Monitoring System Enabled By the i.MX 93 SoC and Its Integrated NPU

Michael Pontikes, Systems Engineer at NXP Semiconductors, demonstrates the company’s latest edge AI and vision technologies and products at the 2023 Embedded Vision Summit. Specifically, Pontikes demonstrates unique features of the new i.MX 93 applications processor. This latest processor from NXP Semiconductors, continuing the company’s long legacy of i.MX applications processors, features the Arm Ethos

Nota AI Demonstration of Revolutionizing Intelligent Transportation Systems with Nota ITS

YooChan Kim, Product Marketing Manager at Nota America Inc., demonstrates the company’s latest edge AI and vision technologies and products at the 2023 Embedded Vision Summit. Specifically, Kim demonstrates Nota ITS, a lightweight AI model-based approach for traffic analysis and revolutionizing intelligent transportation systems. Nota ITS includes AI Traffic Cameras, AI Traffic Signal Control, AI

What is LED Flicker Mitigation and Why is It a Critical Camera Feature?

This blog post was originally published at e-con Systems’ website. It is reprinted here with the permission of e-con Systems. LED flicker can impact the performance of embedded vision systems. To address this, LED Flicker Mitigation (LFM) is crucial for applications like AMRs, ADAS, fleet management systems, etc. In this blog post, you’ll learn more

Here you’ll find a wealth of practical technical insights and expert advice to help you bring AI and visual intelligence into your products without flying blind.

Contact

Address

Berkeley Design Technology, Inc.
PO Box #4446
Walnut Creek, CA 94596

Phone
+1 (925) 954-1411