
Object Tracking

Object Tracking Functions

Case History: Florim

This blog post was originally published by Onit. It is reprinted here with the permission of Onit. 80 Forklifts for Indoor and Outdoor Logistics. Florim is a multinational company recognized for its production of ceramic surfaces. With an innate passion for beauty and design, Florim has been producing ceramic surfaces for every building, architecture and

Read More »

Embodied AI: How Do AI-powered Robots Perceive the World?

This blog post was originally published at Qualcomm’s website. It is reprinted here with the permission of Qualcomm. While robots have proliferated in recent years in smart cities, factories and homes, we are mostly interacting with robots controlled by classical handcrafted algorithms. These are robots that have a narrow goal and don’t learn from their

Read More »

The Role of the Inertial Measurement Unit in 3D Time-of-flight Cameras

This blog post was originally published at e-con Systems’ website. It is reprinted here with the permission of e-con Systems. An Inertial Measurement Unit (IMU) detects movements and rotations across six degrees of freedom, representing the types of motion a system can experience. When paired with Time-of-flight (ToF) cameras, it ensures accurate spatial understanding and

Read More »
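
To make the IMU-plus-ToF pairing concrete, below is a minimal sketch, not e-con Systems' pipeline, of one common use of the combination: back-projecting a ToF depth pixel into 3D and then rotating it into a gravity-aligned frame using the roll and pitch reported by the IMU. The function names, camera intrinsics and angles are illustrative assumptions.

import numpy as np

def rotation_from_roll_pitch(roll, pitch):
    """Rotation that levels a camera-frame point, given IMU roll/pitch in radians.
    The exact axis convention depends on how the IMU is mounted; this one is assumed."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])   # roll about the x axis
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # pitch about the y axis
    return Ry @ Rx

def tof_pixel_to_levelled_point(u, v, depth_m, fx, fy, cx, cy, roll, pitch):
    """Back-project a ToF pixel to a 3D point, then level it with the IMU attitude."""
    # Pinhole back-projection: point in the camera frame, in metres.
    p_cam = np.array([(u - cx) * depth_m / fx,
                      (v - cy) * depth_m / fy,
                      depth_m])
    return rotation_from_roll_pitch(roll, pitch) @ p_cam

# Example: a point 1.5 m away, seen while the camera pitches down by 10 degrees.
p = tof_pixel_to_levelled_point(400, 300, 1.5, fx=500.0, fy=500.0,
                                cx=320.0, cy=240.0,
                                roll=0.0, pitch=np.radians(-10))
print(p)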

Modern Vehicles See More with Computer Vision

This blog post was originally published at Qualcomm’s website. It is reprinted here with the permission of Qualcomm. The Snapdragon Ride Vision System is designed to enhance vehicle perception for safer driving experiences Today’s drivers reap the benefits of active safety features in their vehicles. Automatic emergency braking, lane departure warnings, blind spot detection and

Read More »

Heart Rate Detection with OpenCV

This blog post was originally published at Digica’s website. It is reprinted here with the permission of Digica. It’s probably no surprise to you that heart rate can be measured using different gadgets like smartphones or smartwatches. But did you know you can measure it using just the camera

Read More »
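
As a rough illustration of the idea behind the post above (remote photoplethysmography), the sketch below uses OpenCV to track the mean green-channel intensity of a detected face over time and picks the dominant frequency in the plausible heart-rate band. The frame rate, region of interest and thresholds are assumptions for the example, not Digica's actual pipeline.

import cv2
import numpy as np

FPS = 30.0           # assumed camera frame rate
WINDOW_SECONDS = 10  # length of the signal we analyse

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def green_signal_from_camera(num_frames):
    """Collect the mean green intensity of the detected face over num_frames."""
    cap = cv2.VideoCapture(0)
    samples = []
    while len(samples) < num_frames:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_detector.detectMultiScale(gray, 1.3, 5)
        if len(faces) == 0:
            continue
        x, y, w, h = faces[0]
        roi = frame[y:y + h // 3, x:x + w]   # upper part of the face (forehead area)
        samples.append(roi[:, :, 1].mean())  # mean of the green channel (BGR index 1)
    cap.release()
    return np.array(samples)

def estimate_bpm(signal, fps):
    """Pick the strongest frequency between 0.7 and 4 Hz (42 to 240 BPM)."""
    signal = signal - signal.mean()
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    spectrum = np.abs(np.fft.rfft(signal))
    band = (freqs > 0.7) & (freqs < 4.0)
    return 60.0 * freqs[band][np.argmax(spectrum[band])]

if __name__ == "__main__":
    sig = green_signal_from_camera(int(FPS * WINDOW_SECONDS))
    if len(sig) > FPS * 2:
        print(f"Estimated heart rate: {estimate_bpm(sig, FPS):.0f} BPM")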

“Understanding, Selecting and Optimizing Object Detectors for Edge Applications,” a Presentation from Walmart Global Tech

Md Nasir Uddin Laskar, Staff Machine Learning Engineer at Walmart Global Tech, presents the “Understanding, Selecting and Optimizing Object Detectors for Edge Applications” tutorial at the May 2023 Embedded Vision Summit. Object detectors count objects in a scene and determine their precise locations, while also labeling them. Object detection plays…

Read More »
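
For readers who want to see what an edge-oriented object detector produces, here is a short, hedged example using torchvision's SSDLite with a MobileNetV3 backbone, chosen only for illustration; the presentation does not endorse a specific model. The input file name and the 0.5 confidence threshold are assumptions.

import torch
from torchvision.io import read_image
from torchvision.models.detection import (
    ssdlite320_mobilenet_v3_large, SSDLite320_MobileNet_V3_Large_Weights)

weights = SSDLite320_MobileNet_V3_Large_Weights.DEFAULT
model = ssdlite320_mobilenet_v3_large(weights=weights).eval()
preprocess = weights.transforms()
categories = weights.meta["categories"]

image = read_image("street_scene.jpg")  # hypothetical input image
with torch.no_grad():
    detections = model([preprocess(image)])[0]

# Each detection is a labelled bounding box with a confidence score.
for box, label, score in zip(detections["boxes"],
                             detections["labels"],
                             detections["scores"]):
    if score > 0.5:  # assumed confidence threshold
        print(categories[int(label)],
              [round(v) for v in box.tolist()],
              f"{score:.2f}")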

“Vision-language Representations for Robotics,” a Presentation from the University of Pennsylvania

Dinesh Jayaraman, Assistant Professor at the University of Pennsylvania, presents the “Vision-language Representations for Robotics” tutorial at the May 2023 Embedded Vision Summit. In what format can an AI system best present what it “sees” in a visual scene to help robots accomplish tasks? This question has been a long-standing…

Read More »
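
One widely used answer to the question the talk poses is a shared vision-language embedding space. The sketch below, which uses the publicly available CLIP model via Hugging Face transformers rather than anything specific to the presentation, scores candidate task descriptions against a camera image; the image file name and prompts are made up for illustration.

from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

image = Image.open("robot_camera.jpg")  # hypothetical robot camera frame
instructions = ["a red mug on a table", "an empty table", "a person walking by"]

inputs = processor(text=instructions, images=image,
                   return_tensors="pt", padding=True)
outputs = model(**inputs)
probs = outputs.logits_per_image.softmax(dim=1)  # image-to-text similarity scores

for text, p in zip(instructions, probs[0].tolist()):
    print(f"{p:.2f}  {text}")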

Dragonfly Base: Enhancing Indoor Localization in Challenging Environments with Visual Markers

We’re thrilled to share the second episode of our Dragonfly video series. In this video, we delve deeper into Dragonfly’s capabilities, specifically focusing on how we enhance indoor localization under challenging conditions. Key Takeaways: The significance of visual markers in Computer Vision and Visual SLAM. Our ingenious solution—visual markers on the ceiling—ensuring consistent and accurate

Read More »
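
As a hedged illustration of the ceiling-marker approach, the sketch below detects ArUco fiducials (used here purely as a stand-in for Dragonfly's own markers) in an upward-facing camera frame and recovers each marker's pose with solvePnP. It assumes OpenCV 4.7 or newer with the aruco module; the marker size, intrinsics and file name are made-up values.

import cv2
import numpy as np

MARKER_SIZE_M = 0.15  # assumed physical marker size in metres
K = np.array([[600.0, 0, 320], [0, 600.0, 240], [0, 0, 1]])  # assumed intrinsics
DIST = np.zeros(5)    # assume negligible lens distortion

detector = cv2.aruco.ArucoDetector(
    cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50),
    cv2.aruco.DetectorParameters())

frame = cv2.imread("ceiling_view.jpg")  # hypothetical upward-facing camera frame
corners, ids, _ = detector.detectMarkers(frame)

# 3D corners of a square marker centred at its own origin, in metres.
half = MARKER_SIZE_M / 2.0
object_pts = np.array([[-half, half, 0], [half, half, 0],
                       [half, -half, 0], [-half, -half, 0]], dtype=np.float32)

for marker_id, image_pts in zip(ids.flatten() if ids is not None else [], corners):
    ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts.reshape(4, 2), K, DIST)
    if ok:
        # tvec is the marker position in the camera frame; inverting this pose
        # (not shown) gives the camera's position relative to the known marker.
        print(f"marker {marker_id}: {tvec.ravel()} m from the camera")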

“Introduction to Modern LiDAR for Machine Perception,” a Presentation from the University of Ottawa

Robert Laganière, Professor at the University of Ottawa and CEO of Sensor Cortek, presents the “Introduction to Modern LiDAR for Machine Perception” tutorial at the May 2023 Embedded Vision Summit. In this presentation, Laganière provides an introduction to light detection and ranging (LiDAR) technology. He explains how LiDAR sensors work…

Read More »
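
As a minimal companion to the talk above, the sketch below shows the geometry at the heart of LiDAR perception: each beam's time-of-flight gives a range at a known angle, and converting those polar measurements to Cartesian coordinates produces the point cloud that downstream perception consumes. The scan parameters are synthetic.

import numpy as np

def scan_to_points(ranges_m, angle_min_rad, angle_increment_rad):
    """Convert a planar LiDAR scan (one range per beam) to (x, y) points."""
    angles = angle_min_rad + angle_increment_rad * np.arange(len(ranges_m))
    x = ranges_m * np.cos(angles)
    y = ranges_m * np.sin(angles)
    return np.stack([x, y], axis=1)

# Example: 360 beams over a full revolution, all returning 5 m (a circular wall).
ranges = np.full(360, 5.0)
points = scan_to_points(ranges, angle_min_rad=0.0,
                        angle_increment_rad=np.radians(1.0))
print(points.shape, points[:3])

A 3D LiDAR adds an elevation angle per beam, but the conversion follows the same idea.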

Service Robots – Our New and Efficient Coworkers

When picturing a robot, an intimidating human-like figure might come to mind, but what if robots could help businesses get the most from their efficiency and power? Service robots are proving able to take on much of the work of human labor, with notable flexibility and tireless consistency. The most common

Read More »

“Computer Vision in Sports: Scalable Solutions for Downmarket Leagues,” a Presentation from Sportlogiq

Mehrsan Javan, Co-founder and CTO of Sportlogiq, presents the “Computer Vision in Sports: Scalable Solutions for Downmarket Leagues” tutorial at the May 2023 Embedded Vision Summit. Sports analytics is about observing, understanding and describing the game in an intelligent manner. In practice, this requires a fully automated, robust end-to-end pipeline…

Read More »

“Developing a Computer Vision System for Autonomous Satellite Maneuvering,” a Presentation from SCOUT Space

Andrew Harris, Spacecraft Systems Engineer at SCOUT Space, presents the “Developing a Computer Vision System for Autonomous Satellite Maneuvering” tutorial at the May 2023 Embedded Vision Summit. Computer vision systems for mobile autonomous machines experience a wide variety of real-world conditions and inputs that can be challenging to capture accurately…

Read More »

“Sensor Fusion Techniques for Accurate Perception of Objects in the Environment,” a Presentation from the Sanborn Map Company

Baharak Soltanian, Vice President of Research and Development for the Sanborn Map Company, presents the “Sensor Fusion Techniques for Accurate Perception of Objects in the Environment” tutorial at the May 2023 Embedded Vision Summit. Increasingly, perceptual AI is being used to enable devices and systems to obtain accurate estimates of…

Read More »
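
As a toy illustration of the core idea behind sensor fusion, the sketch below combines two independent, noisy estimates of the same quantity (say, the distance to an object reported by a camera and by a radar) by weighting each with the inverse of its variance, yielding a fused estimate with lower uncertainty than either sensor alone. The sensor names and noise figures are assumptions.

import numpy as np

def fuse(estimate_a, var_a, estimate_b, var_b):
    """Inverse-variance weighted fusion of two scalar estimates."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * estimate_a + w_b * estimate_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# Camera says 10.4 m with sigma = 0.5 m; radar says 10.0 m with sigma = 0.2 m.
fused, var = fuse(10.4, 0.5**2, 10.0, 0.2**2)
print(f"fused distance: {fused:.2f} m, sigma: {np.sqrt(var):.2f} m")

In a complete perception stack this is essentially the measurement-update step of a Kalman filter, extended to state vectors and covariance matrices and applied per tracked object.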

“Developing an Embedded Vision AI-powered Fitness System,” a Presentation from Peloton Interactive

Sanjay Nichani, Vice President for Artificial Intelligence and Computer Vision at Peloton Interactive, presents the “Developing an Embedded Vision AI-powered Fitness System” tutorial at the May 2023 Embedded Vision Summit. The Guide is Peloton’s first strength-training product that runs on a physical device and also the first that uses AI…

Read More »

Here you’ll find a wealth of practical technical insights and expert advice to help you bring AI and visual intelligence into your products without flying blind.
