
Object Tracking

Oculi Enables Near-zero Lag Performance with an Embedded Solution for Gesture Control

Immersive extended reality (XR) experiences let users interact seamlessly with virtual environments. These experiences require real-time gesture control and eye tracking while running on resource-constrained devices such as head-mounted displays (HMDs) and smart glasses. These capabilities are typically implemented using computer vision technology, with imaging sensors that generate large volumes of data to be moved…


Renesas Demonstration of the RZ/V Microprocessor with a Power-efficient AI Accelerator

Manny Singh, Principal Product Marketing Manager at Renesas, demonstrates the company’s latest edge AI and vision technologies and products at the 2021 Embedded Vision Summit. Specifically, Singh demonstrates the company’s RZ/V microprocessor with a power-efficient AI accelerator. In this demo of object detection and recognition on Renesas’ proprietary Dynamically Reconfigurable Processor (DRP-AI), Singh shows…


Ready for Air: Welcome to a New Dawn for Aerial Autonomy

This blog post was originally published at Opteran Technologies’ website. It is reprinted here with the permission of Opteran Technologies. Opteran enables robust, fast, GPS-free, verifiable autonomy for aerial systems at previously unimaginable size, weight (~30g), power (~3W) and hardware costs, using consumer 2D cameras. Today we’re happy to share a first glance at our…


Oculi Demonstration of Real-time Depth Using Stereo Vision with Smart Events from the OCULI SPU

In this demonstration, a pair of OCULI SPU devices measures the distance to a moving object (a ball in this case) in real time. This capability requires high dynamic range, high-speed sampling (700 Hz in this case) and tight synchronization, all of which are made easy with the OCULI SPU.
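
For context, depth from a synchronized stereo pair reduces to triangulating the disparity between the two views: for rectified cameras, depth = focal length × baseline ÷ disparity. The sketch below illustrates that relationship only; the function name and all numeric values are illustrative assumptions, not OCULI SPU parameters.

```python
# Minimal sketch: depth from stereo disparity, assuming rectified cameras.
# The focal length, baseline and disparity values are placeholders, not Oculi specifications.

def depth_from_disparity(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Distance (meters) to a point matched in both synchronized views."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point in front of both cameras")
    return focal_px * baseline_m / disparity_px

# A ball whose centroid appears 42 px apart in a frame pair captured at the same instant:
print(depth_from_disparity(disparity_px=42.0, focal_px=700.0, baseline_m=0.12))  # ~2.0 m
```

Tight synchronization matters here because the disparity is only meaningful if both frames capture the moving object at the same instant.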


LAON PEOPLE Demonstration of Traffic Analysis Using a Deep Learning Solution

Luke Faubion, Traffic Solution Director at LAON PEOPLE, demonstrates the company’s latest edge AI and vision technologies and products at the 2021 Embedded Vision Summit. Specifically, Faubion demonstrates traffic analysis using the company’s deep learning solution. The traffic analysis program doesn’t require installing a new IP camera. LAON PEOPLE’s AI solution provides vehicle…


Coherent Logix Demonstration of Ultra-low Latency Industrial Inspection at the Edge Using the HyperX Processor

Martin Hunt, Director of Applications Engineering at Coherent Logix, demonstrates the company’s latest edge AI and vision technologies and products at the 2021 Embedded Vision Summit. Specifically, Hunt demonstrates ultra-low latency industrial inspection at the edge using the company’s HyperX processor. In this demo, Hunt shows how to use the HyperX Memory Network parallel processor…


Algolux Demonstration of Highly Robust Computer Vision for All Conditions with the Eos Embedded Perception Software

Dave Tokic, Vice President of Marketing and Strategic Partnerships at Algolux, demonstrates the company’s latest edge AI and vision technologies and products at the 2021 Embedded Vision Summit. Specifically, Tokic demonstrates highly robust computer vision for all conditions with the Eos embedded perception software. The Eos end-to-end vision architecture enables co-design of imaging and detection…


Synopsys Demonstration of SLAM Acceleration on DesignWare ARC EV7x Processors

Liliya Tazieva, Software Engineer at Synopsys, demonstrates the company’s latest edge AI and vision technologies and products at the 2021 Embedded Vision Summit. Specifically, Tazieva demonstrates simultaneous localization and mapping (SLAM) acceleration on Synopsys’ DesignWare ARC EV7x processors. SLAM creates and updates a map of an unknown environment while at the same time keeping track of the device’s location within it.
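
As a rough illustration of that loop (predict the pose from motion, then register observed landmarks into the map), here is a deliberately simplified 2D sketch. The class, its naive landmark averaging, and all values are hypothetical and are not the DesignWare ARC EV7x implementation demonstrated here.

```python
# Toy 2D SLAM loop: keep a pose estimate and a map of landmarks, updating both as
# new odometry and observations arrive. Purely illustrative; not Synopsys code.
import math

class TinySlam2D:
    def __init__(self):
        self.x, self.y, self.theta = 0.0, 0.0, 0.0  # current pose estimate
        self.landmarks = {}                          # map: id -> (x, y) in the world frame

    def predict(self, forward: float, turn: float):
        """Propagate the pose from odometry (motion model)."""
        self.theta += turn
        self.x += forward * math.cos(self.theta)
        self.y += forward * math.sin(self.theta)

    def update(self, landmark_id: int, rng: float, bearing: float):
        """Place or refine a landmark observed at (range, bearing) from the current pose."""
        lx = self.x + rng * math.cos(self.theta + bearing)
        ly = self.y + rng * math.sin(self.theta + bearing)
        if landmark_id in self.landmarks:
            ox, oy = self.landmarks[landmark_id]
            lx, ly = (ox + lx) / 2.0, (oy + ly) / 2.0  # naive averaging in place of a filter
        self.landmarks[landmark_id] = (lx, ly)

slam = TinySlam2D()
slam.predict(forward=1.0, turn=0.0)
slam.update(landmark_id=7, rng=2.0, bearing=math.pi / 4)
print(slam.landmarks[7])  # landmark placed roughly at (2.41, 1.41)
```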


Synaptics Demonstration of Smart Video Conferencing on the Edge

Zafer Diab, Director of Product Marketing at Synaptics, demonstrates the company’s latest edge AI and vision technologies and products at the 2021 Embedded Vision Summit. Specifically, Diab demonstrates smart video conferencing on the edge in partnership with Pilot.ai. Enhancements in AI processing capabilities on edge devices are enabling a richer video conferencing experience. These capabilities…


