FUNCTIONS

Gaze Tracking Using CogniMem Technologies’ CM1K and a Freescale i.MX53

This demonstration, which pairs a Freescale i.MX Quick Start board with a CogniMem Technologies CM1K evaluation module, showcases how to use your eyes (specifically, where you are looking at any given moment) as a mouse. Translating where a customer is looking into actions on a screen, and using gaze tracking to electronically control objects […]
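
As a rough, self-contained illustration of the idea (not the CM1K demo itself), the sketch below maps a normalized gaze estimate to screen coordinates and smooths it to damp eye jitter; the screen size, smoothing factor, and sample gaze points are all made up for the example.

# Minimal sketch: map a normalized gaze estimate to a smoothed on-screen cursor position.
# Screen size, smoothing factor, and the sample gaze points are illustrative only;
# in the demo, the gaze estimate comes from the CM1K's pattern-recognition output.

SCREEN_W, SCREEN_H = 1920, 1080   # hypothetical display resolution
ALPHA = 0.3                       # exponential-smoothing factor to damp eye jitter

def gaze_to_cursor(gaze_x, gaze_y, prev=None):
    """Map a gaze point in [0, 1] x [0, 1] to pixel coordinates, blended with the previous cursor."""
    x, y = gaze_x * SCREEN_W, gaze_y * SCREEN_H
    if prev is not None:
        x = ALPHA * x + (1 - ALPHA) * prev[0]
        y = ALPHA * y + (1 - ALPHA) * prev[1]
    return int(round(x)), int(round(y))

cursor = None
for gx, gy in [(0.50, 0.50), (0.52, 0.49), (0.55, 0.51)]:  # fake gaze samples
    cursor = gaze_to_cursor(gx, gy, cursor)
    print(cursor)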

Adding Precise Finger Gesture Recognition Capabilities to the Microsoft Kinect

Chris McCormick, application engineer at CogniMem, demonstrates how adding general-purpose, scalable pattern recognition can bring enhanced gesture control to the Microsoft Kinect. Envisioned applications include augmenting or eliminating the TV remote control, translating American Sign Language directly into text, and expanding the game-playing experience. To process even more gestures […]

“Keeping Brick and Mortar Relevant, A Look Inside Retail Analytics,” A Presentation from Prism Skylabs

Doug Johnston, Founder and Vice President of Technology at Prism Skylabs, delivers the presentation "Keeping Brick and Mortar Relevant: A Look Inside Prism Skylabs and Retail Analytics" at the December 2014 Embedded Vision Alliance Member Meeting. Doug explains how his firm is using vision to provide retailers with actionable intelligence based on consumer behavior.

Vision in Wearable Devices: Enhanced and Expanded Application and Function Choices

A version of this article was originally published at EE Times' Embedded.com Design Line. It is reprinted here with the permission of EE Times. Thanks to the emergence of increasingly capable and cost-effective processors, image sensors, memories and other semiconductor devices, along with robust algorithms, it's now practical to incorporate computer vision into a wide […]

Practical Computer Vision Enables Digital Signage with Audience Perception

This article was originally published at Information Display Magazine. It is reprinted here with the permission of the Society for Information Display. Signs that see and understand the actions and characteristics of individuals in front of them can deliver numerous benefits to advertisers and viewers alike. Such capabilities were once only practical in research labs […]

Smart In-Vehicle Cameras Increase Driver and Passenger Safety

This article was originally published at John Day's Automotive Electronics News. It is reprinted here with the permission of JHDay Communications. Cameras located in a vehicle's interior, coupled with cost-effective and power-efficient processors, can deliver abundant benefits to drivers and passengers alike. By Brian Dipert, Editor-in-Chief, Embedded Vision Alliance, and Tom Wilson, Vice President, Business Development […]

Accelerate Machine Learning with the cuDNN Deep Neural Network Library

This article was originally published at NVIDIA's developer blog. It is reprinted here with the permission of NVIDIA. By Larry Brown, Solution Architect, NVIDIA. Machine Learning (ML) has its origins in the field of Artificial Intelligence, which started out decades ago with the lofty goals of creating a computer that could do any work a […]
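
The article walks through the cuDNN library itself; as a loose, framework-level illustration of the kind of work cuDNN accelerates, the sketch below runs a single convolution in PyTorch, which on a CUDA-capable GPU is typically dispatched to cuDNN under the hood. The layer and batch sizes are arbitrary, and this is a stand-in, not the cuDNN C API the article describes.

# Illustration only: a single convolution in PyTorch, which on a CUDA GPU is typically
# executed by cuDNN under the hood. This is not the cuDNN C API covered in the article;
# the layer and batch sizes below are arbitrary.
import torch
import torch.nn as nn

torch.backends.cudnn.benchmark = True        # let cuDNN auto-tune algorithms for fixed input sizes
device = "cuda" if torch.cuda.is_available() else "cpu"

conv = nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, padding=1).to(device)
images = torch.randn(8, 3, 224, 224, device=device)   # a fake batch of 8 RGB images

with torch.no_grad():
    features = conv(images)                  # on GPU, cuDNN selects and runs the convolution kernel
print(features.shape)                        # torch.Size([8, 16, 224, 224])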

“How to Make the World More Interactive: Augmented Reality as the Interface Between Wearable Tech and the Internet of Things,” a Presentation from AugmentedReality.org

Ori Inbar, co-founder and CEO of AugmentedReality.org, presents the "How to Make the World More Interactive: Augmented Reality as the Interface Between Wearable Tech and the Internet of Things" tutorial at the May 2014 Embedded Vision Summit. In this talk, Inbar explains how augmented reality, which relies heavily on embedded vision, is transitioning from a […]

May 2014 Embedded Vision Summit Technical Presentation: “Embedded Lucas-Kanade Tracking: How It Works, How to Implement It, and How to Use It,” Goksel Dedeoglu, PercepTonic

Goksel Dedeoglu, Ph.D., Founder and Lab Director of PercepTonic, presents the "Embedded Lucas-Kanade Tracking: How It Works, How to Implement It, and How to Use It" tutorial at the May 2014 Embedded Vision Summit. This tutorial is intended for technical audiences interested in learning about the Lucas-Kanade (LK) tracker, also known as the Kanade-Lucas-Tomasi (KLT) […]
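
For readers who want to experiment with the algorithm alongside the talk, the sketch below runs OpenCV's pyramidal Lucas-Kanade tracker on an input clip; the video path and tracker parameters are placeholders, not values taken from the presentation.

# Minimal pyramidal Lucas-Kanade (KLT) tracking sketch using OpenCV.
# "input.mp4" and the parameter values are placeholders, not taken from the talk.
import cv2

cap = cv2.VideoCapture("input.mp4")
ok, frame = cap.read()
assert ok, "could not read the input clip"
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Pick Shi-Tomasi corners as the features to track from frame to frame.
points = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200, qualityLevel=0.01, minDistance=7)

lk_params = dict(winSize=(21, 21), maxLevel=3,
                 criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01))

while True:
    ok, frame = cap.read()
    if not ok or points is None or len(points) == 0:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    new_points, status, err = cv2.calcOpticalFlowPyrLK(prev_gray, gray, points, None, **lk_params)
    points = new_points[status.flatten() == 1].reshape(-1, 1, 2)   # keep only successfully tracked points
    prev_gray = gray
    print(f"tracking {len(points)} points")

cap.release()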

May 2014 Embedded Vision Summit Technical Presentation: “Evolving Algorithmic Requirements for Recognition and Classification in Augmented Reality,” Simon Morris, CogniVue

Simon Morris, CEO of CogniVue, presents the "Evolving Algorithmic Requirements for Recognition and Classification in Augmented Reality" tutorial at the May 2014 Embedded Vision Summit. Augmented reality (AR) applications are based on accurately computing a camera's 6 degrees of freedom (6DOF) position in 3-dimensional space, also known as its "pose". In vision-based approaches to AR, […]
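
As a concrete, heavily simplified example of that pose computation, the sketch below uses OpenCV's solvePnP to recover a camera's 6DOF pose (rotation plus translation) from four known 3D points and their detected 2D image locations; every coordinate and intrinsic value here is made up.

# Simplified 6DOF pose estimation with OpenCV's solvePnP.
# All 3D points, image points, and camera intrinsics below are made-up values.
import cv2
import numpy as np

# Four corners of a hypothetical 10 cm square marker, in marker coordinates (meters).
object_points = np.array([[-0.05, -0.05, 0.0],
                          [ 0.05, -0.05, 0.0],
                          [ 0.05,  0.05, 0.0],
                          [-0.05,  0.05, 0.0]], dtype=np.float32)

# Where those corners were detected in the image (pixels) -- placeholder detections.
image_points = np.array([[310.0, 250.0],
                         [400.0, 255.0],
                         [395.0, 340.0],
                         [305.0, 335.0]], dtype=np.float32)

# Hypothetical pinhole intrinsics (focal lengths and principal point), no lens distortion.
camera_matrix = np.array([[800.0,   0.0, 320.0],
                          [  0.0, 800.0, 240.0],
                          [  0.0,   0.0,   1.0]], dtype=np.float32)
dist_coeffs = np.zeros(5)

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, camera_matrix, dist_coeffs)
rotation, _ = cv2.Rodrigues(rvec)   # 3 rotational DOF as a 3x3 matrix ...
print(rotation, tvec)               # ... plus 3 translational DOF = the camera's 6DOF pose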
