Edge AI and Vision Insights (formerly Embedded Vision Insights) is our bi-weekly newsletter, delivering the latest news on technology, applications, markets, and trends in edge AI and vision right to your inbox. To receive Insights, please register via the “Subscribe to the Edge AI and Vision Insights Newsletter” section in the right column of this page.
You can also sign up for Insights as part of the full website registration process, which also gives you access to content behind the site’s registration wall. (Not to worry: registration is free and easy.)
(Note that prior to February 2020, Insights was known as Embedded Vision Insights, and the Alliance was the Embedded Vision Alliance. So if you’re puzzled by the names below, that’s what’s going on.)
LETTER FROM THE EDITOR
Dear Colleague, The 2024 Embedded Vision Summit Call for Presentation Proposals is still open, but not for much longer! I invite you to share your expertise.
LETTER FROM THE EDITOR
Dear Colleague, I’m excited to announce that registration is now open for the 2024 Embedded Vision Summit, coming up May 21-23 in Santa Clara, California!
LETTER FROM THE EDITOR
Dear Colleague, I’m excited to announce the inaugural AI Innovation Awards, brought to you by the Edge AI and Vision Alliance. The awards celebrate groundbreaking end…
TOOLSETS FOR EFFICIENT HETEROGENEOUS PROCESSING
A New, Open-standards-based, Open-source Programming Model for All Accelerators
As demand for AI grows, developers are attempting to squeeze more and more performance from accelerators.
MULTIMODAL PERCEPTION
Frontiers in Perceptual AI: First-person Video and Multimodal Perception
First-person or “egocentric” perception requires understanding the video and multimodal data that streams from wearable cameras and other sensors.
LETTER FROM THE EDITOR
Dear Colleague, You probably won’t be surprised that 84% of vision-based product developers are using DNNs. But did you know that 80+% are using non-neural network…
SOFTWARE TOOLSETS
Visual Anomaly Detection with FOMO-AD
Virtually all computer vision machine learning models involve classification; for example, “how many humans are in the frame?” To train such a model, you…
LOW-POWER SENSING AND PROCESSING
Battery-powered Edge AI Sensing: A Case Study Implementing Low-power, Always-on Capability
The trend of pushing AI/ML capabilities to the edge brings design challenges around the need…
DEVELOPMENT TOOL ADVANCEMENTS
Deploy Your Embedded Vision Solution on Any Processor
Vision-based product developers have a vast array of processors to choose from. Unfortunately, each processor has its own unique…
VISION PROCESSING AND ACCELERATION
M1 NPU Delivers Flexibility, Accuracy, Efficiency and Performance
For a growing range of applications, deploying AI in the real world means running various models…