LETTER FROM THE EDITOR
Dear Colleague,

Every year the Edge AI and Vision Alliance surveys developers to understand their requirements, preferences and pain points around computer vision and perceptual AI. This survey is now in its 12th year because of people like you, who contribute their real-world insights. We share the results from the survey at Edge AI and Vision Alliance events and in white papers and presentations made available throughout the year on the Alliance website.

The survey is intended for people who are developing, or have recently developed, systems or applications incorporating computer vision or other types of perceptual (sensor-based) AI (the Alliance reserves the right to determine eligibility). In thanks for your time completing the survey—it takes about 20 minutes—we’ll provide you with:
Click here to take the survey now!

Brian Dipert
EDGE AI PROCESSING ADVANCES
ONNX and Python to C++: State-of-the-art Graph Compilation

Quadric’s Chimera general-purpose neural processor executes complete AI/ML graphs—all layers, including pre- and post-processing functions traditionally run on separate DSPs. To enable this, the Chimera Graph Compiler processes and optimizes a combination of NN graphs, Python code and C++ kernels into a single optimized executable. In this 2025 Embedded Vision Summit talk, Nigel Drego, Co-founder and Chief Technology Officer at Quadric, presents an overview of Quadric’s Chimera Graph Compiler, including compilation of ONNX graphs and Python code into C++ representations targeting Chimera. Drego shows examples of fully automated graph conversion and explains the custom operator creation flow. He shows how this advanced toolchain addresses the challenges developers face when implementing the full signal chain. Instead of piecemeal compilation of signal conditioning, NN graphs and post-processing via three separate SDKs—requiring the developer to integrate and tune the final code—the Chimera tools merge and optimize these closely related computations using multi-operation fusion, yielding greater programmer productivity and superior results.
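To make the single-graph idea concrete, here is a minimal sketch, independent of Quadric's SDK, that uses standard PyTorch-to-ONNX export to bundle hypothetical pre-processing (normalization), a backbone network and post-processing (softmax plus top-k) into one module, so the entire signal chain lands in a single ONNX graph rather than three separately maintained code paths. The backbone choice, tensor shapes and file name are illustrative assumptions, not details from the talk.

```python
# Minimal sketch (not Quadric's toolchain): expressing pre-processing, the NN
# backbone and post-processing as one module so they export to a single ONNX graph.
import torch
import torch.nn as nn
import torchvision.models as models

class FullSignalChain(nn.Module):
    def __init__(self):
        super().__init__()
        # Hypothetical backbone; any torch.nn model could stand in here.
        self.backbone = models.mobilenet_v2(weights=None)
        self.register_buffer("mean", torch.tensor([0.485, 0.456, 0.406]).view(1, 3, 1, 1))
        self.register_buffer("std", torch.tensor([0.229, 0.224, 0.225]).view(1, 3, 1, 1))

    def forward(self, raw_uint8_frame):
        # Pre-processing: uint8 camera frame -> normalized float tensor.
        x = raw_uint8_frame.float() / 255.0
        x = (x - self.mean) / self.std
        # Inference.
        logits = self.backbone(x)
        # Post-processing: class probabilities and top-5 indices.
        probs = torch.softmax(logits, dim=1)
        return torch.topk(probs, k=5, dim=1)

model = FullSignalChain().eval()
dummy = torch.zeros(1, 3, 224, 224, dtype=torch.uint8)
# One ONNX file now carries the whole signal chain for a downstream graph compiler.
torch.onnx.export(model, dummy, "full_chain.onnx", opset_version=17)
```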
Scaling i.MX Applications Processors’ Native Edge AI with Discrete AI Accelerators

The integration of discrete AI accelerators with edge processors is poised to revolutionize the capabilities of edge computing, enabling real-time, low-latency and energy-efficient AI applications. As the computational power required for complex AI workloads increases, edge computing applications processors can benefit from discrete AI accelerators. AI accelerators—specialized hardware designed to accelerate machine learning tasks—offload intensive computations, significantly improving processing speed and efficiency while reducing energy consumption. In this popular 2025 Embedded Vision Summit presentation, Ali Osman Ors, Director of AI and ML Strategy and Technologies for Edge Processing at NXP Semiconductors, explores the role of discrete AI accelerators in extending the capabilities of edge processors, discussing their ability to enhance real-time processing, reduce latency and enable autonomous decision-making at the edge. He also examines the integration of AI accelerators with existing edge architectures, the challenges of implementation and considerations such as power management, cost and security.
ADDRESSING EDGE AI STORAGE AND BANDWIDTH CHALLENGES
Introduction to Data Types for AI: Trade-offs and Trends

The increasing complexity of AI models has led to a growing need for efficient data storage and processing. One critical way to gain efficiency is using smaller and simpler data types. In this 2025 Embedded Vision Summit presentation, Joep Boonstra, Synopsys Scientist at Synopsys, explores the trade-offs in data types for AI. He introduces the most commonly used data types, including compact integer and floating-point formats, and highlights their advantages and limitations, as well as their impact on model accuracy and the complexity of the quantization process. Boonstra examines the main trends in this space, including emerging techniques such as microscaling, and considers the benefits of advanced compression techniques. He concludes by summarizing the key considerations for AI data type selection, including maximizing system throughput and balancing compute and memory bandwidth.
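As a concrete illustration of the trade-offs involved, the sketch below applies per-tensor asymmetric int8 quantization to a synthetic float32 weight matrix using NumPy, showing the 4x reduction in storage alongside the round-trip error that quantization introduces. The scale/zero-point scheme and the random weights are assumptions chosen for illustration, not material from the talk.

```python
# Minimal sketch of per-tensor asymmetric int8 quantization (NumPy only),
# illustrating the storage-vs-accuracy trade-off of compact data types.
import numpy as np

def quantize_int8(x: np.ndarray):
    """Map float32 values to int8 using a per-tensor scale and zero-point."""
    x_min, x_max = float(x.min()), float(x.max())
    scale = (x_max - x_min) / 255.0 if x_max > x_min else 1.0
    zero_point = int(round(-x_min / scale)) - 128
    q = np.clip(np.round(x / scale) + zero_point, -128, 127).astype(np.int8)
    return q, scale, zero_point

def dequantize_int8(q: np.ndarray, scale: float, zero_point: int) -> np.ndarray:
    return (q.astype(np.float32) - zero_point) * scale

# Synthetic "weights" standing in for a real layer.
rng = np.random.default_rng(0)
weights = rng.normal(0.0, 0.02, size=(256, 256)).astype(np.float32)

q, scale, zp = quantize_int8(weights)
recovered = dequantize_int8(q, scale, zp)

print(f"storage: {weights.nbytes} B (fp32) -> {q.nbytes} B (int8)")   # 4x smaller
print(f"max round-trip error: {np.abs(weights - recovered).max():.6f}")
```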
Addressing Evolving AI Model Challenges Through Memory and Storage

In the fast-changing world of artificial intelligence, the industry is deploying more AI compute at the edge. But the growing diversity and data footprint of transformers and models such as large language models and large multimodal models put a spotlight on memory performance and data storage capacity as key bottlenecks. Enabling the full potential of AI in industries such as manufacturing, automotive, robotics and transportation will require us to find efficient ways to deploy this new generation of complex models. In this 2025 Embedded Vision Summit talk, Wil Florentino, Senior Segment Marketing Manager at Micron, explores how memory and storage are responding to this need and solving complex issues in the AI market. He examines the storage capacity and memory bandwidth requirements of edge AI use cases ranging from tiny devices with severe cost and power constraints to edge servers, and he explains how new memory technologies such as LPDDR5, LPCAMM2 and multi-port SSDs are helping system developers to meet these challenges.
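A rough, back-of-envelope calculation helps explain why memory bandwidth becomes the limiting factor for edge LLM inference: during autoregressive decoding, each generated token requires streaming essentially all model weights from memory. The model size, quantization and token rate in the sketch below are hypothetical assumptions chosen only to illustrate the arithmetic.

```python
# Back-of-envelope sketch (hypothetical numbers): why LLM decoding at the edge
# tends to be limited by memory bandwidth rather than compute. Each generated
# token reads roughly all model weights once.
def min_bandwidth_gb_s(params_billion: float, bytes_per_weight: float,
                       tokens_per_second: float) -> float:
    bytes_per_token = params_billion * 1e9 * bytes_per_weight
    return bytes_per_token * tokens_per_second / 1e9

# Example: a 3B-parameter model quantized to int8 (1 byte per weight) generating
# 20 tokens/s needs on the order of 60 GB/s of sustained weight traffic, which is
# in the same ballpark as the peak bandwidth of a 64-bit LPDDR5X interface.
print(f"{min_bandwidth_gb_s(3, 1, 20):.0f} GB/s")
```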
UPCOMING INDUSTRY EVENTS
Infrared Imaging: Technologies, Trends, Opportunities and Forecasts – Yole Group Webinar: September 23, 2025, 9:00 am PT

Embedded Vision Summit: May 11-13, 2026, Santa Clara, California
FEATURED NEWS
Arm Neural Technology Delivers Smarter, Sharper, More Efficient Mobile Graphics for Developers

SiMa.ai’s Next-Generation Platform for Physical AI is Now in Production

NVIDIA Opens Portals to the World of Robotics with New Omniverse Libraries, Cosmos Physical AI Models and AI Computing Infrastructure

An Upcoming Webinar from e-Con Systems Explores Ethernet Cameras for AI-driven Vision Systems

BrainChip Launches the Akida Cloud for Instant Access to Latest Akida Neuromorphic Technology
EDGE AI AND VISION PRODUCT OF THE YEAR WINNER SHOWCASE
Qualcomm Snapdragon 8 Elite Platform (Best Edge AI Processor)

Qualcomm’s Snapdragon 8 Elite Platform is the 2025 Edge AI and Vision Product of the Year Award Winner in the Edge AI Processors category. This platform significantly enhances on-device experiences through remarkable processing power, groundbreaking AI advancements, and various mobile innovations. The Snapdragon 8 Elite includes a new custom-built Qualcomm Oryon CPU, which delivers impressive speeds and efficiency to enhance every interaction. It provides a 45% performance boost, 44% greater power efficiency, and includes the mobile industry’s largest shared data cache. Additionally, Qualcomm’s Adreno GPU, with its newly designed architecture, achieves a 40% increase in performance and a 40% improvement in efficiency. Overall, users can expect a 27% reduction in power consumption.

The platform enhances user experiences with on-device AI, showcased through the Qualcomm AI Engine, which incorporates multimodal generative AI and personalized support. This AI Engine utilizes a variety of models, including large multimodal models (LMMs), large language models (LLMs), and language vision models (LVMs), while supporting the world’s largest generative AI model ecosystem. It also features Qualcomm’s 45% faster Hexagon NPU, which provides an impressive 45% increase in performance per watt, driving AI capabilities to new levels. Moreover, Qualcomm’s new AI Image Signal Processor (ISP) works in tandem with the Hexagon NPU to enhance real-time image capture. Connectivity options include advanced AI-driven 5G and Wi-Fi 7 capabilities, facilitating seamless entertainment and productivity on the go. Please see here for more information on Qualcomm’s Snapdragon 8 Elite Platform.

The Edge AI and Vision Product of the Year Awards celebrate the innovation of the industry’s leading companies that are developing and enabling the next generation of edge AI and computer vision products. Winning a Product of the Year award recognizes a company’s leadership in edge AI and computer vision as evaluated by independent industry experts.