

Development Tools for Embedded Vision

ENCOMPASSING MOST OF THE STANDARD ARSENAL USED FOR DEVELOPING REAL-TIME EMBEDDED PROCESSOR SYSTEMS

The software tools (compilers, debuggers, operating systems, libraries, etc.) encompass most of the standard arsenal used for developing real-time embedded processor systems, with the addition of specialized vision libraries and, in some cases, vendor-specific development tools. On the hardware side, the requirements depend on the application space, since the designer may need equipment for monitoring and testing real-time video data. Most of these hardware development tools are already used for other types of video system design.
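As a concrete illustration, the following sketch shows the kind of real-time loop these tools are used to build, profile, and debug. It assumes the OpenCV library and a camera exposed as device 0; both are illustrative choices, not tools named above.

```cpp
// Minimal embedded-vision loop: grab frames, run a simple vision
// operation, and measure per-frame latency. Assumes OpenCV is
// available in the target toolchain and a camera at index 0.
#include <opencv2/opencv.hpp>
#include <chrono>
#include <iostream>

int main() {
    cv::VideoCapture cam(0);              // open the default camera
    if (!cam.isOpened()) {
        std::cerr << "camera not available\n";
        return 1;
    }
    cv::Mat frame, edges;
    while (cam.read(frame)) {
        auto t0 = std::chrono::steady_clock::now();
        cv::cvtColor(frame, edges, cv::COLOR_BGR2GRAY);
        cv::Canny(edges, edges, 50, 150); // simple edge detection
        auto t1 = std::chrono::steady_clock::now();
        std::cout << "frame latency: "
                  << std::chrono::duration_cast<std::chrono::milliseconds>(t1 - t0).count()
                  << " ms\n";
    }
    return 0;
}
```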

Both general-purpose and vendor-specific tools

Many vendors of vision devices use integrated CPUs based on the same instruction set (ARM, x86, etc.), allowing a common set of software development tools. However, even though the base instruction set is the same, each CPU vendor integrates a different set of peripherals with unique software interface requirements. In addition, most vendors accelerate the CPU with specialized computing devices (GPUs, DSPs, FPGAs, etc.). This extended CPU programming model requires a customized version of the standard development tools. Most CPU vendors develop their own optimized software tool chain, while also working with third-party software tool suppliers to make sure their CPU components are broadly supported.
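The pattern is easiest to see in code. The sketch below assumes an ARM target with the NEON SIMD extension, one example of vendor-specific acceleration layered on a common instruction set; the same structure applies to GPU, DSP, or FPGA offload with the corresponding vendor APIs.

```cpp
// Sketch: one source file targets the common instruction set, while
// exploiting vendor-specific acceleration when the toolchain provides
// it. Here the "accelerator" is ARM's NEON SIMD extension.
#include <cstdint>
#include <cstddef>

#if defined(__ARM_NEON)
#include <arm_neon.h>
#endif

// Saturating add of two 8-bit image rows, e.g. for brightness blending.
void add_rows(const uint8_t* a, const uint8_t* b, uint8_t* out, size_t n) {
    size_t i = 0;
#if defined(__ARM_NEON)
    // Accelerated path: process 16 pixels per iteration with NEON.
    for (; i + 16 <= n; i += 16) {
        uint8x16_t va = vld1q_u8(a + i);
        uint8x16_t vb = vld1q_u8(b + i);
        vst1q_u8(out + i, vqaddq_u8(va, vb));   // saturating add
    }
#endif
    // Portable fallback for the remainder, or for the whole row on
    // toolchains without NEON support.
    for (; i < n; ++i) {
        unsigned s = unsigned(a[i]) + unsigned(b[i]);
        out[i] = s > 255 ? 255 : uint8_t(s);
    }
}
```

Any compiler for the base instruction set builds the portable path; only a toolchain that defines `__ARM_NEON` compiles the accelerated path, which is why vendor-customized tool chains matter.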

Heterogeneous software development in an integrated development environment

Because vision applications often require a mix of processing architectures, the development tools become more complicated: they must handle multiple instruction sets and the additional debugging challenges of a heterogeneous system. Most vendors therefore provide a suite of tools that integrates these development tasks into a single interface, simplifying software development and testing.
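As one example of such an integrated approach, the sketch below uses OpenCV's transparent OpenCL API, which lets a single code base run on the host CPU or offload work to an OpenCL-capable accelerator at run time. OpenCV and an OpenCL runtime are assumptions here, not tools named in the text.

```cpp
// Sketch of heterogeneous dispatch from a single code base using
// OpenCV's transparent OpenCL API (T-API): the same GaussianBlur call
// runs on the CPU, or is offloaded to a GPU/DSP OpenCL device when the
// vendor's runtime is present. Assumes OpenCV built with OpenCL support.
#include <opencv2/opencv.hpp>
#include <opencv2/core/ocl.hpp>
#include <iostream>

int main() {
    if (cv::ocl::haveOpenCL()) {
        cv::ocl::setUseOpenCL(true);      // enable accelerator offload
        std::cout << "offloading to an OpenCL device\n";
    } else {
        std::cout << "no OpenCL runtime; running on the host CPU\n";
    }

    cv::Mat input = cv::Mat::zeros(1080, 1920, CV_8UC3); // stand-in frame
    cv::UMat src, dst;
    input.copyTo(src);                    // UMat may live in device memory

    cv::GaussianBlur(src, dst, cv::Size(5, 5), 1.5);     // dispatched per device

    cv::Mat result = dst.getMat(cv::ACCESS_READ);        // map back to the host
    std::cout << "processed " << result.cols << "x" << result.rows << " frame\n";
    return 0;
}
```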

“Temporal Event Neural Networks: A More Efficient Alternative to the Transformer,” a Presentation from BrainChip

Chris Jones, Director of Product Management at BrainChip, presents the “Temporal Event Neural Networks: A More Efficient Alternative to the Transformer” tutorial at the May 2024 Embedded Vision Summit. The expansion of AI services necessitates enhanced computational capabilities on edge devices. Temporal Event Neural Networks (TENNs), developed by BrainChip…


How Edge Devices Can Help Mitigate the Global Environmental Cost of Generative AI

This blog post was originally published at Qualcomm’s website. It is reprinted here with the permission of Qualcomm. Exploring the role of edge devices in reducing energy consumption and promoting sustainability in AI systems. The economic value of generative artificial intelligence (AI) to the world is immense. Research from McKinsey estimates that generative AI could add the…


“Silicon Slip-Ups: The Ten Most Common Errors Processor Suppliers Make (Number Four Will Amaze You!),” a Presentation from BDTI

Phil Lapsley, Co-founder and Vice President of BDTI, presents the “Silicon Slip-Ups: The Ten Most Common Errors Processor Suppliers Make (Number Four Will Amaze You!)” tutorial at the May 2024 Embedded Vision Summit. For over 30 years, BDTI has provided engineering, evaluation and advisory services to processor suppliers and companies…


Upcoming Webinar Explores AI for Audio

On June 25, 2024 at 2:00 pm PT (5:00 pm ET), Alliance Member company MACSO Technologies will deliver the free webinar “AI for Audio.” From the event page: Come join the MACSO team and our partner community to dive into the intricacies of AI for audio. MACSO has a wealth of experience to share in…


“How Axelera AI Uses Digital Compute-in-memory to Deliver Fast and Energy-efficient Computer Vision,” a Presentation from Axelera AI

Bram Verhoef, Head of Machine Learning at Axelera AI, presents the “How Axelera AI Uses Digital Compute-in-memory to Deliver Fast and Energy-efficient Computer Vision” tutorial at the May 2024 Embedded Vision Summit. As artificial intelligence inference transitions from cloud environments to edge locations, computer vision applications achieve heightened responsiveness, reliability…


Power Cloud-native Microservices at the Edge with NVIDIA JetPack 6.0, Now GA

This article was originally published at NVIDIA’s website. It is reprinted here with the permission of NVIDIA. NVIDIA JetPack SDK powers NVIDIA Jetson modules, offering a comprehensive solution for building end-to-end accelerated AI applications. JetPack 6 expands the Jetson platform’s flexibility and scalability with microservices and a host of new features. It’s the most downloaded…


“How Arm’s Machine Learning Solution Enables Vision Transformers at the Edge,” a Presentation from Arm

Stephen Su, Senior Segment Marketing Manager at Arm, presents the “How Arm’s Machine Learning Solution Enables Vision Transformers at the Edge” tutorial at the May 2024 Embedded Vision Summit. AI at the edge has been transforming over the last few years, with newer use cases running more efficiently and securely…


