Development Tools for Embedded Vision
ENCOMPASSING MOST OF THE STANDARD ARSENAL USED FOR DEVELOPING REAL-TIME EMBEDDED PROCESSOR SYSTEMS
The software tools (compilers, debuggers, operating systems, libraries, etc.) encompass most of the standard arsenal used for developing real-time embedded processor systems, supplemented by specialized vision libraries and, in some cases, vendor-specific development tools. On the hardware side, the requirements depend on the application space, since the designer may need equipment for monitoring and testing real-time video data. Most of these hardware development tools are already used in other types of video system design.
Both general-purpose and vendor-specific tools
Many vendors of vision devices use integrated CPUs based on the same instruction set (ARM, x86, etc.), allowing a common set of software development tools. However, even though the base instruction set is the same, each CPU vendor integrates a different set of peripherals with unique software interface requirements. In addition, most vendors accelerate the CPU with specialized computing devices (GPUs, DSPs, FPGAs, etc.). This extended CPU programming model requires a customized version of standard development tools. Most CPU vendors develop their own optimized software tool chain, while also working with third-party software tool suppliers to ensure that the CPU components are broadly supported.
Heterogeneous software development in an integrated development environment
Since vision applications often require a mix of processing architectures, the development tools become more complicated and must handle multiple instruction sets and additional system debugging challenges. Most vendors provide a suite of tools that integrate development tasks into a single interface for the developer, simplifying software development and testing.