
Development Tools for Embedded Vision

ENCOMPASSING MOST OF THE STANDARD ARSENAL USED FOR DEVELOPING REAL-TIME EMBEDDED PROCESSOR SYSTEMS

The software tools (compilers, debuggers, operating systems, libraries, etc.) encompass most of the standard arsenal used for developing real-time embedded processor systems, with the addition of specialized vision libraries and, in some cases, vendor-specific development tools. On the hardware side, the requirements depend on the application space, since the designer may need equipment for monitoring and testing real-time video data. Most of these hardware development tools are already used for other types of video system design.
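
As a concrete illustration of the software side, the sketch below is a minimal, hypothetical example (it assumes OpenCV is installed on the target and a camera is available at index 0, and stands in for any vendor's camera and vision-library stack). It captures live frames and runs a basic vision-library operation on each one, the kind of loop that the standard embedded toolchain plus a vision library makes straightforward to build, test, and debug.

// A minimal, hypothetical sketch: assumes OpenCV on the target and a camera
// at index 0; not tied to any particular vendor's stack.
#include <opencv2/opencv.hpp>
#include <iostream>

int main() {
    cv::VideoCapture cap(0);                 // open the default camera
    if (!cap.isOpened()) {
        std::cerr << "Camera not available\n";
        return 1;
    }
    cv::Mat frame, gray, edges;
    for (int i = 0; i < 100; ++i) {          // bounded loop over live frames
        cap >> frame;                        // grab one frame of real-time video
        if (frame.empty()) break;
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        cv::Canny(gray, edges, 50, 150);     // basic vision-library operation
        std::cout << "frame " << i << ": " << cv::countNonZero(edges)
                  << " edge pixels\n";
    }
    return 0;
}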

Both general-purpose and vendor-specific tools

Many vendors of vision devices use integrated CPUs based on the same instruction set (ARM, x86, etc.), allowing a common set of software development tools. However, even though the base instruction set is the same, each CPU vendor integrates a different set of peripherals that have unique software interface requirements. In addition, most vendors accelerate the CPU with specialized computing devices (GPUs, DSPs, FPGAs, etc.). This extended CPU programming model requires a customized version of standard development tools. Most CPU vendors develop their own optimized software tool chain, while also working with third-party software tool suppliers to make sure that the CPU components are broadly supported.
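
To make that heterogeneity concrete, the short C++ sketch below is a hypothetical example (it assumes the vendor ships an OpenCL runtime and is not tied to any particular toolchain). It enumerates the compute devices a platform exposes, which is typically the first step before a common codebase is partitioned across the CPU and its accelerators.

// A hedged sketch: assumes an OpenCL runtime is installed; it only lists the
// compute devices (CPU, GPU, DSP/FPGA accelerators) a platform exposes.
#include <CL/cl.h>
#include <cstdio>
#include <vector>

int main() {
    cl_uint num_platforms = 0;
    clGetPlatformIDs(0, nullptr, &num_platforms);
    std::vector<cl_platform_id> platforms(num_platforms);
    clGetPlatformIDs(num_platforms, platforms.data(), nullptr);

    for (cl_platform_id p : platforms) {
        cl_uint num_devices = 0;
        clGetDeviceIDs(p, CL_DEVICE_TYPE_ALL, 0, nullptr, &num_devices);
        std::vector<cl_device_id> devices(num_devices);
        clGetDeviceIDs(p, CL_DEVICE_TYPE_ALL, num_devices, devices.data(), nullptr);

        for (cl_device_id d : devices) {
            char name[256] = {0};
            cl_device_type type = 0;
            clGetDeviceInfo(d, CL_DEVICE_NAME, sizeof(name), name, nullptr);
            clGetDeviceInfo(d, CL_DEVICE_TYPE, sizeof(type), &type, nullptr);
            std::printf("%s (%s)\n", name,
                        (type & CL_DEVICE_TYPE_GPU)         ? "GPU" :
                        (type & CL_DEVICE_TYPE_ACCELERATOR) ? "accelerator (e.g. DSP/FPGA)" :
                                                              "CPU");
        }
    }
    return 0;
}

A vendor's customized toolchain typically layers device-specific compilers, profilers, and debug hooks on top of a standard API of this kind.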

Heterogeneous software development in an integrated development environment

Since vision applications often require a mix of processing architectures, the development tools become more complicated and must handle multiple instruction sets and additional system debugging challenges. Most vendors provide a suite of tools that integrate development tasks into a single interface for the developer, simplifying software development and testing.
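
One pattern such integrated tool suites commonly support is a single codebase that transparently offloads work to an accelerator when one is present. The sketch below is a minimal example (it assumes an OpenCV build with OpenCL support and is not any specific vendor's tool flow); it uses OpenCV's UMat so the same calls run on the host CPU or on an attached OpenCL device, leaving the vendor tools to handle cross-compilation, profiling, and debugging of both sides.

// A minimal sketch: assumes OpenCV built with OpenCL support; the same source
// runs on the host CPU or is offloaded to an attached accelerator.
#include <opencv2/opencv.hpp>
#include <opencv2/core/ocl.hpp>
#include <iostream>

int main() {
    // Report whether an OpenCL-capable accelerator was detected.
    std::cout << "OpenCL available: "
              << (cv::ocl::haveOpenCL() ? "yes" : "no") << "\n";

    cv::UMat input(1080, 1920, CV_8UC3, cv::Scalar(64, 96, 128));
    cv::UMat gray, blurred;

    // With UMat, OpenCV dispatches these calls to the accelerator when
    // possible and falls back to the CPU otherwise; the source is unchanged.
    cv::cvtColor(input, gray, cv::COLOR_BGR2GRAY);
    cv::GaussianBlur(gray, blurred, cv::Size(5, 5), 1.5);

    std::cout << "Processed " << blurred.cols << "x" << blurred.rows
              << " frame\n";
    return 0;
}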

AiM Future Brings GenAI Applications to Mainstream Consumer Devices

Seoul, Korea, and San Jose, CA – May 15, 2024 – AiM Future, a leading provider of concurrent multimodal inference accelerators for edge and endpoint devices, has just announced the launch of its next-generation Generative AI Architecture, “GAIA,” and Synabro software development kit. These GAIA-based accelerators are designed to enable energy-efficient transformers and large language

Read More »

The SHD Group, in Collaboration with the Edge AI and Vision Alliance™, to Release a Complimentary Edge AI Processor and Ecosystem Report

SAN JOSE, Calif. – May 14, 2024 – The SHD Group, a leading strategic marketing, research, and business development firm, today announced the creation of an edge AI report that will be a resource for both product developers and ecosystem providers. This guide will detail processors integrating AI accelerators, standalone acceleration chips, accelerator IP, software,

Read More »

Morpho and DOOGEE Announce Strategic Cooperation for Smartphone Imaging Technology

Tokyo, Japan – May 14th, 2024 – Morpho, Inc. (hereinafter “Morpho”), a global leader in image processing and imaging AI solutions, announced today that it has forged a strategic partnership with Shenzhen DOOGEE Hengtong Technology CO., LTD (hereinafter “DOOGEE”), a leading global manufacturer of mobile phone terminals. Through this partnership, Morpho provided image processing

Read More »

Fully Sharded Data Parallelism (FSDP)

This blog post was originally published at CLIKA’s website. It is reprinted here with the permission of CLIKA. In this blog we will explore Fully Sharded Data Parallelism (FSDP), which is a technique that allows for the training of large Neural Network models in a distributed manner efficiently. We’ll examine FSDP from a bird’s eye

Read More »

Avnet to Exhibit at the 2024 Embedded Vision Summit

05/09/2024 – PHOENIX – Avnet’s exhibit plans for the 2024 Embedded Vision Summit include new development kits supporting AI applications. The summit is the premier event for practical, deployable computer vision and edge AI, for product creators who want to bring visual intelligence to products. This year’s Summit will be May 21-23, in Santa Clara, California. This

Read More »

Beyond Smart: The Rise of Generative AI Smartphones

This blog post was originally published at Qualcomm’s website. It is reprinted here with the permission of Qualcomm. From live translations to personalized content management — the new era of mobile intelligence It’s 2024, and generative artificial intelligence (AI) is finally in people’s hands. Literally. This year’s early slate of flagship smartphone releases is a

Read More »

Lattice to Showcase Advanced Edge AI Solutions at Embedded Vision Summit 2024

May 08, 2024 04:00 PM Eastern Daylight Time–HILLSBORO, Ore.–(BUSINESS WIRE)–Lattice Semiconductor (NASDAQ: LSCC), the low power programmable leader, today announced that it will showcase its latest FPGA technology at Embedded Vision Summit 2024. The Lattice booth will feature industry-leading low power, small form factor FPGAs and application-specific solutions enabling advanced embedded vision, artificial intelligence, and

Read More »

