Development Tools for Embedded Vision
ENCOMPASSING MOST OF THE STANDARD ARSENAL USED FOR DEVELOPING REAL-TIME EMBEDDED PROCESSOR SYSTEMS
The software tools (compilers, debuggers, operating systems, libraries, etc.) encompass most of the standard arsenal used for developing real-time embedded processor systems, while adding specialized vision libraries and possibly vendor-specific development tools. On the hardware side, the requirements depend on the application space, since the designer may need equipment for monitoring and testing real-time video data. Most of these hardware development tools are already used for other types of video system design.
Both general-purpose and vendor-specific tools
Many vendors of vision devices use integrated CPUs based on the same instruction set (ARM, x86, etc.), allowing a common set of tools for software development. However, even though the base instruction set is the same, each CPU vendor integrates a different set of peripherals that have unique software interface requirements. In addition, most vendors accelerate the CPU with specialized computing devices (GPUs, DSPs, FPGAs, etc.). This extended CPU programming model requires a customized version of standard development tools. Most CPU vendors develop their own optimized software tool chain, while also working with third-party software tool suppliers to make sure that the CPU components are broadly supported.
Heterogeneous software development in an integrated development environment
Since vision applications often require a mix of processing architectures, the development tools become more complicated and must handle multiple instruction sets and additional system debugging challenges. Most vendors provide a suite of tools that integrate development tasks into a single interface for the developer, simplifying software development and testing.
Benchmarking Camera Performance on Your Workstation with NVIDIA Isaac Sim
This blog post was originally published at NVIDIA’s website. It is reprinted here with the permission of NVIDIA. Robots are typically equipped with cameras. When designing a digital twin simulation, it’s important to accurately replicate the camera’s performance in the simulated environment. However, to make sure the simulation runs smoothly, it’s crucial to check the performance
Quantization of Convolutional Neural Networks: Model Quantization
See “From Theory to Practice: Quantizing Convolutional Neural Networks for Practical Deployment” for the previous article in this series. Significant progress in Convolutional Neural Networks (CNNs) has focused on enhancing model complexity while managing computational demands. Key advancements include efficient architectures like MobileNet [1], SqueezeNet [2], ShuffleNet [3], and DenseNet [4], which prioritize compute and memory efficiency. Further innovations
Visidon at Mobile World Congress 2024
Visidon will exhibit at Mobile World Congress 2024, the world’s largest mobile exhibition, in Barcelona, Spain, from February 26 to 29. Visidon’s booth will showcase its technological advancements since last year. You can find us at the Finland Pavilion in Hall 5, Booth 5J45. MWC Barcelona is the largest and most influential event for the
CES 2024 Showed that the Future of Cars will be Defined by AI
IDTechEx’s new report, “Future Automotive Technologies 2024-2034: Applications, Megatrends, Forecasts”, highlights the biggest changes coming to cars over the next ten years. Electrification will change what powers cars, and automation will change how they are driven, but one of the biggest opportunities is connectivity and software definition, which will change how cars are monetized. This
Generative AI in 2024: The 6 Most Important Consumer Tech Trends
This blog post was originally published at Qualcomm’s website. It is reprinted here with the permission of Qualcomm. Qualcomm executives reveal key trends in AI, consumer technology and more for the future. Not that long ago, the banana-peel-and-beer-fueled DeLorean in “Back to the Future” was presented as comedy. Yet today, 10% of cars are electric-powered.
Adapting Strategies: Industrial Machine Vision’s Response to Short-term Challenges
This market research report was originally published at the Yole Group’s website. It is reprinted here with the permission of the Yole Group. In 2023, the total machine vision market generated $6.9 billion in revenue. When will growth return? Outline: industrial machine vision, a $7.8 billion market in 2029 compared to $6.9 billion in 2023.
Do You Know When You Don’t Know Something?
This blog post was originally published at Digica’s website. It is reprinted here with the permission of Digica. We all have that one uncle in the family who knows the answer to every question. Even if that answer is often wrong. Let’s call him Uncle Bob. From politics to science, he confidently shares his opinion
How Does LiDAR Work In Detail?
This blog post was originally published at Outsight’s website. It is reprinted here with the permission of Outsight. 3D LiDAR is a complex technology that enables unprecedented spatial intelligence. Many engineering choices are possible when building a new device. This article is a continuation of the basic introduction to LiDAR technology that was provided in