Robotics Applications for Embedded Vision
Macnica Americas Announces Agreement with Sony to Distribute Sony Image Sensors
Will Enable Next-Gen Imaging Solutions Across AI, Robotics, Healthcare, and Beyond SOLANA BEACH, CALIFORNIA, December 3, 2024 – Macnica Americas today announced an expansion of its strategic agreement with Sony: it will now distribute Sony's cutting-edge image sensors throughout North, Central, and South America. This collaboration further enhances our extensive imaging and vision solutions offering that
NVIDIA Advances Robot Learning and Humanoid Development With New AI and Simulation Tools
This blog post was originally published at NVIDIA’s website. It is reprinted here with the permission of NVIDIA. New Project GR00T workflows and AI world model development technologies to accelerate robot dexterity, control, manipulation and mobility. Robotics developers can greatly accelerate their work on AI-enabled robots, including humanoids, using new AI and simulation tools and
Dietmar Ley Elected Chairman of the VDMA Robotics + Automation Association
Ley will hold the honorary position for three years and emphasizes the role of the committee as a strong voice for European robotics and automation suppliers Ahrensburg, November 22, 2024 – Dr. Dietmar Ley, CEO of Basler AG, has been elected as the new Chairman of the VDMA Robotics + Automation Association. Ley has been
How NVIDIA Jetson AGX Orin Helps Unlock the Power of Surround-view Camera Solutions
This blog post was originally published at e-con Systems’ website. It is reprinted here with the permission of e-con Systems. Autonomous vehicles, such as warehouse robots, rely on precise maneuvering. NVIDIA Jetson AGX Orin™-powered surround-view cameras provide a perfectly synchronized solution, allowing these robots to move freely within designated areas without requiring intensive manual intervention.
Synthetic Data is Revolutionizing Sensor Tech: Real Results from Virtual Worlds
This blog post was originally published at Geisel Software’s website. It is reprinted here with the permission of Geisel Software. Imagine you’re a developer on your first day at a new job. You’re handed a state-of-the-art sensor designed to capture data for an autonomous vehicle. The excitement quickly turns to anxiety as you realize the
Robotic Independence in Logistics
Mobile robots work throughout the logistics supply chain, from manufacturing processes to reaching consumers and shops. IDTechEx's report, "Mobile Robotics in Logistics, Warehousing and Delivery 2024-2044", provides an in-depth overview of the role of new robot technologies at each stage of the chain, highlighting mobility, flexibility, and scalability as
Computer Vision Integration in Robotic Applications: Real-world Insights
This blog post was originally published at Geisel Software’s website. It is reprinted here with the permission of Geisel Software. Imagine a world where computer vision integration allows robots not only to ‘do’ but also to ‘see’ and ‘understand’. Welcome to the Fourth Industrial Revolution (Industry 4.0)! Here, the fusion of artificial intelligence and industrial robotics is sparking
Collaborative Robots 2025-2045: Technologies, Players, and Markets
For more information, visit https://www.idtechex.com/en/research-report/collaborative-robots-2025-2045-technologies-players-and-markets/1046. Collaborative robots, lightweight and slow-moving robots designed to work next to human operators without a physical fence, have gained significant momentum thanks to their flexibility and to initiatives such as bringing humans back to factories, Industry 5.0, Made in China 2025, and many announcements from leading companies of
Robotics Industry to Recover After 2024 Slowdown: Revenue to Jump by 58% by End of Decade
The global demand for robotics has been shifting back and forth in recent years, with a pandemic-driven sales boom followed by stagnation in 2024. However, after a year of stagnation, the entire industry is set to witness a significant recovery, driving double-digit revenue growth by the end of the decade. According to data presented
D3 Embedded Introduces Camera Modules Based on Valens Semiconductor’s VA7000 MIPI A-PHY Chipsets
The integration of MIPI A-PHY into DesignCore® Series Cameras will accelerate time-to-market for customers developing performance-critical products for robotics, industrial vehicles, and other embedded vision applications. Rochester, NY – October 8th, 2024 – D3 Embedded, a global leader in embedded vision systems design and manufacturing, today announced that it has partnered with Valens Semiconductor, a
Europe Has the Fastest-Growing Robotics Industry; Revenue to Surge by 68% and Hit $28.8 Billion by 2029
Although the United States and China have been in a tech race for decades, battling for supremacy in critical areas of technology that will shape the future of the global economy, including artificial intelligence and robotics, both countries trail Europe when it comes to robotics sector growth. According to data presented
Waveye Demonstration of Ultra-high Resolution LIR Imaging Technology
Levon Budagyan, CEO of Waveye, demonstrates the company’s latest edge AI and vision technologies and products at the September 2024 Edge AI and Vision Alliance Forum. Specifically, Budagyan demonstrates his company’s ultra-high resolution LIR imaging technology, focused on key robotics applications: robust, privacy-preserving human detection for robots; 4D microwave imaging in high resolution for indoor
“Future Radar Technologies and Applications,” a Presentation from IDTechEx
James Jeffs, Senior Technology Analyst at IDTechEx, presents the “Future Radar Technologies and Applications” tutorial at the May 2024 Embedded Vision Summit. Radar has value in a wide range of industries that are embracing automation, from delivery drones to agriculture, each requiring different performance attributes. Autonomous vehicles are perhaps one… “Future Radar Technologies and Applications,”
Using Generative AI to Enable Robots to Reason and Act with ReMEmbR
This blog post was originally published at NVIDIA’s website. It is reprinted here with the permission of NVIDIA. Vision-language models (VLMs) combine the powerful language understanding of foundational LLMs with the vision capabilities of vision transformers (ViTs) by projecting text and images into the same embedding space. They can take unstructured multimodal data, reason over
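The "same embedding space" idea behind VLMs can be sketched numerically. The following is a toy illustration only (random projection matrices standing in for a trained ViT image encoder and LLM text encoder, not any real model): each modality is projected into a shared space and normalized, after which image-text relevance reduces to a dot product.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for trained encoders: each projects its modality's
# features into a shared 4-d embedding space.
W_image = rng.normal(size=(8, 4))  # maps 8-d "image features"
W_text = rng.normal(size=(6, 4))   # maps 6-d "text features"

def embed(features, projection):
    """Project features into the shared space and L2-normalize."""
    v = features @ projection
    return v / np.linalg.norm(v)

image_vec = embed(rng.normal(size=8), W_image)
text_vec = embed(rng.normal(size=6), W_text)

# Both vectors now live in the same space, so relevance is just a
# dot product (cosine similarity, since both are unit-normalized).
similarity = float(image_vec @ text_vec)
print(f"image-text similarity: {similarity:.3f}")
```

In a real VLM the projections are learned so that matching image-text pairs score high and mismatched pairs score low; the mechanics of comparing the two modalities are exactly this dot product.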
“Building Meaningful Products Using Complex Sensor Systems,” a Presentation from DEKA Research & Development
Dirk van der Merwe, Autonomous Robotics Lead at DEKA Research & Development, presents the “Building Meaningful Products Using Complex Sensor Systems” tutorial at the May 2024 Embedded Vision Summit. Most complex sensor systems begin with a simple goal—ensuring safety and efficiency. Whether it’s avoiding collisions between vehicles or predicting future… “Building Meaningful Products Using Complex
“Better Farming through Embedded AI,” a Presentation from Blue River Technology
Chris Padwick, Director of Computer Vision Machine Learning at Blue River Technology, presents the “Better Farming through Embedded AI” tutorial at the May 2024 Embedded Vision Summit. Blue River Technology, a subsidiary of John Deere, uses computer vision and deep learning to build intelligent machines that help farmers grow more… “Better Farming through Embedded AI,”
Inuitive Demonstration of the M4.51 Depth and AI Sensor Module Based on the NU4100 Vision Processor
Shay Harel, field application engineer at Inuitive, demonstrates the company’s latest edge AI and vision technologies and products at the 2024 Embedded Vision Summit. Specifically, Harel demonstrates the capabilities of his company’s M4.51 sensor module using a simple Python script that leverages Inuitive’s API for real-time object detection. The M4.51 sensor module, based on the
Analog Devices Demonstration of the MAX78000 Microcontroller Enabling Edge AI in a Robotic Arm
Navdeep Dhanjal, Executive Business and Product Manager for AI microcontrollers at Analog Devices, demonstrates the company’s latest edge AI and vision technologies and products at the 2024 Embedded Vision Summit. Specifically, Dhanjal demonstrates visual servoing in a robotic arm enabled by the MAX78000 AI microcontroller. The MAX78000 is an Arm Cortex-M4F microcontroller with a hardware-based convolutional
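Visual servoing closes a control loop on what the camera sees. As a minimal sketch of the principle (a simulated proportional controller with made-up gains and frame geometry, not the Analog Devices demo code), the loop repeatedly measures the pixel offset between a detected target and the image center and commands motion that shrinks that error:

```python
import numpy as np

# Hypothetical setup: 128x128 frame, target should end up at center.
TARGET_CENTER = np.array([64.0, 64.0])
GAIN = 0.5  # proportional gain (assumed value for illustration)

def servo_step(detected_px, camera_offset):
    """One control step: shift the camera opposite to the pixel error."""
    error = detected_px - TARGET_CENTER
    return camera_offset - GAIN * error, error

# Simulate: a fixed world point appears in the image shifted by the
# current camera offset; the loop drives that offset until the point
# lands at the image center.
world_px = np.array([90.0, 40.0])
offset = np.zeros(2)
for _ in range(20):
    detected = world_px + offset
    offset, error = servo_step(detected, offset)

residual = np.linalg.norm(world_px + offset - TARGET_CENTER)
print(f"final pixel error: {residual:.5f}")
```

In the demo the "detected pixel" would come from a CNN running on the microcontroller's accelerator rather than a simulated point, but the control structure (measure pixel error, command motion to reduce it) is the same.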