Robotics Applications for Embedded Vision
Ready for Air: Welcome to a New Dawn for Aerial Autonomy
This blog post was originally published at Opteran Technologies’ website. It is reprinted here with the permission of Opteran Technologies. Opteran enables robust, fast, GPS-free, verifiable autonomy for aerial systems, at previously unimaginable size, weight (~30g), power (~3W) and hardware costs, using consumer 2D cameras. Today we’re happy to share a first glance at our
Inuitive Demonstration of Its Multi-core Processor For 3D Imaging, Deep Learning and Computer Vision
Dor Zepeniuk, Chief Technology Officer and Vice President of Products at Inuitive, demonstrates the company’s latest edge AI and vision technologies and products at the 2021 Embedded Vision Summit. Specifically, Zepeniuk demonstrates the company’s NU4000, a multi-core processor for 3D imaging, deep learning and computer vision. Zepeniuk demos the diverse edge computing and other capabilities
eYs3D Microelectronics Demonstration of Stereo Vision for Robotic Automation and Depth Sensor Fusion
James Wang, Technical General Manager at eYs3D Microelectronics, demonstrates the company’s latest edge AI and vision technologies and products at the 2021 Embedded Vision Summit. Specifically, Wang demonstrates stereo vision for robotic automation and depth sensor fusion. Depth-sensing technology is now being widely adopted commercially in various consumer and industrial products. It’s commonly recognized that
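Stereo depth sensing of the kind eYs3D demonstrates recovers distance by triangulating the per-pixel disparity between two calibrated cameras. As a rough sketch of the underlying pinhole-stereo relationship (the focal length, baseline, and disparity values below are illustrative, not eYs3D product specifications):

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Pinhole stereo model: depth Z = f * B / d.

    focal_px     -- focal length in pixels (from camera calibration)
    baseline_m   -- distance between the two camera centers, in meters
    disparity_px -- horizontal pixel shift of a feature between the two views
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive; zero disparity implies infinite depth")
    return focal_px * baseline_m / disparity_px

# Example: 700 px focal length, 6 cm baseline, 14 px disparity -> 3.0 m
print(depth_from_disparity(700.0, 0.06, 14.0))
```

The inverse relationship between disparity and depth is why stereo accuracy degrades quadratically with distance, and why baseline and resolution choices dominate depth-camera design.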
ADLINK Releases its First SMARC Module Based on Qualcomm QRB5165 Enabling High Performance Robots and Drones at Low Power
Integrated IoT technologies provide on-device AI capabilities at the edge Summary: The LEC-RB5 SMARC is a high-performance module, built with the Qualcomm® QRB5165 processor, allowing on-device AI and 5G connectivity capabilities for consumer, enterprise and industrial robots. It features a high-performance NPU, octa-core (8x Arm Cortex-A77 cores) CPU, low power consumption and support for
Synopsys Demonstration of SLAM Acceleration on DesignWare ARC EV7x Processors
Liliya Tazieva, Software Engineer at Synopsys, demonstrates the company’s latest edge AI and vision technologies and products at the 2021 Embedded Vision Summit. Specifically, Tazieva demonstrates simultaneous localization and mapping (SLAM) acceleration on Synopsys’ DesignWare ARC EV7x processors. SLAM creates and updates a map of an unknown environment while at the same time keeping track
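At its core, SLAM alternates between predicting the robot’s pose from motion commands and refining a landmark map from sensor observations. A minimal dead-reckoning sketch of that loop in pure Python (illustrative only, and unrelated to Synopsys’ EV7x implementation):

```python
import math

def predict_pose(pose, v, w, dt):
    """Motion update: advance (x, y, heading) given linear speed v and turn rate w."""
    x, y, th = pose
    return (x + v * math.cos(th) * dt, y + v * math.sin(th) * dt, th + w * dt)

def landmark_position(pose, rng, bearing):
    """Project a range/bearing observation into world coordinates."""
    x, y, th = pose
    return (x + rng * math.cos(th + bearing), y + rng * math.sin(th + bearing))

def update_map(lmap, lm_id, obs):
    """Map update: keep a running average of repeated observations of a landmark."""
    if lm_id in lmap:
        (px, py), n = lmap[lm_id]
        lmap[lm_id] = (((px * n + obs[0]) / (n + 1),
                        (py * n + obs[1]) / (n + 1)), n + 1)
    else:
        lmap[lm_id] = (obs, 1)
```

A real SLAM pipeline replaces this averaging with a probabilistic filter or pose-graph optimizer that also corrects the pose estimate from the map; the sketch only shows the predict/observe/update structure that accelerators like the EV7x are designed to speed up.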
D3 Engineering Demonstration of a Synchronized 2D/3D Time-of-Flight Multi-camera Application on NVIDIA’s Jetson AGX Xavier
Jason Enslin, Product Manager for NVIDIA Jetson and Cameras at NVIDIA partner D3 Engineering, demonstrates the company’s latest edge AI and vision technologies and products at the 2021 Embedded Vision Summit. Specifically, Enslin demonstrates a synchronized 2D and 3D time-of-flight multi-camera application on NVIDIA’s Jetson AGX Xavier. With the growing need for video, distance measurement
Connect Tech Demonstration of Integrating Vision Sensors with NVIDIA’s Jetson Edge Applications
Patrick Dietrich, Chief Technology Officer of NVIDIA partner Connect Tech, demonstrates the company’s latest edge AI and vision technologies and products at the 2021 Embedded Vision Summit. Specifically, Dietrich demonstrates how to integrate vision sensors with NVIDIA’s Jetson edge applications. Since the inception of NVIDIA’s Jetson platform, the capabilities of edge AI and vision-enabled systems
NVIDIA Announces Omniverse Replicator Synthetic-Data-Generation Engine for Training AIs
First Omniverse Replicator-Based Applications, DRIVE Sim and Isaac Sim, Accelerate Development of Autonomous Vehicles and Robots Tuesday, November 9, 2021—GTC—NVIDIA today announced NVIDIA Omniverse Replicator, a powerful synthetic-data-generation engine that produces physically simulated synthetic data for training deep neural networks. In its first implementations of the engine, the company introduced two applications for generating synthetic
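A key appeal of synthetic data generation is that the renderer knows the scene’s ground truth, so every image comes with exact labels “for free.” A toy, library-free sketch of the labeling side (the class names and scene parameters are invented for illustration and bear no relation to Omniverse Replicator’s actual API):

```python
import random

def synth_scene(width, height, n_objects, rng):
    """Generate a randomized scene description with perfect bounding-box labels."""
    classes = ["box", "pallet", "robot_arm"]  # illustrative class list
    labels = []
    for _ in range(n_objects):
        # Randomize object size and placement, keeping each box inside the frame.
        w = rng.randint(10, width // 2)
        h = rng.randint(10, height // 2)
        x0 = rng.randint(0, width - w)
        y0 = rng.randint(0, height - h)
        labels.append({"class": rng.choice(classes),
                       "bbox": (x0, y0, x0 + w, y0 + h)})
    return {"size": (width, height), "objects": labels}
```

Engines like Replicator do this with physically based rendering and domain randomization over lighting, materials and poses, but the principle is the same: the generator emits the annotations alongside the data.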
NVIDIA Sets Path for Future of Edge AI and Autonomous Machines With New Jetson AGX Orin Robotics Computer
NVIDIA Ampere Architecture Comes to Jetson, Delivering up to 200 TOPS and 6x Performance Gain Tuesday, November 9, 2021—GTC—NVIDIA today introduced NVIDIA Jetson AGX Orin™, the world’s smallest, most powerful and energy-efficient AI supercomputer for robotics, autonomous machines, medical devices and other forms of embedded computing at the edge. Built on the NVIDIA Ampere architecture,
May 2021 Embedded Vision Summit Slides
The Embedded Vision Summit was held online on May 25-28, 2021, as an educational forum for product creators interested in incorporating visual intelligence into electronic systems and software. The presentations delivered at the Summit are listed below. All of the slides from these presentations are included in PDF form. To…
“Optimizing ML Systems for Real-World Deployment,” a Presentation from iRobot
Danielle Dean, Technical Director of Machine Learning at iRobot, presents the “Optimizing ML Systems for Real-World Deployment” tutorial at the May 2021 Embedded Vision Summit. In the real world, machine learning models are components of a broader software application or system. In this talk, Dean explores the importance of optimizing the system as a whole–not
“Perspectives on AI Beyond Pattern Recognition,” an Alliance Interview with Pieter Abbeel
Professor Pieter Abbeel, Director of the Berkeley Robot Learning Lab and Co-Director of the Berkeley Artificial Intelligence Research (BAIR) Lab, talks with Jeff Bier, Founder of the Edge AI and Vision Alliance, for the “Perspectives on AI Beyond Pattern Recognition” interview at the May 2021 Embedded Vision Summit. See here for Abbeel’s keynote at the Summit,
“From Inference to Action: AI Beyond Pattern Recognition,” a Keynote Presentation from Pieter Abbeel
Professor Pieter Abbeel, Director of the Berkeley Robot Learning Lab and Co-Director of the Berkeley Artificial Intelligence Research (BAIR) Lab, presents the “From Inference to Action: AI Beyond Pattern Recognition” tutorial at the May 2021 Embedded Vision Summit. Pattern recognition—such as that used in image recognition, speech recognition and machine translation—has been the primary focus of
Unity Announces Support for ROS 2
ROS 2 Demo Showcases Autonomous Mobile Robot in Unity Simulation August 11, 2021 08:00 AM Eastern Daylight Time–SAN FRANCISCO–(BUSINESS WIRE)–Unity, the world’s leading platform for creating and operating real-time 3D (RT3D) content, today announced support for ROS 2 – the open-source robotics middleware suite from Open Robotics. Building on its support of ROS earlier this
eCapture from eYs3D Microelectronics Launches Smallest Form Factor 3D Stereo Depth Sensing Camera for Robotics and Object Tracking
eCapture™ Cameras leverage eYs3D stereo vision technology and a high-performance vision processor in this new line of devices SANTA CLARA, CA / ACCESSWIRE / August 23, 2021 / eCapture, a brand focused on developing technology for the growing computer vision market, today is introducing the smallest form factor stereoscopic 3D depth sensing camera. The new LifeSense™
Qualcomm Flight RB5 5G Platform — The World’s First 5G- and AI-enabled Drone Platform
This blog post was originally published at Qualcomm’s website. It is reprinted here with the permission of Qualcomm. Qualcomm Flight is now ready to transform industries with premium computing, camera, computer vision, and connectivity features. Qualcomm Technologies is a leader in inventing breakthroughs that power robotics and drone platforms. We help people connect and communicate
“5G and AI Transforming the Next Generation of Robotics,” a Presentation from Qualcomm
Kishore Chakravadhanula, Staff Product Manager at Qualcomm, presents the “5G and AI Transforming the Next Generation of Robotics” tutorial at the May 2021 Embedded Vision Summit. Bringing together the transformative power of 5G and AI technologies is essential to driving the next generation of high-compute, low-power robots and drones for consumer, enterprise and industrial sectors.
Robotic Industry: +$7.4B to the Industrial Sensor Market
This market research report was originally published at Yole Développement’s website. It is reprinted here with the permission of Yole Développement. OUTLINE: Market forecasts: Sensors for goods transportation will reach a US$1.3 billion market in 2026. Yole forecasts significant market uptake through the next decade, reaching US$7.4 billion in 2031. Sensors for robotic