
Hayk Martiros, Head of Autonomy at Skydio, presents the “Tackling Extreme Visual Conditions for Autonomous UAVs In the Wild” tutorial at the September 2020 Embedded Vision Summit.

Skydio ships autonomous robots that are flown at scale in complex, unknown environments every day to capture incredible video, automate dangerous inspections and save the lives of first responders. They must make decisions at high speed using only their onboard cameras and algorithms running on low-cost hardware. The company has invested five years of R&D into handling extreme visual scenarios not typically considered by academia or encountered by cars, ground-based robots or AR applications.

Drones are commonly used in scenes with few or no semantic priors on the environment and must deftly navigate in the presence of thin objects, extreme lighting, camera artifacts, motion blur, textureless surfaces, vibrations, dirt, smudges and fog. Because photometric signals are not consistent, these challenges are daunting for both classical vision approaches and unsupervised learning. And there is no ground truth for direct supervision. Martiros takes a detailed look at these issues and how his company tackled them to push autonomous flight into production.
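To see why photometric inconsistency is so damaging, consider the photometric reprojection error that both classical direct methods and unsupervised depth networks minimize. The sketch below is an illustrative simplification (not Skydio's implementation): it assumes brightness constancy, the very assumption that glare, fog, motion blur and lens smudges violate.

```python
import numpy as np

def photometric_error(ref, warped, mask=None):
    """Mean absolute intensity difference between a reference image
    and a neighboring frame warped into the reference viewpoint.

    Losses of this form assume brightness constancy: the same scene
    point has the same intensity in both frames. Extreme lighting,
    fog, or smudges break that assumption, so the loss stops being
    a reliable training or matching signal.
    """
    diff = np.abs(ref.astype(np.float64) - warped.astype(np.float64))
    if mask is not None:          # optionally ignore invalid pixels
        diff = diff[mask]
    return diff.mean()

# Toy example: identical frames yield zero error, while a global
# exposure change (one way brightness constancy fails) inflates it
# even though the geometry is unchanged.
ref = np.full((4, 4), 100.0)
print(photometric_error(ref, ref))        # 0.0
print(photometric_error(ref, ref * 1.2))  # 20.0
```

The second case is the crux: the error is large for reasons unrelated to depth or motion, which is why such scenes defeat both classical photometric alignment and self-supervised training.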

See here for a PDF of the slides.
