
Today marks an important step for Opteran Technologies as we announce the Opteran Development Kit version 2 (ODK-2) and take our first steps towards commercialising what we believe is a radically different approach to artificial intelligence. As you may have seen from a previous post by our Chief Scientific Officer, James Marshall, we are engineering AI very differently, using the concept of ‘Natural Intelligence.’ Inspired by the research into insect brains that James and I have been conducting, our goal is to translate our understanding of insect brains into a technology platform that enables all manner of machines to see, sense, navigate and decide.

The ODK-2 is an opportunity for select partners to get a first sense of why borrowing from nature is a far more efficient and powerful way to enable autonomy in machines. The kit integrates our Opteran See technology, which gives drones, robots and autonomous vehicles a stabilised, panoramic view of the world around them, just as an insect’s head does. Our Opteran Sense technology then calculates optic flow across the entire field of view, extracting collision signals that indicate whether the device is approaching an obstacle and needs to take corrective action, while also accurately estimating the device’s state (i.e. its 3D velocity).
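Opteran’s implementation is proprietary, but the general idea of deriving a collision signal from optic flow can be sketched in a few lines. In the toy example below (all names and values are illustrative, not Opteran’s API), a flow field that expands radially from a focus of expansion indicates that the camera is approaching a surface:

```python
import numpy as np

def looming_signal(flow, cx, cy):
    """Estimate a looming (expansion) signal from a dense optic-flow field.

    flow: (H, W, 2) array of per-pixel flow vectors (dx, dy).
    (cx, cy): assumed focus of expansion, e.g. the image centre.
    Returns the mean radial component of the flow: positive values mean
    the field is expanding, i.e. the camera is approaching a surface.
    """
    h, w, _ = flow.shape
    ys, xs = np.mgrid[0:h, 0:w]
    rx, ry = xs - cx, ys - cy                   # radial direction at each pixel
    norm = np.sqrt(rx**2 + ry**2) + 1e-9        # avoid divide-by-zero at centre
    radial = (flow[..., 0] * rx + flow[..., 1] * ry) / norm
    return radial.mean()

# Synthetic purely-expanding flow field: every vector points outward.
h, w = 64, 64
ys, xs = np.mgrid[0:h, 0:w]
flow = np.stack([(xs - w / 2) * 0.01, (ys - h / 2) * 0.01], axis=-1)
print(looming_signal(flow, w / 2, h / 2) > 0)  # prints True: expansion detected
```

A real system would of course work from flow estimated on camera frames and threshold the signal per image region, but the principle — expansion means approach — is the same one insects exploit.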

Why did we take this approach? It solves the biggest problems with existing AI solutions for autonomous navigation: size, cost and robustness. Conventional AI requires vast amounts of data, often processed in datacentres, in order to learn and make decisions, which makes it too expensive to run on low-cost devices and not sufficiently robust when it comes to autonomy.

Insects solve these problems very differently, requiring far less data to navigate, avoid obstacles and interpret their environment. As a result, our Dev Kit is an incredibly lightweight package: it weighs only 30g, draws a few watts of power and measures just 4x5x1cm. Everything needed is contained in the hardware, so there is no requirement for datacentre storage, because there is no need for extensive pre-training or pattern matching. Nor is there any need for banks of sensors or a network: the kit requires no wireless interfaces or ‘always-on’ connections to communicate with datacentres. It can be powered over a USB cable or similar interface, so it is quickly up and running at the edge, and its robust signal ensures reliability.

Consequently, developers of autonomous systems get a completely dependable, standalone AI platform for flight control in drones or movement control in ground-based robots – essentially anything that needs to evaluate how the world is moving around it and react accordingly. So if a human pilot or driver is moving something around, Opteran’s AI technology can help control the vehicle safely and simply, without requiring advanced piloting skills.

For instance, a drone operating on an oil rig might be blown off course by a gust of wind, but the motion sensing in the Dev Kit would detect the deviation and automatically correct it, allowing the operator to concentrate on manoeuvring the drone to where it needs to go. In this scenario, our collision detection capability could also react to keep the drone safe without the operator having to step in, and could even override the operator if he or she were about to do something dangerous with the drone. As we build out the use cases for our technology there will be multiple applications in the sky, over or under ground and underwater, of which this is just one example. Our mid-term goal is to move very quickly to full on-board autonomy that acts as a safety framework around a device, while simplifying control for the operator.
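To make the hold-and-correct behaviour concrete, here is a toy proportional controller in the same spirit (purely illustrative; this is not Opteran’s control law). The velocity estimated on board is compared against what the operator commanded, and any unintended drift generates an opposing correction:

```python
def correction_command(measured_velocity, commanded_velocity, gain=1.5):
    """Toy proportional drift correction (illustrative only).

    measured_velocity: per-axis velocity estimated from optic flow.
    commanded_velocity: per-axis velocity the operator asked for.
    Returns a per-axis acceleration command that opposes unintended drift.
    """
    return [gain * (cmd - meas)
            for cmd, meas in zip(commanded_velocity, measured_velocity)]

# A gust pushes the drone sideways (x-axis) while the pilot commands hover.
print(correction_command([2.0, 0.0, 0.0], [0.0, 0.0, 0.0]))
# → [-3.0, 0.0, 0.0]: push back against the gust.
```

The point of the example is only that closing the loop on a locally estimated velocity needs no datacentre, no map and no pilot skill – the same property the Dev Kit’s on-board state estimation provides.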

Today, we are looking to identify a select group of initial partners who are interested in evaluating our technology and understanding how it might integrate with applications and hardware seeking greater autonomy. The current Dev Kit is available now, and shortly we will provide a firmware upgrade adding Opteran Direct, our novel SLAM technology. The Dev Kit’s See technology will also be upgradable via compatibility with our next product, Eye, when it releases later this year. We are also happy to explore one-off licensing opportunities, so please do reach out to us via our website.

Alex Cope
Chief Technology Officer, Opteran Technologies
