
AiM Future Demonstration of Multi-modal Inference

Bob Allen, Vice President of Marketing and Business Development at AiM Future, demonstrates the company’s latest edge AI and vision technologies and products at the September 2022 Edge AI and Vision Innovation Forum. Specifically, Allen demonstrates the company’s Neuromosaic processor executing multi-modal inference.

The processor provides the flexibility to execute multiple deep learning models simultaneously. In this demonstration, it performs real-time object detection, classification, and segmentation on a series of images using the MobileNet v2 and Tiny YOLO v3 models.
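To make the multi-model workload concrete, the sketch below mimics the demo's pattern of running a classifier and a detector concurrently on the same frame. It is an illustrative assumption only: the Neuromosaic SDK is not public, so standard PyTorch/torchvision models (MobileNetV2 for classification, an SSDLite detector standing in for Tiny YOLO v3) and Python threads are used in place of the processor's actual toolchain and hardware scheduling.

```python
# Illustrative sketch only: this is NOT the Neuromosaic toolchain. It uses
# off-the-shelf torchvision models to approximate running a classifier and a
# detector "simultaneously" on one input frame.
import threading

import torch
from torchvision import models, transforms
from PIL import Image

# Load two pretrained models; eval() freezes dropout/batch-norm behavior.
classifier = models.mobilenet_v2(weights=models.MobileNet_V2_Weights.DEFAULT).eval()
detector = models.detection.ssdlite320_mobilenet_v3_large(
    weights=models.detection.SSDLite320_MobileNet_V3_Large_Weights.DEFAULT
).eval()

# Classification preprocessing: resize and convert to a CHW float tensor.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

def classify(image: Image.Image, results: dict) -> None:
    """Run image classification and record the top-1 class index."""
    with torch.no_grad():
        logits = classifier(preprocess(image).unsqueeze(0))
    results["class_id"] = int(logits.argmax(dim=1))

def detect(image: Image.Image, results: dict) -> None:
    """Run object detection and record the predicted bounding boxes."""
    with torch.no_grad():
        detections = detector([transforms.functional.to_tensor(image)])[0]
    results["boxes"] = detections["boxes"]

if __name__ == "__main__":
    frame = Image.open("frame.jpg").convert("RGB")  # hypothetical input frame
    results: dict = {}
    # Overlap the two networks with threads; dedicated edge hardware would
    # instead schedule them onto separate compute resources.
    threads = [
        threading.Thread(target=classify, args=(frame, results)),
        threading.Thread(target=detect, args=(frame, results)),
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print(sorted(results.keys()))  # -> ['boxes', 'class_id']
```

On general-purpose hardware the two models contend for the same compute, so threading mainly overlaps I/O and framework overhead; the point of a dedicated accelerator is that the concurrent models can genuinely run side by side.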

