Brian Dipert


Xilinx Showcases Vision Guided Machine Learning with reVISION at the Embedded Vision Summit 2017

SAN JOSE, Calif., April 24, 2017 – Xilinx, Inc. (NASDAQ: XLNX) will showcase its new reVISION™ stack for vision guided machine learning applications at the Embedded Vision Summit 2017. Through in-booth demonstrations and paper presentations, Xilinx and global channel distribution partner, Avnet, will show how Xilinx's tools, libraries and methodologies infuse machine learning, computer vision, […]



Revolutionary SoC for 3D Depth Extraction with a Single Camera: eWBM’s DR1152

eWBM Co. Ltd, a Korean fabless SoC company, is releasing its depth extraction SoC: DR1152. This device generates 3D depth information of an object, as well as RGB and IR images without using multiple sensors or an IR emitter. SEOUL, South Korea, April 24, 2017 – DR1152 generates the distance information of an object with



DesignCore Camera Mezzanine Board for Arrow DragonBoard 410c Speeds Development of Embedded Vision Applications

Direct MIPI CSI-2 access to camera data enables more realistic evaluation of lower power, higher performance embedded vision systems and proof-of-concept prototypes Rochester, NY – April 20, 2017 – D3 Engineering today announced its DesignCore™ Camera Mezzanine Board OV5640 for the Arrow DragonBoard™ 410c. D3 will demonstrate the camera mezzanine board and DragonBoard™ 410c Camera



The Internet of Things That See: Opportunities, Techniques and Challenges

This article was originally published at the 2017 Embedded World Conference. With the emergence of increasingly capable processors, image sensors, and algorithms, it's becoming practical to incorporate computer vision capabilities into a wide range of systems, enabling them to analyze their environments via video inputs. This article explores the opportunity for embedded vision, compares various



Imagination Showcases OpenVX with CNN Extensions on PowerVR GPUs at Embedded Vision Summit 2017

London, UK – April 19th, 2017 – Imagination Technologies (IMG.L) announces its participation at the annual Embedded Vision Summit 2017. Embedded Vision Summit showcases the latest applications, techniques, technologies and opportunities in computer vision and deep learning, along with demonstrations of the latest technologies that enable vision-based capabilities. Who:  Imagination, a global leader in multimedia,



VC Z Series: High-speed Smart Cameras for OEM Applications from Vision Components

April 7, 2017 – The powerful, extremely small VC Z embedded vision systems from Vision Components offer real-time image processing suitable for all kinds of demanding applications. All models are equipped with a Xilinx Zynq SoC, which combines a dual-core ARM Cortex-A9 running at 866 MHz with an integrated FPGA. Their extremely compact design and very low power consumption



Imagination at the Embedded Vision Summit 2017

This blog post was originally published at Imagination Technologies' website. It is reprinted here with the permission of Imagination Technologies. The Imagination PowerVR team are busily preparing for the Embedded Vision Summit 2017 (EVS), taking place from 1-3 May in Santa Clara. EVS is a great industry event for all those involved with vision and surrounding technologies. From


Intel Demonstration of Single-camera Neural Network-based Object Detection

Will Lin from Intel's Automotive Business Unit demonstrates the company's latest embedded vision technologies and products at the January 2017 Consumer Electronics Show. Specifically, Lin demonstrates multi-class object detection for autonomous driving on live Chinese streets with busy traffic. The hardware and software integration showcases a complete ADAS object detection solution on an Intel FPGA platform


Intel Demonstration of a Multi-Camera Neural Network Acceleration Platform

Paulo Borges, Autonomous Driving Strategic Business Manager in the Programmable Solutions Group at Intel, demonstrates the company's latest embedded vision technologies and products at the January 2017 Consumer Electronics Show. Specifically, Borges demonstrates a hardware and software platform for showcasing the possibilities of current technology in autonomous driving, focusing on multi-camera visual understanding of the


Here you’ll find a wealth of practical technical insights and expert advice to help you bring AI and visual intelligence into your products without flying blind.

Contact

Address

Berkeley Design Technology, Inc.
PO Box #4446
Walnut Creek, CA 94596

Phone
+1 (925) 954-1411