Enabling Technologies

“MediaTek’s Approach for Edge Intelligence,” a Presentation from MediaTek

Bing Yu, Senior Technical Manager and Architect at MediaTek, presents the “MediaTek’s Approach for Edge Intelligence” tutorial at the May 2019 Embedded Vision Summit. MediaTek has incorporated an AI processing unit (APU) alongside the traditional CPU and GPU in its SoC designs for the next wave of smart client devices (smartphones, cameras, appliances, cars, etc.). […]


“Memory-centric Hardware Acceleration for Machine Intelligence,” a Presentation from Crossbar

Sylvain Dubois, Vice President of Business Development and Marketing at Crossbar, presents the “Memory-centric Hardware Acceleration for Machine Intelligence” tutorial at the May 2019 Embedded Vision Summit. Even the most advanced AI chip architectures suffer from performance and energy efficiency limitations caused by the memory bottleneck between computing cores and data. Most state-of-the-art CPUs, GPUs,
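The memory bottleneck the talk refers to can be made concrete with a back-of-the-envelope roofline check. The sketch below is illustrative only (the peak-compute and bandwidth figures are assumptions, not Crossbar numbers): a kernel whose arithmetic intensity is low is limited by memory bandwidth, not by the compute cores.

```python
# Roofline-style check: is a kernel compute-bound or memory-bound
# on a hypothetical accelerator? All hardware numbers are assumed.

PEAK_FLOPS = 4e12        # 4 TFLOP/s peak compute (assumed)
PEAK_BW = 25e9           # 25 GB/s DRAM bandwidth (assumed)

def attainable_flops(flops, bytes_moved):
    """Attainable throughput for a kernel with the given operation
    count and DRAM traffic, per the roofline model."""
    intensity = flops / bytes_moved          # FLOPs per byte
    return min(PEAK_FLOPS, intensity * PEAK_BW)

# Fully connected layer, batch 1: 1024x1024 int8 weights give
# ~2M ops but ~1 MB of weight traffic -> intensity of 2 FLOPs/byte.
flops = 2 * 1024 * 1024
bytes_moved = 1024 * 1024
print(attainable_flops(flops, bytes_moved) / PEAK_FLOPS)  # tiny fraction of peak
```

At 2 FLOPs/byte the kernel reaches only 50 GFLOP/s of the assumed 4 TFLOP/s peak, which is the gap memory-centric architectures aim to close by moving compute closer to the data.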


“Snapdragon Hybrid Computer Vision/Deep Learning Architecture for Imaging Applications,” a Presentation from Qualcomm

Robert Lay, Computer Vision and Camera Product Manager at Qualcomm, presents the “Snapdragon Hybrid Computer Vision/Deep Learning Architecture for Imaging Applications” tutorial at the May 2019 Embedded Vision Summit. Advances in imaging quality and features are accelerating, thanks to hybrid approaches that combine classical computer vision and deep learning algorithms and that take advantage of


“Dynamically Reconfigurable Processor Technology for Vision Processing,” a Presentation from Renesas

Yoshio Sato, Senior Product Marketing Manager in the Industrial Business Unit at Renesas, presents the “Dynamically Reconfigurable Processor Technology for Vision Processing” tutorial at the May 2019 Embedded Vision Summit. The Dynamically Reconfigurable Processing (DRP) block in the Arm Cortex-A9 based RZ/A2M MPU accelerates image processing algorithms with spatially pipelined, time-multiplexed, reconfigurable-hardware compute resources.


“Pioneering Analog Compute for Edge AI to Overcome the End of Digital Scaling,” a Presentation from Mythic

Mike Henry, CEO and Founder of Mythic, presents the “Pioneering Analog Compute for Edge AI to Overcome the End of Digital Scaling” tutorial at the May 2019 Embedded Vision Summit. AI inference at the edge will continue to create insatiable demand for compute performance in power- and cost-constrained form factors. Taking into account past trends,


“The Xilinx AI Engine: High Performance with Future-proof Architecture Adaptability,” a Presentation from Xilinx

Nick Ni, Director of Product Marketing at Xilinx, presents the “Xilinx AI Engine: High Performance with Future-proof Architecture Adaptability” tutorial at the May 2019 Embedded Vision Summit. AI inference demands orders of magnitude more compute capacity than today’s SoCs offer. At the same time, neural network topologies are changing too quickly to be addressed by


“Efficient Deployment of Quantized ML Models at the Edge Using Snapdragon SoCs,” a Presentation from Qualcomm

Felix Baum, Director of Product Management for AI Software at Qualcomm, presents the “Efficient Deployment of Quantized ML Models at the Edge Using Snapdragon SoCs” tutorial at the May 2019 Embedded Vision Summit. Increasingly, machine learning models are being deployed at the edge, and these models are getting bigger. As a result, we are hitting
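The core idea behind quantized deployment can be shown with a generic asymmetric int8 quantization sketch. This is a textbook illustration, not Qualcomm's Snapdragon tooling or API: weights are mapped to 8-bit integers plus a scale and zero point, shrinking the model roughly 4x versus float32 at a small, bounded accuracy cost.

```python
import numpy as np

def quantize_int8(x):
    """Asymmetric per-tensor quantization of a float tensor to int8,
    returning (q, scale, zero_point) such that x ~= scale * (q - zp)."""
    lo, hi = float(x.min()), float(x.max())
    lo, hi = min(lo, 0.0), max(hi, 0.0)       # keep 0.0 exactly representable
    scale = (hi - lo) / 255.0 or 1.0
    zp = round(-128 - lo / scale)             # integer zero point
    q = np.clip(np.round(x / scale) + zp, -128, 127).astype(np.int8)
    return q, scale, zp

def dequantize(q, scale, zp):
    return scale * (q.astype(np.float32) - zp)

w = np.random.randn(64, 64).astype(np.float32)
q, s, zp = quantize_int8(w)
err = np.abs(dequantize(q, s, zp) - w).max()   # bounded by ~one scale step
```

Real toolchains add per-channel scales, calibration data, and quantization-aware training, but the scale/zero-point mapping above is the common foundation.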


“Using Blockchain to Create Trusted Embedded Vision Systems,” a Presentation from Basler

Thies Möller, Technical Architect at Basler, presents the “Using Blockchain to Create Trusted Embedded Vision Systems” tutorial at the May 2019 Embedded Vision Summit. In many IoT architectures, sensor data must be passed to cloud services for further processing. Traditionally, “trusted third parties” have been used to secure this data. In this talk, Möller explores
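The trust property blockchains provide can be sketched with a minimal hash chain over sensor records. This is a generic illustration under assumed record names, not Basler's design: each entry commits to the previous entry's SHA-256 digest, so tampering with any record invalidates every hash after it without needing a trusted third party to vouch for the data.

```python
import hashlib, json

def chain_records(records):
    """Link sensor records into a tamper-evident hash chain."""
    chain, prev = [], "0" * 64
    for rec in records:
        entry = {"data": rec, "prev": prev}
        prev = hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()
        entry["hash"] = prev
        chain.append(entry)
    return chain

def verify(chain):
    """Recompute every hash; any altered record breaks the chain."""
    prev = "0" * 64
    for entry in chain:
        body = {"data": entry["data"], "prev": entry["prev"]}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if entry["prev"] != prev or digest != entry["hash"]:
            return False
        prev = entry["hash"]
    return True

c = chain_records([{"frame": 1, "mean": 0.41}, {"frame": 2, "mean": 0.43}])
assert verify(c)
c[0]["data"]["mean"] = 0.99      # tamper with the first record
assert not verify(c)             # chain no longer verifies
```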


“Tools and Techniques for Optimizing DNNs on Arm-based Processors with Au-Zone’s DeepView ML Toolkit,” a Presentation from Au-Zone Technologies

Sébastien Taylor, Vision Technology Architect at Au-Zone Technologies, presents the “Tools and Techniques for Optimizing DNNs on Arm-based Processors with Au-Zone’s DeepView ML Toolkit” tutorial at the May 2019 Embedded Vision Summit. In this presentation, Taylor describes methods and tools for developing, profiling and optimizing neural network solutions for deployment on Arm MCUs, CPUs and


“REAL3 Time of Flight: A New Differentiator for Mobile Phones,” a Presentation from Infineon Technologies

Walter Bell, 3D Imaging Application Engineer at Infineon Technologies, presents the “REAL3 Time of Flight: A New Differentiator for Mobile Phones” tutorial at the May 2019 Embedded Vision Summit. In 2019, 3D imaging has become mainstream in mobile phone cameras. What started in 2016 with the first two smartphones using an Infineon 3D time-of-flight (ToF)
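The physics behind continuous-wave ToF depth sensing is compact enough to sketch. The snippet below is a generic illustration of the standard phase-shift relationship, not Infineon's REAL3 pipeline: the sensor measures the phase offset between emitted and reflected modulated light, and distance follows as d = c·φ / (4π·f_mod).

```python
import math

C = 299_792_458.0            # speed of light, m/s

def distance_from_phase(phase_rad, f_mod_hz):
    """Distance implied by a measured phase shift at the given
    modulation frequency (continuous-wave ToF)."""
    return C * phase_rad / (4 * math.pi * f_mod_hz)

def unambiguous_range(f_mod_hz):
    """Maximum distance before the phase wraps past 2*pi."""
    return C / (2 * f_mod_hz)

# 80 MHz modulation: a phase shift of pi radians implies ~0.94 m,
# with phase wrapping beyond ~1.87 m.
d = distance_from_phase(math.pi, 80e6)
```

The unambiguous-range trade-off is why practical sensors combine multiple modulation frequencies: higher frequency gives finer depth resolution but a shorter wrap-around distance.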



Contact

Address

Berkeley Design Technology, Inc.
PO Box #4446
Walnut Creek, CA 94596

Phone
+1 (925) 954-1411