TECHNOLOGIES


Using Xilinx FPGAs to Solve Endoscope System Architecture Challenges

By Jon Alexander, Technical Marketing Manager for ISM (Industrial, Scientific, Medical) Markets, Xilinx Corporation. Image enhancement functions – noise reduction, edge enhancement, dynamic range correction, digital zoom, scaling, etc. – are key elements of many embedded vision designs, improving the ability of downstream algorithms to automatically extract meaning from the image. Interface flexibility and […]
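As an illustrative sketch (not drawn from the article itself), two of the enhancement functions listed above – noise reduction via a box blur, and edge enhancement via unsharp masking – can be expressed in a few lines of NumPy. The function names and parameters here are hypothetical conveniences, not Xilinx APIs:

```python
import numpy as np

def box_blur(img, k=3):
    """Simple noise reduction: average each pixel over its k x k neighborhood."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")  # replicate borders so output size matches input
    out = np.zeros_like(img, dtype=np.float64)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def unsharp_mask(img, amount=1.0, k=3):
    """Edge enhancement: add back a scaled difference between the image and its blur."""
    blurred = box_blur(img, k)
    return img + amount * (img - blurred)

# Usage: denoise then sharpen a small synthetic grayscale image with a vertical edge
img = np.zeros((8, 8))
img[:, 4:] = 100.0
enhanced = unsharp_mask(box_blur(img), amount=1.5)
```

In a real pipeline these per-pixel, highly parallel kernels are exactly the sort of operations that map well onto FPGA fabric, which is the article's premise.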



Eight Considerations When Evaluating a Smart Camera

By Carlton Heard, Product Engineer – Vision Hardware and Software, National Instruments. With the increase in performance and decrease in cost, smart cameras have become increasingly accessible over the past decade. Given this trend, how do you determine which smart camera best meets your needs, or decide whether a smart camera is appropriate for […]


September 2012 Embedded Vision Summit Presentation: “Optimization and Acceleration for OpenCV-Based Embedded Vision Applications,” Bo Wu, Synopsys

Bo Wu, Technical Marketing Manager at Synopsys, presents the "Optimization and Acceleration for OpenCV-Based Embedded Vision Applications" tutorial within the "Using Tools, APIs and Design Techniques for Embedded Vision" technical session at the September 2012 Embedded Vision Summit.


“Exposing the Android Camera Stack,” a Technology Tutorial from Aptina Imaging

Reach beyond the lens with Balwinder Kaur and Joe Rickson of Aptina Imaging as they break down Android's camera stack for the San Francisco Android User Group on August 28, 2012, including what's new and changed in "Jelly Bean" (Android 4.1). Ms. Kaur covers camera APIs and their functions inside Android. Mr. Rickson then discusses […]


Hot Chips Symposium Presentation Underscores Analog Devices’ Commitment to Vision Applications

Attendees at the 2012 HOT CHIPS symposium recently learned about Analog Devices' new BF60x family of video and vision processors and their associated capabilities. Robert Bushey, principal architect and technologist, Embedded Systems Products and Technology Business Unit, presented on the BF60x family at the gathering, which is one of the semiconductor industry's leading conferences on […]


“Machine Learning,” a Presentation from UT Austin

Professor Kristen Grauman of the University of Texas at Austin presents the keynote on machine learning at the December 2012 Embedded Vision Alliance Member Summit. Grauman is a rising star in computer vision research. Among other distinctions, she was recently recognized with a Regents' Outstanding Teaching Award and, along with Devi Parikh, received the prestigious […]


Embedded Vision Alliance Conversation with Daniel Wilding of National Instruments

Jeff Bier, founder of the Embedded Vision Alliance, interviews Daniel Wilding, Digital Hardware Engineer at National Instruments. Jeff and Daniel discuss National Instruments' history, current status, and future plans in the embedded vision application space, the applicability and advantages of FPGAs as embedded vision processors, and National Instruments' development tools for simplifying and otherwise optimizing FPGA-based […]


December 2012 Embedded Vision Alliance Member Summit Technology Trends Presentation

Embedded Vision Alliance Editor-in-Chief (and BDTI Senior Analyst) Brian Dipert and BDTI Senior Software Engineer Eric Gregori co-deliver an embedded vision application technology trends presentation at the December 2012 Embedded Vision Alliance Member Summit. Brian and Eric discuss embedded vision opportunities in mobile electronics devices. They quantify the market sizes and trends for smartphones and […]



Post-Capture Selective Focus: A Video-Capable DSLR Lets You Put The Concept To The Test

Interested in trying out plenoptic light field camera technology, but don't have access to a Lytro camera (or, for that matter, a Toshiba sensor prototype)? The Chaos Collective has developed a free online tool that enables you to approximate the approach using a recent-model DSLR that's capable of capturing not only still images but also […]



Post-Capture Selective Focus: Toshiba’s Prototype Puts Lytro On Notice

Plenoptic camera technology, most commonly known nowadays by virtue of Lytro's ongoing promotion of the concept (and sales of the first-generation implementation), has received mainstream attention to date primarily because the light field-based approach allows for post-capture selective focus on particular depth regions of an image. However, embedded vision advocates are likely alternatively intrigued by […]


Here you’ll find a wealth of practical technical insights and expert advice to help you bring AI and visual intelligence into your products without flying blind.

Contact

Address

Berkeley Design Technology, Inc.
PO Box #4446
Walnut Creek, CA 94596

Phone
+1 (925) 954-1411