
Cameras and Sensors for Embedded Vision


While analog cameras are still used in many vision systems, this section focuses on digital image sensors, usually a CCD or CMOS sensor array operating in the visible spectrum. That definition shouldn't be taken too narrowly, however, since many vision systems also sense other forms of energy (infrared, sonar, etc.).

In many products, the camera housing has become the entire chassis of the vision system, leading to the emergence of "smart cameras" with all of the electronics integrated. By most definitions, a smart camera supports computer vision, since it is capable of extracting application-specific information from the images it captures. However, as both wired and wireless networks get faster and cheaper, there may still be good reasons to transmit pixel data to a central location for storage or additional processing.

A classic example is cloud computing using the camera on a smartphone. The smartphone itself could be considered a "smart camera," but sending data to a cloud-based computer can reduce the processing performance required on the mobile device, lowering its cost, power consumption and weight. For a dedicated smart camera, some vendors have created chips that integrate all of the required features.
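The sketch below is a minimal, illustrative take on this edge-versus-cloud trade-off: a frame is captured, a cheap local analysis decides whether the scene is interesting, and only then is the compressed image offloaded to a cloud service. It assumes OpenCV and the requests library are installed, that the camera is device 0, and that https://example.com/analyze is a placeholder endpoint, not a real service.

```python
# Minimal sketch of the edge-vs-cloud trade-off described above.
# Assumptions: OpenCV ("cv2") and "requests" are installed, the camera is
# device 0, and https://example.com/analyze is a hypothetical endpoint.
import cv2
import requests

UPLOAD_URL = "https://example.com/analyze"  # placeholder, not a real service

cap = cv2.VideoCapture(0)                   # open the local camera
ok, frame = cap.read()
cap.release()
if not ok:
    raise RuntimeError("could not read a frame from the camera")

# Cheap on-device analysis: count edge pixels as a crude "interest" score.
edges = cv2.Canny(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), 100, 200)
interest = (edges > 0).mean()

if interest < 0.02:
    # Scene looks static or empty: keep the result local, send nothing.
    print(f"processed on device, interest={interest:.3f}")
else:
    # Scene looks interesting: compress the frame and offload the heavier
    # analysis to a cloud service, saving compute on the device.
    ok, jpeg = cv2.imencode(".jpg", frame, [cv2.IMWRITE_JPEG_QUALITY, 80])
    resp = requests.post(UPLOAD_URL, data=jpeg.tobytes(),
                         headers={"Content-Type": "image/jpeg"}, timeout=10)
    print("cloud response:", resp.status_code)
```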

Cameras

Until recently, many people pictured a computer vision camera as a typical outdoor security camera. There are countless vendors supplying these products, and many more supplying indoor cameras for industrial applications. Don't forget simple USB cameras for PCs, and don't overlook the billion or so cameras embedded in the world's mobile phones. The speed and quality of these cameras have risen dramatically, with 10+ megapixel sensors paired with sophisticated image processing hardware.
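To show how accessible such cameras have become, here is a small sketch that opens a plain USB (UVC-class) webcam with OpenCV and negotiates a capture mode. The device index 0 and the requested 1280x720 at 30 fps are assumptions; the modes actually supported depend on the specific camera.

```python
# Sketch: open a USB (UVC-class) camera and inspect its capture parameters.
# Device index 0 and the 1280x720 / 30 fps request are assumptions.
import cv2

cap = cv2.VideoCapture(0)
if not cap.isOpened():
    raise RuntimeError("no camera found at index 0")

# Request a mode; the driver silently falls back if it is unsupported.
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1280)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 720)
cap.set(cv2.CAP_PROP_FPS, 30)

width = cap.get(cv2.CAP_PROP_FRAME_WIDTH)
height = cap.get(cv2.CAP_PROP_FRAME_HEIGHT)
fps = cap.get(cv2.CAP_PROP_FPS)
print(f"capturing at {int(width)}x{int(height)} @ {fps:.1f} fps")

ok, frame = cap.read()
print("got a frame of shape", frame.shape if ok else None)
cap.release()
```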

Consider, too, another important factor for cameras: the rapid adoption of 3D imaging using stereo optics, time-of-flight and structured-light technologies. Trendsetting cell phones now offer this technology, as do latest-generation game consoles. Compare these capabilities with the traditional outdoor security camera and consider how much the computer vision market is about to change as new camera technologies become pervasive.
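To make one of these 3D approaches concrete, here is a rough sketch of stereo depth estimation using OpenCV's block-matching stereo correspondence. The file names left.png and right.png are placeholders for a rectified stereo pair; a real system also needs camera calibration and rectification, which are omitted here.

```python
# Rough sketch of depth from a stereo pair using OpenCV's block matcher.
# "left.png" and "right.png" are placeholder names for a rectified pair.
import cv2

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)
if left is None or right is None:
    raise FileNotFoundError("stereo pair not found")

# numDisparities must be a multiple of 16; blockSize must be odd.
matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = matcher.compute(left, right)   # fixed-point disparity (scaled by 16)

# Larger disparity = closer object. With baseline B and focal length f
# (both from calibration), depth = f * B / disparity.
disp = disparity.astype("float32") / 16.0
print("disparity range:", disp.min(), "to", disp.max())
```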

Sensors

Charge-coupled device (CCD) sensors have some advantages over CMOS image sensors, mainly because the electronic shutter of CCDs traditionally offers better image quality, with higher dynamic range and resolution. However, CMOS sensors now account for more than 90% of the market, heavily influenced by camera phones and driven by the technology's lower cost, tighter integration and higher speed.
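Whichever sensor type is used, a color sensor typically delivers raw Bayer-mosaic data that must be demosaiced into an RGB image before further processing. The sketch below shows that step with OpenCV; the raw frame is simulated with random data, and the Bayer pattern constant (BG here) is an assumption that depends on the actual sensor.

```python
# Sketch: demosaic raw Bayer sensor output into a 3-channel image.
# The raw frame is faked with random data; a real pipeline would read it
# from the sensor, and the Bayer pattern (BG/GB/RG/GR) is sensor-specific.
import numpy as np
import cv2

raw = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)  # fake raw frame
rgb = cv2.cvtColor(raw, cv2.COLOR_BayerBG2BGR)  # demosaic to BGR
print("raw:", raw.shape, "-> demosaiced:", rgb.shape)
```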

Related Resources

"Are Neuromorphic Vision Technologies Ready for Commercial Use?," an Embedded Vision Summit expert panel discussion moderated by Sally Ward-Foxton of EE Times
"What is a UVC Camera, and What are the Different Types of UVC Cameras?," a blog post from e-con Systems
"Building Embedded Vision Products: Management Lessons From The School of Hard Knocks," a presentation by Phil Lapsley of the Edge AI and Vision Alliance
May 2022 Embedded Vision Summit Opening Remarks (May 18), Jeff Bier, Edge AI and Vision Alliance
May 2022 Embedded Vision Summit Opening Remarks (May 17), Jeff Bier, Edge AI and Vision Alliance
"Event-Based Neuromorphic Perception and Computation: The Future of Sensing and AI," a keynote presentation from Ryad Benosman
"The AI Semiconductor Market 1st Half 2022," a market research report from Woodside Capital Partners
"Deploying Visual AI on Edge Devices: Lessons From the Real World," a presentation from Luc Chouinard of Teledyne Imaging
