In the introductory essay to last Thursday's Embedded Vision Insights newsletter, while discussing Nokia's new cameraphone, I wrote:

The Nokia 808 contains a 41 Mpixel image sensor (no, that's not a typo), notable not only for its high resolution but also for its relatively relaxed 1.4 um pixel pitch, the latter translating into larger-than-otherwise silicon die size and cost, but also into better-than-otherwise low-light performance.

And after a few-sentence diversion into a digital zoom discussion, I continued:

The largest resolution still images that the Nokia 808 can capture are 38 Mpixels (7,152 x 5,368 pixels) in 4:3 aspect ratio mode, and 34 Mpixels (7,728 x 4,354 pixels) in 16:9 aspect ratio mode. Alternatively, the PureView algorithms combine multiple pixels' data together in creating lower-resolution 8 Mpixel, 5 Mpixel or 3 Mpixel photographs. The resultant oversampling not only improves the per-pixel light sensitivity, it also enhances image sharpness.
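The "combine multiple pixels' data" idea can be sketched numerically. The snippet below is a hypothetical illustration using simple 2x2 binning (the actual PureView oversampling pipeline is proprietary and more sophisticated); `bin_2x2` and the uniform test frame are my own constructs:

```python
# Hypothetical sketch: 2x2 pixel binning, a simple stand-in for Nokia's
# proprietary PureView oversampling. Summing each 2x2 block of pixels
# quadruples the signal, while uncorrelated noise grows only by
# sqrt(4) = 2, so per-pixel SNR in the binned image roughly doubles.
def bin_2x2(frame):
    """frame: list of rows of pixel values; returns a half-size binned frame."""
    h, w = len(frame), len(frame[0])
    return [[frame[y][x] + frame[y][x + 1] +
             frame[y + 1][x] + frame[y + 1][x + 1]
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

raw = [[1] * 4 for _ in range(4)]   # uniform 4x4 test frame, signal = 1/pixel
print(bin_2x2(raw))                 # [[4, 4], [4, 4]]
```

The same summing principle, applied at higher ratios, is how a 38 Mpixel capture can be condensed into an 8, 5 or 3 Mpixel output image.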

If it wasn't clear to you why I was discussing "low-light performance" and "per-pixel light sensitivity", the above video (with thanks to the Cult of Mac for the heads-up) may be of interest. Dylan A Bennett does a fairly thorough job of explaining how a pixel's photodiode can discern "light" even in the absence of visual-spectrum illumination, leading to injected noise that can negatively affect the appearance of captured images. I have only a few nits:

  • Bennett doesn't discuss the influence that ambient temperature can have on the amount of generated image sensor noise. This relationship is why, for example, astrophotography setups tend to (because of the long-duration exposure times) super-cool the image sensor(s), as a noise-suppression scheme.
  • Speaking of time, although this may already be obvious to some, longer-exposure image captures are more prone to noise degradation than short-duration ones are.
  • And speaking of exposures, Bennett doesn't go into detail on the relationship between pixel size and the potential for sensor noise to adversely impact image quality. Noise in and of itself isn't the problem, simplistically speaking; the signal-to-noise ratio is the important metric. Bright-light (therefore short-exposure) image capture environments are most likely to deliver a high SNR. Low-light environments (i.e. the well-known 'bar-interior photography' test) are conversely far more challenging. And as each pixel shrinks in size (the result of attempting to pack ever-higher pixel counts into a given-sized sliver of silicon), its light-capture capabilities diminish in spite of micro-lenses, backside illumination and other workarounds. This is why Nokia's lower-resolution PureView modes are intriguing. Although the sensor's per-pixel pitch is fairly aggressive (though not as much as it could be…many high-volume sensors now have even smaller 1.1 um dimensions), PureView combines the outputs of multiple adjacent image sensor pixels in order to boost the effective signal for each pixel of the final captured picture. And since SNR is a logarithmic (not linear) function, this additive-signal algorithm can notably improve the end result, even though it also combines multiple photodiodes' noise outputs.
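The additive-signal point can be made concrete with back-of-the-envelope arithmetic (my own illustrative numbers, not Nokia's): summing N pixels scales the signal by N, while uncorrelated noise adds in quadrature and grows only by sqrt(N), so SNR improves by sqrt(N), which is 10*log10(N) dB on the logarithmic scale (about 3 dB per doubling of combined pixels):

```python
import math

# Back-of-the-envelope SNR arithmetic (illustrative numbers, not Nokia's):
# summing n_pixels scales signal by n_pixels and uncorrelated noise by
# sqrt(n_pixels), so SNR improves by sqrt(n_pixels) = 10*log10(n_pixels) dB.
def combined_snr_db(signal_per_pixel, noise_per_pixel, n_pixels):
    signal = signal_per_pixel * n_pixels            # signals add linearly
    noise = noise_per_pixel * math.sqrt(n_pixels)   # noise adds in quadrature
    return 20 * math.log10(signal / noise)

single = combined_snr_db(100, 10, 1)   # 20.0 dB for one pixel
binned = combined_snr_db(100, 10, 5)   # combining ~5 pixels, roughly the
                                       # 38-to-8 Mpixel oversampling ratio
print(round(binned - single, 2))       # 6.99 dB gain = 10*log10(5)
```

A roughly 7 dB SNR gain from an approximately 5:1 pixel-combination ratio is why the oversampled 8 Mpixel output can look appreciably cleaner than the full-resolution capture in low light.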



