Wednesday, November 16, 2011

A Treatise On Digital Noise

Noise measurements garner a great deal of flak on message boards across the interwebs. Much of this is directed at whatever website is doing the measuring by another website that doesn't have as many readers.

A website that receives more than its fair share of this flak is DPReview, which measures per-pixel noise in its charts as opposed to overall image noise. This means that cameras with smaller photosites will usually be at a disadvantage compared to cameras with larger ones, and cameras with higher pixel counts will lag behind competitors with fewer, even if overall image noise is identical.
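
To make that distinction concrete, here is a toy sketch in Python (every number is made up for illustration): two simulated sensors with the same overall noise per unit of sensor area, where the higher-resolution one looks worse per pixel but matches the other once its output is downsampled to a common size, which is roughly what a whole-image normalization does.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sensors: same area, but the second has 4x the pixel count,
# so each of its (smaller) photosites is twice as noisy per pixel.
big_pixels = rng.normal(loc=0.5, scale=0.02, size=(1000, 1000))
small_pixels = rng.normal(loc=0.5, scale=0.04, size=(2000, 2000))

# Per-pixel noise, which is what DPReview's 100% crops show:
print(big_pixels.std(), small_pixels.std())        # ~0.02 vs ~0.04

# Normalize to a common output size by averaging 2x2 blocks, which is
# roughly what a whole-image measurement does -- the noise gap vanishes:
downsampled = small_pixels.reshape(1000, 2, 1000, 2).mean(axis=(1, 3))
print(downsampled.std())                           # ~0.02 again
```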

The only website of which I am aware that measures overall image noise is DxOMark, but this approach has practical problems as well. I'm not sure why, but their measurements very frequently do not jibe with my everyday experiences with cameras. For example, I'm a Micro 4/3 fan, and they list the Panasonic GH2 as inferior to the GH1, even though every photo I ever took confirmed that the GH2 was superior, or at the very least identical.

For many reasons, I fall on the side of DPReview. In much the same way that I argue pixel-peeping is actually a worthwhile endeavor, analyzing pixels as opposed to the whole image is important. The only reason that I will address today is low-light noise, which is frequently analyzed incorrectly when looking at an entire picture.

In a well-lit environment, where every pixel is exposed beyond the noise floor and thus each pixel holds true data, an overall image analysis is very accurate. But the instant that light levels fall and the environment gets more challenging, pixels will start to fall below the noise floor. One large pixel exposed to a poorly-lit environment is much less likely to fall below the noise floor than a smaller pixel. So even if you average four small pixels together, once all four have fallen below their noise floor, the result contains no actual data. I use images from DPReview below to illustrate that their method is correct.
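
Here is a toy simulation of that argument in Python. The signal level, noise floor, and the hard-clip readout model are all assumptions for illustration, not real sensor behavior:

```python
import numpy as np

rng = np.random.default_rng(42)
trials = 100_000
signal = 0.6   # light reaching one small pixel, arbitrary units
floor = 1.5    # hypothetical readout floor: anything below records as zero

# Four small pixels cover the same area as one large pixel, so the large
# pixel collects 4x the light; read noise per pixel is assumed equal.
large = 4 * signal + rng.normal(0, 1, trials)
small = signal + rng.normal(0, 1, (trials, 4))

large_ok = large > floor                 # the large pixel recorded real data
small_ok = (small > floor).any(axis=1)   # at least one small pixel did

print(f"large pixel above the floor: {large_ok.mean():.0%}")     # ~82%
print(f"any small pixel above the floor: {small_ok.mean():.0%}")  # ~56%
# In the other ~44% of trials, all four small pixels clipped to zero --
# and no amount of averaging recovers data that was never recorded.
```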

[Images: studio comparison crops from the Sony SLT-A77, SLT-A55, and NEX-7]

In the above images, we have the Sony SLT-A77, SLT-A55, and NEX-7. According to DxOMark, and a recent article at an A77-defending/DPReview-attacking website, the A55 and the A77 should be identical, and any difference is the fault of DPReview's analysis protocol. Look at the 100% patch of darkness at the top. The A77 is much noisier than either the A55 or the NEX-7. "Ohhhhhh", they will say, "that is at 100%! We must analyze the whole image!"

Ok. Fair enough. Again, DxOMark and this website say the A55 should be almost identical to the A77. Then why, even at the small size I am using for the actual article layout, is the A77 noticeably noisier? Look at the blue cast, the reduced reds and greens, and the overall loss of contrast. Click on the image to see the large version and the difference is even more obvious. DxOMark shows no difference. DPReview does. In the actual end-result image, there is a difference. Not a small, kinda'-sorta'-there difference, either. An obvious one, even at low resolution.

This impression is only strengthened by an appeal to DPR's Studio Comparison tool. At every ISO setting, even when controlling for variances in exposure, the A55 pulls more detail out of the dark areas. That means DPReview is right and DxOMark is wrong, as is the website attacking DPReview.

It is in this environment that ISO becomes critical to good images, and it is in this environment where overall image analysis becomes less important than a quick check at pixel-level performance. Because why bother with an overall analysis when one can quickly analyze a small group of poorly-exposed pixels and immediately be aware of the sensor's characteristics?
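
That quick check is easy to do yourself. A minimal sketch, assuming you have a test shot saved locally; the file name and crop coordinates below are hypothetical:

```python
import numpy as np
from PIL import Image

# Crop the black patch from a test-chart shot and measure its statistics.
frame = np.asarray(Image.open("iso3200_test_shot.png").convert("L"), float)
dark = frame[100:164, 200:264]   # 64x64 crop covering the dark patch

print(f"mean level: {dark.mean():.1f}")            # how far it sits above black
print(f"per-pixel noise (std): {dark.std():.1f}")  # what a 100% crop reveals
```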

DPReview is the lord of camera review sites for a reason. They get shit right. They may not be as scathing in some of their reviews as they should be, but the raw materials to draw your own conclusions are uploaded at high resolution, without stupid crops and page-after-page of self-congratulatory talk about art and photography. That is the reason why they are far and away the #1 photography website on the planet. They have no pretenses. They review cameras. That's all.

All of that said, I think that since cameras are tools, a website should start reviewing them based on tasks completed. For example, to determine the quality of a sensor, a color chart should be placed on a wall under a very dim, white light. Then, don't just publish the images; publish the settings that were required to successfully image that chart such that all of the colors were represented correctly. That lets the photographer know how extreme an environment the tool can handle while still functioning successfully.
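
As a sketch of how such a pass/fail test might be scored (the patch values and tolerance below are hypothetical, and a real test would use a perceptual color-difference metric like CIEDE2000 rather than plain RGB distance):

```python
import numpy as np

def chart_passes(measured, reference, tolerance=10.0):
    """True if every patch is within `tolerance` of its reference color,
    using simple Euclidean distance in 8-bit sRGB."""
    errors = np.linalg.norm(measured - reference, axis=1)
    return bool((errors < tolerance).all()), float(errors.max())

# Two sample patches (made-up values). A review would sweep ISO, shutter,
# and aperture, then publish the most extreme settings that still pass,
# e.g. "renders the chart correctly up to ISO 3200 at 1/60s, f/2.8".
measured = np.array([[118.0, 82.0, 69.0], [196.0, 148.0, 130.0]])
reference = np.array([[115.0, 82.0, 68.0], [194.0, 150.0, 130.0]])
print(chart_passes(measured, reference))   # (True, ~3.2)
```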

Oh, and I should add that I do not dislike DxOMark. In fact, it is my second favorite photography website behind DPReview. Its data should be taken as part of a gestalt of data from it and other websites, but very few websites are as thorough, extensive, and expansive as DxO. Most of the time, my personal experiences jibe perfectly well with DxO, but compare that to DPReview, where my personal experiences essentially always jibe with their work.
