
Android Camera Quality: objectively assessing those happy snaps

August 16, 2012

Benjamin Tseng

It could be argued that the camera is one of the most used features of a smartphone, and thus one of the most important. It’s no wonder, too – the inclusion of a small, convenient camera subsystem in the everyday phone has created more opportunities for capturing those quick snaps that wind up on Facebook, your wallpaper, or in the family album. Wherever a photo ends up, though, no one wants a blurry shot of Sasquatch, or a moment missed entirely (some of the devices we’ve seen are so slow they’d only capture an empty track during the 100m sprint!).

While imaging sensors and lenses in smartphones have come a long way since the days of non-autofocus VGA shooters, even the best snappers struggle to match the performance of a basic pocket point-and-shoot camera. At Apkudo, via our Apkudo Device Analytics product, we deeply assess the user experience of pre-launch Android devices for most OEMs worldwide (we have over 6 pre-launch devices in our lab just this week!). It’s a lot of fun, and interestingly, by a reasonable margin, the most common user experience issues we discover are camera related – ranging from slow shutter response, to distortion effects, to poor color accuracy, and beyond.

One reason for this massive variation in camera quality is that phone cameras are marketed almost exclusively on their megapixel rating – itself a flawed metric. Only recently have some OEMs tried to improve on this by touting zero shutter lag on their ICS devices, particularly as it is a highlight feature of ICS.

However, this is only a small aspect of the user experience relating to camera response and image quality. Many manufacturers advertise that their camera is really awesome, or that it excels in low light, or captures images with vivid and stunning colors. Unfortunately, there is no metric to gauge these claims by, or to compare one device with another. Is my camera that takes “Lifelike images with stunning realism” better than the one that delivers “incredible color and vividness”, or that other one which tells me it has “superb camera quality”?

It comes as no surprise that these claims mean little when it comes time for a user to decide which phone camera will best satisfy their needs.

So how can this situation be improved? Standards! In particular, standards whereby the various metrics of a phone camera can be objectively compared, just as they are for professional DSLRs.

And that is why I’m currently 36,000ft in the air, writing this blog on the way to the first meeting of IEEE P1858™ – the Draft Standard for Camera Phone Image Quality (CPIQ) working group. Apkudo has joined specifically to take part in setting a standard for camera phone image quality.

And what can Apkudo offer in this area? Simple. We have assessed every Android device that’s ever been released. The amount of data we’ve collected is immense – thousands of data points on metrics such as shutter lag, shot-to-shot latency, sharpness across the image, color accuracy and saturation, vignetting, time to first shot, color temperature – the list goes on. We’re currently using that data to assist OEMs and Operators to improve their pre-launch devices, and we intend on now using it to help the ecosystem at large.

Here’s a very small sampling of that data to whet your appetite:

Sharpness Graph

Here we’re looking at sharpness at several points across a sample image. MTF50 correlates with perceived image sharpness, while overshoot corresponds to post-process sharpening techniques that increase perceived contrast at edges. Obviously, an overly low MTF50 results in blurry images. What MTF50 doesn’t show, however, is artificially introduced sharpness – and this is where the overshoot measure helps. The HTC Desire VC has good sharpness by the MTF50 measure at first glance, but take a closer look and its oversharpening is through the roof. This leads to a telltale ‘halo’ effect affecting all edges.
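To give a feel for how these two numbers relate, here’s a simplified, stdlib-only sketch of the slanted-edge idea behind MTF50 and overshoot. It is not our actual analysis pipeline (which uses Imatest), and the edge profile in the test is synthetic – but it shows the mechanics: differentiate the edge profile, take the Fourier transform to get the MTF, find where it crosses 0.5, and measure how far the edge rises above its settled level.

```python
import cmath
import math

def mtf50_and_overshoot(esf):
    """Estimate MTF50 (cycles/pixel) and edge overshoot (%) from a 1-D
    edge-spread function (ESF) sampled across an edge target.
    A simplified sketch of the slanted-edge method, not ISO-accurate."""
    # Line-spread function: discrete derivative of the edge profile.
    lsf = [esf[i + 1] - esf[i] for i in range(len(esf) - 1)]
    n = len(lsf)
    # DFT magnitude of the LSF gives the (unnormalized) MTF.
    mtf = []
    for k in range(n // 2 + 1):
        s = sum(lsf[j] * cmath.exp(-2j * math.pi * k * j / n)
                for j in range(n))
        mtf.append(abs(s))
    # Normalize so MTF at zero frequency is 1.
    mtf = [m / mtf[0] for m in mtf]
    # First frequency where MTF drops below 0.5, linearly interpolated
    # between bins; frequency expressed in cycles/pixel (k / n).
    mtf50 = None
    for k in range(1, len(mtf)):
        if mtf[k] < 0.5:
            frac = (mtf[k - 1] - 0.5) / (mtf[k - 1] - mtf[k])
            mtf50 = (k - 1 + frac) / n
            break
    # Overshoot: how far the profile rises above its settled bright
    # level -- the signature of aggressive post-capture sharpening.
    settled = esf[-1]
    overshoot_pct = 100.0 * (max(esf) - settled) / settled
    return mtf50, overshoot_pct
```

An edge that peaks at 110 before settling at 100 reports 10% overshoot – exactly the kind of profile that produces the halo effect shown below.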

HTC Desire VC with halo effect.

Samsung Galaxy Nexus.

On the other hand, the Galaxy Nexus seems to have a lower perceived sharpness. However, it has very little post processing applied, so it’s doing pretty well without much assistance.

A few more metrics:

Shutter lag, color, and Imatest results.

Here we’re looking at some camera delay tests, camera saturation accuracy and distortion, and Imatest results. It’s fairly interesting that the One S fires up its camera app about twice as quickly as the other devices, including other ICS devices and the Galaxy Nexus rocking some Android Jelly Bean action.

Shot-to-shot is always an interesting one, and there seem to be two clear groups of performers: those that take more than a second per image, and those that take less than half a second. Granted, some of these devices have a burst mode that speeds things up a little, but we reckon that if you’re in a hurry to take a few quick pics in succession, you’re not going to have time to find that burst mode anyway.
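The reduction from raw measurements to that two-group picture is simple. A minimal sketch, using hypothetical capture timestamps rather than our real data, might look like this:

```python
def shot_to_shot(timestamps_ms):
    """Mean shot-to-shot interval (seconds) from a burst of
    capture-complete timestamps in milliseconds."""
    gaps = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    return sum(gaps) / len(gaps) / 1000.0

def classify(mean_s):
    # The two clusters described above: >1 s/shot vs <0.5 s/shot.
    if mean_s > 1.0:
        return "slow (>1 s/shot)"
    if mean_s < 0.5:
        return "fast (<0.5 s/shot)"
    return "in between"

# Illustrative timestamp logs only -- not measurements from any
# of the devices discussed in this post.
devices = {
    "device_a": [0, 1400, 2850, 4300],
    "device_b": [0, 380, 760, 1150],
}
for name, ts in devices.items():
    mean_s = shot_to_shot(ts)
    print(f"{name}: {mean_s:.2f} s/shot -> {classify(mean_s)}")
```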

We found the shutter lag to be pretty good on these phones, given that most of them are at least Android 4.0 (and so featuring that zero shutter lag perk). Interestingly though, the Motorola Droid 4 we analyzed is still running Gingerbread, yet keeps up with its younger brethren.

Below, we can see how each device handles color reproduction, saturation and, even just to the naked eye, exposure. Some phones have a tendency to overexpose, whereas some tend to underexpose. Others, like the Samsung SGS3, seem to get this just right. Color temperature is also visible to the naked eye in the bottom row of grey patches. Of course this is all going to be dependent on the viewing monitor, but assuming it’s calibrated, the Ascend D1 and Desire VC both show a fairly warm tint, whereas the Droid 4 is hitting the other end of the scale with cool temperatures. Some of the complex analysis is handled by the excellent software produced by Imatest, which provides accurate and reproducible data that is invaluable to the classification and assessment of a phone camera.
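Color accuracy itself is usually scored as a ΔE distance between a captured patch and the chart reference, computed in the CIE L*a*b* space. Here’s a stdlib-only sketch of the classic CIE76 version (tools like Imatest use more refined formulas such as ΔE2000); the patch values at the bottom are illustrative, not measured from any of these phones:

```python
import math

def srgb_to_lab(r, g, b):
    """Convert 8-bit sRGB to CIE L*a*b* under a D65 white point."""
    def linearize(c):
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    rl, gl, bl = linearize(r), linearize(g), linearize(b)
    # Linear RGB -> XYZ (standard sRGB matrix, D65).
    x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl
    # XYZ -> Lab, normalized to the D65 white point.
    def f(t):
        return t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16 / 116
    fx, fy, fz = f(x / 0.95047), f(y / 1.0), f(z / 1.08883)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

def delta_e76(c1, c2):
    """CIE76 color difference between two sRGB colors."""
    l1, a1, b1 = srgb_to_lab(*c1)
    l2, a2, b2 = srgb_to_lab(*c2)
    return math.hypot(l1 - l2, math.hypot(a1 - a2, b1 - b2))

# How far a captured patch drifts from its chart reference
# (hypothetical values for a red ColorChecker-style patch).
reference = (180, 60, 50)
captured = (200, 55, 40)   # a warmer, punchier rendering
print(f"dE76 = {delta_e76(reference, captured):.1f}")
```

A warm or cool cast shows up the same way: a perfectly neutral grey patch has a* and b* near zero, so a tint pushes ΔE up across the whole bottom row.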

Samsung Galaxy S3

HTC One S

Samsung/Google Galaxy Nexus

Huawei Ascend D1

HTC Desire VC

HTC One X

Motorola Droid 4

The results we’ve presented just begin to scratch the surface of the camera analysis we do here at Apkudo. Analyzing so many devices for OEMs and Operators affords us the opportunity to provide lots of insight and actionable recommendations as to why a camera might perform poorly and what can be done to fix it. When we saw that the IEEE would be putting together a working group to create a standard for methods and metrics to judge camera phone image quality, we knew we’d have a lot to offer. We are a company whose mission is to optimize the Android user experience – not just for OEMs, Operators, and developers – but for everyone. We believe we can help lead the charge in creating this industry standard to reach the end goal of a better user experience. For everyone.


Benjamin Tseng | Director of Device Analytics
Apkudo Inc.
+1 443 453 3172
+61 422 387 773