
Apkudo Joins the IEEE Standards Association Camera Phone Image Quality Working Group as Corporate Member

August 22, 2012

Apkudo

New IEEE Standard Project Designed to Help Consumers Judge Camera Phone Image Quality When Making Purchasing Decisions

August 22, 2012 – Baltimore – Apkudo Inc., an Android innovation start-up, today announced it has joined the IEEE Standards Association (IEEE-SA) as a Corporate Member to help drive the development of standards that evolve camera phone image quality. The new IEEE P1858™ – Draft Standard for Camera Phone Image Quality (CPIQ) – development project aims to specify methods and metrics for analyzing a mobile device’s image quality. A leader in Android device analysis through its Apkudo Device Analytics product, Apkudo helps mobile device manufacturers and network operators worldwide assess every aspect of the user experience of Android devices – including the camera – well before launch, providing an opportunity to optimize performance, better target the device, and improve customer satisfaction.

“We have analyzed the camera performance of every Android device that’s ever been released. The amount of data we’ve collected is immense – thousands of data points on metrics such as shutter lag, shot-to-shot latency, time to first shot, sharpness, color accuracy and saturation, vignetting, color temperature – the list goes on,” said Josh Matthews, Apkudo CEO and co-founder. “We’re currently using that data to assist OEMs and Operators in improving their pre-launch devices, and we now intend to use it for the good of the ecosystem at large by helping the IEEE P1858™ Working Group set a standard for camera phone image quality.”

The IEEE P1858™ standard will address the fundamental attributes that contribute to image quality, as well as identify existing metrics and other useful information relating to these attributes. It will define a standardized suite of objective and subjective test methods for measuring camera phone image quality attributes, and it will specify tools and test methods to facilitate standards-based communication and comparison among carriers, handset manufacturers, and component vendors regarding camera phone image quality.

“We are excited to welcome Apkudo as an IEEE-SA Corporate Member,” said Mary Lynne Nielsen, director, corporate programs, IEEE-SA. “Their expertise in analysis of the mobile sector will benefit the IEEE P1858 Working Group’s efforts in driving the CPIQ standard to fruition, and their practical experience will help stimulate widespread market adoption.” Apkudo joins a diverse and dynamic group of IEEE Standards Association corporate members that includes global platforms Google and Microsoft; telecom operator AT&T; graphics card developer NVIDIA; image processing companies Imatest and Omnivision Technologies; and semiconductor manufacturers Qualcomm, STMicroelectronics, Analog Devices Inc., Intel, and AMD. The IEEE P1858 working group convened for the first time August 16-17, 2012 in Seattle.

About Apkudo Inc.
Apkudo is an Android innovation company that optimizes user experience and performance across the industry’s most comprehensive portfolio of Android devices and applications. Apkudo’s mission is to help the Android ecosystem overcome the major challenge inherent in the open nature of the Android platform: fragmentation.

Apkudo has offices in Sydney, Baltimore, Chicago, San Diego, and London. The Apkudo team brings deep research expertise with the Android operating system – including optimization, usability, and performance testing – and is focused on helping the ecosystem deliver the best possible user experience to Android device owners. Learn more at http://www.apkudo.com or email sayhi@apkudo.com for additional information.

About the IEEE Standards Association
The IEEE Standards Association, a globally recognized standards-setting body within IEEE, develops consensus standards through an open process that engages industry and brings together a broad stakeholder community. IEEE standards set specifications and best practices based on current scientific and technological knowledge. The IEEE-SA has a portfolio of over 900 active standards and more than 500 standards under development. For more information visit the IEEE-SA website.

Android Camera Quality: objectively assessing those happy snaps

August 16, 2012

Benjamin Tseng

It can be argued that the camera is one of the most used features of a smartphone, and thus one of the most important. It’s no wonder, too – the inclusion of a small and convenient camera subsystem in the everyday phone has created more opportunities for capturing those quick snaps that wind up on Facebook, your wallpaper, or in the family album. Wherever it ends up, though, no one wants a blurry photo of Sasquatch, or a shot that was missed entirely (some of the devices we’ve seen are so slow, they’d only capture an empty track during the 100m sprint!).

While imaging sensors and lenses in smartphones have come a long way since the days of non-autofocus VGA shooters, even the best snappers struggle to match the performance of a basic pocket point-and-shoot camera. At Apkudo, via our Apkudo Device Analytics product, we deeply assess the user experience of pre-launch Android devices for most OEMs worldwide (we have over six pre-launch devices in our lab just this week!). It’s a lot of fun, and interestingly, by a reasonable margin, the most common user experience issues we discover are camera related – ranging from slow shutter response, to distortion effects, to poor color accuracy, and beyond.

One reason for this massive variation in camera quality is that phone cameras are really marketed only by their megapixel rating – a flawed metric in itself. Only recently have some OEMs tried to improve on this by touting zero shutter lag on their ICS (Ice Cream Sandwich, Android 4.0) devices, particularly as it is a highlight feature of ICS.

However, this is only a small aspect of the user experience relating to camera response and image quality. Many manufacturers advertise that their camera is really awesome, or excellent in low light, or captures images with vivid and stunning colors. Unfortunately, there is no metric to gauge these claims by, or to compare devices with one another. Is my camera that takes “Lifelike images with stunning realism” better than the one that delivers “incredible color and vividness”, or the other one that tells me it has “superb camera quality”?

It comes as no surprise that these claims mean little when it comes time for a user to decide which phone camera will best satisfy their needs.

So how can this situation be improved? Standards! In particular, standards whereby various phone camera metrics can be objectively compared, just as they are for professional DSLRs.

And that is why I’m currently 36,000ft in the air, writing this blog on the way to the first meeting of IEEE P1858™ – the Draft Standard for Camera Phone Image Quality (CPIQ) working group. Apkudo has joined specifically to take part in setting a standard for camera phone image quality.

And what can Apkudo offer in this area? Simple. We have assessed every Android device that’s ever been released. The amount of data we’ve collected is immense – thousands of data points on metrics such as shutter lag, shot-to-shot latency, sharpness across the image, color accuracy and saturation, vignetting, time to first shot, color temperature – the list goes on. We’re currently using that data to assist OEMs and Operators in improving their pre-launch devices, and we now intend to use it to help the ecosystem at large.

Here’s a very small sampling of that data to whet your appetite:

Sharpness Graph

Here we’re looking at sharpness measurements at several points across a sample image. MTF50 correlates with perceived image sharpness, while overshoot corresponds well with post-process sharpening techniques that increase perceived contrast at edges. Obviously, an overly low MTF50 will result in blurry images. What MTF50 doesn’t show, however, is artificially introduced sharpness. This is where the overshoot measure helps. The HTC Desire VC has good sharpness by the MTF50 measure at first glance, but take a closer look and its oversharpening is through the roof. This leads to a common ‘halo’ effect affecting all edges.

HTC Desire VC with halo effect.

Samsung Galaxy Nexus.

On the other hand, the Galaxy Nexus seems to have a lower perceived sharpness. However, it has very little post processing applied, so it’s doing pretty well without much assistance.
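For readers who want to see what’s under the hood, here’s a minimal sketch of how MTF50 and overshoot can be derived from a one-dimensional edge profile. This is an illustrative simplification in Python/NumPy – not the production method we (or Imatest) use – and the function name and the simple quarter-window plateau estimate are our own:

```python
import numpy as np

def mtf50_and_overshoot(edge_profile, sample_pitch=1.0):
    """Estimate MTF50 (cycles/pixel) and percent overshoot from a 1-D
    dark-to-light edge intensity profile (the edge spread function)."""
    esf = np.asarray(edge_profile, dtype=float)

    # Line spread function: derivative of the edge spread function.
    lsf = np.gradient(esf, sample_pitch)

    # MTF: normalized magnitude of the Fourier transform of the LSF.
    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]
    freqs = np.fft.rfftfreq(len(lsf), d=sample_pitch)

    # MTF50: first frequency where the MTF drops below 0.5,
    # linearly interpolated between the two bracketing samples.
    below = np.nonzero(mtf < 0.5)[0]
    if len(below) == 0:
        mtf50 = freqs[-1]  # never drops below 0.5 in the measured band
    else:
        i = below[0]
        f0, f1 = freqs[i - 1], freqs[i]
        m0, m1 = mtf[i - 1], mtf[i]
        mtf50 = f0 + (0.5 - m0) * (f1 - f0) / (m1 - m0)

    # Overshoot: how far the bright side peaks above its settled level,
    # relative to the edge height -- a rough proxy for sharpening halos.
    dark = esf[: len(esf) // 4].mean()     # plateau before the edge
    light = esf[-(len(esf) // 4):].mean()  # plateau after the edge
    overshoot = 100.0 * (esf.max() - light) / (light - dark)

    return mtf50, overshoot
```

A heavily sharpened edge will spike above its bright plateau, pushing the overshoot number up even while MTF50 looks healthy – exactly the Desire VC pattern above.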

A few more metrics:

Shutter lag, color, and Imatest results.

Here we’re looking at some camera delay tests, some camera saturation accuracy and distortion, and Imatest results. It’s fairly interesting that the One S is just about twice as quick at firing up the camera app as the other devices, including other ICS devices and the Galaxy Nexus rocking some Android Jelly Bean action.

Shot-to-shot is always an interesting one, and there seem to be two clear groups of performers: ones that take more than a second per image, and ones that take less than half a second. Granted, some of these devices have a burst mode that speeds things up a little, but we reckon that if you’re in a hurry to take a few quick pics in succession, you’re not going to have time to find that burst mode anyway.

We found the shutter lag to be pretty good on these phones, given that most of them are at least Android 4.0 (and so featuring that zero shutter lag perk). Interestingly though, the Motorola Droid 4 we analyzed is still running Gingerbread, yet keeps up with its younger brethren.
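The timing side of these tests boils down to repeatedly triggering a capture and measuring the gaps between completions. Here’s a hedged sketch of the idea; `capture_fn` is a hypothetical stand-in for whatever drives a full capture cycle (focus, expose, encode, save) on a real device:

```python
import statistics
import time

def measure_shot_to_shot(capture_fn, shots=10):
    """Drive `shots` consecutive captures and report the latency
    between completions (shot-to-shot time), in seconds."""
    timestamps = []
    for _ in range(shots):
        capture_fn()  # blocks until the shot is fully done
        timestamps.append(time.perf_counter())
    # Gaps between consecutive completions = shot-to-shot latency.
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return {
        "median_s": statistics.median(gaps),
        "min_s": min(gaps),
        "max_s": max(gaps),
    }
```

On real hardware you’d also timestamp the trigger separately, to split shutter lag (trigger to exposure) out from the rest of the pipeline.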

Below, we can see how each device handles color reproduction, saturation, and – even to the naked eye – exposure. Some phones tend to overexpose, whereas others underexpose. Some, like the Samsung SGS3, seem to get this just right. Color temperature is also visible to the naked eye in the bottom row of grey patches. Of course, this all depends on the viewing monitor, but assuming it’s calibrated, the Ascend D1 and Desire VC both show a fairly warm tint, whereas the Droid 4 hits the other end of the scale with cool temperatures. Some of the complex analysis is handled by the excellent software produced by Imatest, which provides accurate, reproducible data that is invaluable for classifying and assessing a phone camera.

Samsung Galaxy S3

HTC One S

Samsung/Google Galaxy Nexus

Huawei Ascend D1

HTC Desire VC

HTC One X

Motorola Droid 4
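The color charts above feed numeric summaries of exactly this kind of thing. As a rough sketch of the math involved – assuming patch values have already been extracted from the chart and converted to CIE L*a*b*, and noting that the patch numbers below are invented for illustration – the simple CIE76 color difference and a chroma-based saturation error look like this:

```python
import numpy as np

def delta_e_cie76(reference_lab, measured_lab):
    """Mean and max CIE76 color difference (Euclidean distance in
    L*a*b* space) across a set of chart patches."""
    ref = np.asarray(reference_lab, dtype=float)
    mea = np.asarray(measured_lab, dtype=float)
    de = np.sqrt(((ref - mea) ** 2).sum(axis=1))
    return de.mean(), de.max()

def saturation_error(reference_lab, measured_lab):
    """Percent saturation error: how much chroma (sqrt(a*^2 + b*^2))
    the camera adds (positive) or removes (negative) on average."""
    ref = np.asarray(reference_lab, dtype=float)
    mea = np.asarray(measured_lab, dtype=float)
    ref_chroma = np.hypot(ref[:, 1], ref[:, 2])
    mea_chroma = np.hypot(mea[:, 1], mea[:, 2])
    return 100.0 * (mea_chroma.mean() / ref_chroma.mean() - 1.0)
```

A camera that pumps saturation for “vivid” marketing shots shows up as a large positive saturation error, even when its overall color difference might otherwise look respectable.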

The results we’ve presented just begin to scratch the surface of the camera analysis we do here at Apkudo. Analyzing so many devices for OEMs and Operators affords us the opportunity to provide lots of insight and actionable recommendations as to why a camera might perform poorly and what can be done to fix it. When we saw that the IEEE would be putting together a working group to create a standard for methods and metrics to judge camera phone image quality, we knew we’d have a lot to offer. We are a company whose mission is to optimize the Android user experience – not just for OEMs, Operators, and developers – but for everyone. We believe we can help lead the charge in creating this industry standard to reach the end goal of a better user experience. For everyone.


Benjamin Tseng | Director of Device Analytics
Apkudo Inc.
+1 443 453 3172
+61 422 387 773