Do megapixels make a difference?

Ryan Redern

I have a Sony Xperia XA Ultra and it has a 21.5 MP camera. It's a very good camera, but this is just a question: why do phones such as Samsung's have 16 megapixel cameras and can take pictures/record in 4K, but my phone is HD? Nothing wrong with HD, it's just a question. Surely higher megapixels mean a better camera? Especially if there's almost 6 MP more. Just curious. Thanks :)


Moderati ergo sum
How much time have you got? ;)

Unfortunately, for a while now phone and camera manufacturers have been hawking megapixels as the benchmark for quality. It just isn't so in the real world. It's just marketing and the public eats it up like ... um ... apples. :eek: ;)

So first let's discuss pixels. In theory a pixel is a point that contains information about its color, intensity, saturation and transparency (alpha). The more information a pixel contains, the more accurately an image will render. That's the theory, but in practice all pixels are not created equal. That 21.5 megapixel sensor in your Sony (assuming for a moment that 21.5 is hardware resolution and not some form of interpolation -- more on that later) has to pack 21.5 million individual receptors onto a chip smaller than a pencil eraser. That's a grid of roughly 4,600 x 4,600. That's tiny. And it's collecting light through a lens about the diameter of the lead in that pencil.
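To put rough numbers on that (assuming a square photosite layout and a ~6 mm sensor width, which is typical for a phone sensor of this class but not a published spec for this particular phone):

```python
import math

# Back-of-envelope density check. Assumptions: a square photosite
# grid (real sensors are 4:3 or 16:9) and a ~6 mm sensor width,
# typical for phone sensors of this class.
megapixels = 21.5e6
grid_side = math.sqrt(megapixels)      # photosites per side if square
pitch_um = 6.0 * 1000 / grid_side      # pixel pitch in microns

print(f"grid: ~{grid_side:,.0f} x {grid_side:,.0f} photosites")
print(f"pixel pitch: ~{pitch_um:.2f} microns")
```

That works out to roughly 1.3 microns per photosite, which is why the photon budget discussed next matters so much.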

So here come the photons ... the light that the sensor uses to register the proper information to create that theoretical pixel. Passing a fixed number of photons through a tiny lens to a small sensor with a tightly packed grid means that each individual receptor collects relatively few photons to create its pixel. Because the chip must be more sensitive than a bigger or less densely packed sensor, it is more likely to misread the light or create a pixel that is less accurate, yielding an inferior image.

Now, this doesn't mean that more pixels equals poorer images, either. It is very much dependent on balancing the lens, the sensor and the algorithm that creates the image. That balancing act, for all the bashing we see from Android fanboys (and girls :thumbsupdroid:), is something Apple has always done very well. It took years for Android devices to catch up to their quality, even though for a while Android phones had more megapixels than iPhones.

It could also be that the sensor doesn't really have that many native pixels at all. It might have 14 MP or 18 MP resolution, and the camera software interpolates the images to a larger size by approximating pixels in between the actual sensed pixels. Is this a 21 MP image? Sure. Is it a 21 MP camera? No way. You have to be careful when you read the specs. A lower-resolution native image interpolated to a higher resolution will be softer and include a good many artifacts, so a high level of post-processing is necessary, yielding an inferior image.
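To make "approximating pixels in between" concrete, here's a toy sketch of linear interpolation on a single row of sensed values. (Real camera software uses fancier schemes like bicubic, but the principle is the same: the extra pixels are computed averages, not new sensor data.)

```python
def upscale_row(row, factor=2):
    """Linearly interpolate between neighbouring sensed pixels
    to produce a row roughly `factor` times wider."""
    out = []
    for a, b in zip(row, row[1:]):
        for k in range(factor):
            t = k / factor
            out.append(a * (1 - t) + b * t)  # invented in-between value
    out.append(row[-1])
    return out

sensed = [10, 20, 40]        # three real photosite readings
print(upscale_row(sensed))   # [10.0, 15.0, 20.0, 30.0, 40]
```

The 15.0 and 30.0 never came off the sensor; they were guessed from their neighbours, which is exactly why interpolated images look soft.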

The app you use to control the camera makes a difference, too. Most, if not all, camera apps apply some level of enhancement to the captured image. Samsung tends to over-sharpen and over-saturate its images, though some people think that makes the pictures more vibrant. Oh, and the format you save them in matters as well. JPEG is a 'lossy' format, meaning that the compression throws information away to reduce file size. That's why a 21 MP JPEG saved at low quality shows far more compression artifacts, but is much smaller, than a 21 MP JPEG saved at medium or high quality. JPEG quality is actually a continuous scale, not just those three settings, but again, most camera apps only give you limited choices.

If you want to see the definitive capability of your camera, see if it will capture a RAW image. This saves the raw data the sensor captures straight to a file, so the post-processing pixel generation can be handled by a better interpreter than what an app on a mobile phone can manage.

The real question is how many pixels do you need? If you were taking a picture for print, that 21 MP image could easily be reproduced at 20" x 20" ... when was the last time you saw a 20" magazine on the news stand? To create an 8" x 10" photo print at professional resolution (300 dpi) you only need about a third of your image (roughly 7 MP), and honestly, how many people even print 8" x 10" glossies from their phone images? Most end up in some digital format, and even at 1080p that's a mere 2 MP.
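The arithmetic behind those figures, for anyone who wants to check it (assuming a square 21 MP image, the usual ~300 dpi standard for professional prints, and ~230 dpi for the 20" figure):

```python
import math

# How far do 21 million pixels stretch in print and on screen?
side_px = math.sqrt(21e6)                   # ~4,583 px per side (square)

print(f"at 230 dpi: ~{side_px / 230:.0f}\" per side")       # ~20 inches
print(f"8x10 at 300 dpi: {8 * 300 * 10 * 300 / 1e6:.1f} MP needed")
print(f"one 1080p frame: {1920 * 1080 / 1e6:.2f} MP")
```

An 8x10 at 300 dpi needs about 7.2 MP, and a full 1080p screen uses only about 2.07 MP.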

Got the pixel picture now?

As far as HD vs. 4k ... let's make that another discussion. ;)


Spacecorp test pilot
From a true camera layman's point of view (That's me) more megapixels means you can zoom in and crop a picture you took and still have a sharp photo if you print it out.
It can, if there is enough light and if the lens + processing are good enough. But it also means more readout noise, and so as the light levels fall the images become noisier, which at some point will undo any gain in resolution - how quickly depends on how good the processing is.

There isn't a simple rule to say what the optimum is: more pixels = more information, but also more noise. Improvements in the electronics, or greater photoefficiency in the sensor, will improve the S/N and so allow more pixels in the same area for the same light level before the noise becomes problematic. A wider aperture allows more light in, and of course a larger sensor collects more light (but requires a bigger camera module, and the moon on a stick brigade get upset about "camera bumps" while still demanding thin phones and high-quality cameras).
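The noise side of that trade-off can be sketched with the usual Poisson shot-noise model, where SNR per pixel scales as the square root of the photons it collects. The photon total below is an arbitrary illustrative number, not a measurement:

```python
import math

# Same light, split over more pixels: each pixel's Poisson-limited
# SNR (~ sqrt of photons collected) drops as the pixel count rises.
total_photons = 1.2e9            # assumed light falling on the sensor area

for mp in (12, 21):
    per_pixel = total_photons / (mp * 1e6)
    snr = math.sqrt(per_pixel)   # shot-noise-limited SNR per pixel
    print(f"{mp} MP: {per_pixel:.0f} photons/pixel, SNR ~ {snr:.1f}")
```

The 21 MP sensor's pixels each see almost half the light of the 12 MP sensor's, so their individual SNR is noticeably worse; good processing can claw some of that back by averaging neighbours, which of course costs resolution.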

And of course, if the processing is poor then the extra pixels don't gain you anything even in good light. And if the lens doesn't match the resolution of the sensor then the gain from the extra pixels won't be as much as you might expect.

Personally I have the opposite instinct to the OP: for a typical phone camera sensor and current sensor technology my preference would be 12 MP, and I regard much higher pixel counts as a negative. In fact Sony are the last of the major manufacturers still choosing such high pixel counts - ironic, since they also supply the sensors for most of the others!

The other thing to remember is that linear resolution improves only as the square root of the pixel count. Hence a 21 MP sensor of the same area as a 16 MP sensor packs about 31% more pixels, each roughly 24% smaller in area, but gives only about 15% finer linear resolution.
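Checking that square-root arithmetic:

```python
import math

# 16 MP vs 21 MP sensors of the same physical area:
# pixel area shrinks in proportion to the count, but linear
# resolution only gains as the square root of the count ratio.
area_ratio = 16 / 21                    # relative area of each pixel
linear_gain = math.sqrt(21 / 16)        # relative linear resolution

print(f"each pixel is ~{(1 - area_ratio) * 100:.0f}% smaller in area")
print(f"linear resolution is only ~{(linear_gain - 1) * 100:.0f}% finer")
```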


Moderati ergo sum
Mops and buckets are in the closet on the left for those whose heads have exploded. ;)

Seriously, the complexity of the issue is why marketing and the general public become fixated on pixel count, the way PC manufacturers used to get hung up on processor speed. Eyes start to glaze over by the second sentence.

Now about 4k vs. HD ... again the theory and reality are a little divergent. A 4k (UHD) image can be clearer, more vibrant and more accurate than a 1080p (HD) image simply by virtue of the amount of information being presented. However, the weak link here is actually the human eye. There are limits to what it can differentiate.

I'm sure we've all seen 4K displays side by side with HD in the store. That's fine when you're standing a foot or two away from the screen. When you're sitting at home watching TV, you're most likely at least 8 feet away, and probably more. Your eye's ability to register all those extra pixels diminishes rapidly as you get further away.

Assuming you have 20/20 vision in the first place, you'd need to be sitting closer than about 10' to a 75" screen to see a substantial difference, and what's being displayed has to be native 4K content in the first place. If it isn't, the 4K display will simply enlarge an HD image, which adds no detail.
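Those distance figures fall out of simple visual-acuity geometry, assuming 20/20 vision resolves about 1 arcminute and a standard 16:9 screen shape:

```python
import math

# Farthest distance at which a 20/20 eye (resolving ~1 arcminute)
# can still distinguish individual pixels on a 75" 16:9 screen.
screen_width_in = 75 * 16 / math.hypot(16, 9)   # diagonal -> width, ~65.4"
arcmin = math.radians(1 / 60)

for name, h_pixels in (("1080p", 1920), ("4K", 3840)):
    pitch = screen_width_in / h_pixels           # inches per pixel
    limit_ft = pitch / math.tan(arcmin) / 12     # farthest useful distance
    print(f"{name}: pixel detail resolvable within ~{limit_ft:.1f} ft")
```

So beyond roughly 10 feet a 75" screen's 1080p pixels are already at the acuity limit, and the full benefit of 4K only appears within about 5 feet.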

I think the best application for this tightly packed display technology will be in VR headsets where the image is an inch from your eye.

Let me just say one more thing ... do we really need to be seeing 4K videos of other people's cats? UHD Selfies? Poster sized prints of your cousin's lunch? ;)