How much time have you got?
Unfortunately, phone and camera manufacturers have been hawking megapixels as the benchmark for quality for a while now. It just isn't so in the real world; it's marketing, and the public eats it up like ... um ... apples.
So first let's discuss pixels. In theory a pixel is a point that contains information about its color, intensity, saturation and transparency (alpha). The more information a pixel contains, the more accurately an image will render. That's the theory, but in practice not all pixels are created equal. That 21.5 megapixel sensor in your Sony (assuming for a moment that 21.5 is hardware resolution and not some form of interpolation -- more on that later) has to pack 21.5+ million individual receptors on a chip smaller than a pencil eraser. That's a grid of around 4,600 x 4,600. That's tiny. And it's collecting light through a lens with a diameter about the size of the lead in that pencil.
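To put rough numbers on that, here's a back-of-the-envelope sketch. It treats the sensor as a square grid and assumes a chip about 6 mm wide, which is my assumption about a typical small phone sensor, not the spec of the actual Sony part:

```python
import math

# Treat a 21.5 MP sensor as a square grid. Real sensors are 4:3 or 3:2,
# but the square root still gives the right order of magnitude.
pixels = 21.5e6
side = math.sqrt(pixels)
print(round(side))            # about 4,637 photosites per side

# Assuming a hypothetical ~6 mm wide chip:
pitch_um = 6.0 / side * 1000  # photosite pitch in micrometers
print(round(pitch_um, 2))     # roughly 1.29 microns per photosite
```

For comparison, a full-frame DSLR sensor is 36 mm wide, so the same pixel count there would mean photosites roughly six times wider, catching far more light each.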
So here come the photons ... the light the sensor uses to register the information that creates that theoretical pixel. Passing a fixed number of photons through a tiny lens onto a small, tightly packed sensor means each individual photosite collects relatively few photons to build its pixel. Because the chip must amplify that weak signal more than a bigger or less densely packed sensor would, it is more likely to misread the light and produce a noisier, less accurate pixel, yielding an inferior image.
Now, this doesn't mean that more pixels equals poorer images, either. It very much depends on balancing the lens, the sensor and the image-processing algorithm. For all the bashing Apple takes from Android fanboys (and girls), that balancing act is something they have always done very well. It took years for Android devices to catch up in quality, even though for a while Android phones had more megapixels than iPhones.
It could also be that the sensor doesn't really have that many native pixels at all. It might have 14 MP or 18 MP of resolution, and the camera software interpolates the image to a larger size by approximating pixels in between the actual sensed ones. Is this a 21 MP image? Sure. Is it a 21 MP camera? No way. You have to be careful when you read the specs. A lower-resolution native image interpolated to a higher resolution will be softer and include a good many artifacts, so a heavy dose of post-processing is necessary, yielding an inferior image.
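Here's a toy one-dimensional sketch of what linear interpolation does. The point is that the "new" in-between pixels are just averages of their neighbors; no real detail is created:

```python
# Toy illustration: upscale a 1-D row of sensed pixel values 2x by
# inserting the average of each pair of neighbors. Real cameras use
# fancier 2-D filters (bilinear, bicubic), but the principle is the same.
sensed = [10, 50, 30, 90]          # "native" photosite values
upscaled = []
for a, b in zip(sensed, sensed[1:]):
    upscaled += [a, (a + b) // 2]  # original value, then the invented average
upscaled.append(sensed[-1])
print(upscaled)                    # [10, 30, 50, 40, 30, 60, 90]
```

Seven output pixels from four sensed ones, and every added value is derivable from its neighbors. That's why the upscaled image looks softer: the extra pixels carry no new information.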
The app you use to control the camera makes a difference. Most, if not all, camera apps apply some level of enhancement to the captured image. Samsung tends to over-sharpen and over-saturate, but some people think that makes the pictures more vibrant. Oh, and the format you save them in matters, too. JPEG is a 'lossy' format, meaning the compression throws information away to reduce file size. That's why a 21 MP JPEG saved at low quality is blockier but much smaller than a 21 MP JPEG saved at medium or high quality. JPEG quality is really a sliding scale, not just those three presets, but most camera apps only give you limited choices.
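JPEG's actual pipeline (a DCT transform, then quantization, then entropy coding) is more involved, but a toy quantizer shows the core trade-off: coarser steps make the data compress smaller, and the discarded detail is gone for good:

```python
# Toy illustration of 'lossy': snapping values to a coarser grid throws
# information away permanently. JPEG quantizes frequency coefficients
# rather than raw pixels, but the principle is identical: a bigger step
# size means better compression and more lost detail.
def quantize(values, step):
    return [round(v / step) * step for v in values]

values = [12, 13, 14, 200, 201, 203]
print(quantize(values, 4))   # mild loss:  [12, 12, 16, 200, 200, 204]
print(quantize(values, 32))  # harsh loss: [0, 0, 0, 192, 192, 192]
```

At the harsh setting, six distinct values collapse into two; long runs of identical values are exactly what the final compression stage squeezes down, which is why low-quality JPEGs are so small.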
If you want to see the definitive capability of your camera, see if it will capture a RAW image. That pretty much takes the raw data the sensor captures and saves it to a file, so the pixel generation and post-processing can be handled by a better interpreter than what an app on a mobile phone can manage.
The real question is: how many pixels do you need? If you were taking a picture for print, that 21 MP image could be reproduced at roughly 20" x 20" at typical print resolution ... when was the last time you saw a 20" magazine on the newsstand? To create an 8" x 10" photo print at professional resolution you only need about a third of your image (roughly 7 MP), and honestly, how many people even print 8" x 10" glossies from their phone images? Most end up in some digital format, and even at full 1080p that's a mere 2 MP.
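The arithmetic behind those numbers, treating the 21 MP frame as square for simplicity (the common 300 dpi figure is a professional-print rule of thumb; many printed pages get by with less):

```python
import math

side = math.sqrt(21e6)        # ~4,583 px per side if the frame were square
print(round(side / 300, 1))   # at 300 dpi: ~15.3 inches on a side
print(round(side / 225, 1))   # at a still-sharp ~225 dpi: ~20.4 inches

print(2400 * 3000 / 1e6)      # 8" x 10" at 300 dpi: 7.2 MP
print(1920 * 1080 / 1e6)      # a full 1080p frame: ~2.07 MP
```

So even on a wall-sized assumption, the pixel budget of a modern phone sensor is far beyond what a screen, or most prints, can ever show.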
Got the pixel picture now?
As far as HD vs. 4k ... let's make that another discussion.
