Hey, not trying to be argumentative here, just discussion
As far as the eye and fps here: the picture industry doesn't rely on 30 crisp still frames - each film frame carries motion blur, which is why motion looks smooth at lower frame rates. I might be mistaken, but our phones don't render that blur.
There are a lot of factors in play in making low fps look good, but they don't apply to phones.
Sorry if I was touchy.
Totally agree on fps in the sense that it's not simple, and the whole 30 fps thing is chock-full of oversimplification - and utter nonsense from our HTC/Sprint uberlords.
NTSC TV, DVDs and 1080i are broadcast/display media that operate by updating fields - the actual pictures are assembled by our eyes.
Film is indeed 24 still frames. HOWEVER: it is shuttered by the projector to 48 Hz, and in some esoteric theaters, 72 Hz.
That's not arguing with you, even though it might seem so at first - it's supporting you.
Because we also have a flicker fusion threshold, two things come into play: we MUST have some frame/field locking at very small timescales or all we would see is blur, and - this is most relevant - those still images have to flicker at some minimum rate or motion indeed appears jerky.
24 fps - viewed raw - looks jerky as all get out. I mean, if you've ever seen it, you know - it's abhorrent to everyone.
So, they project 24 fps by shuttering each frame twice - 1/48 second twice for each film frame.
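The shutter arithmetic is trivial but worth spelling out - here's a quick sketch (function name and numbers are mine, just restating the math above):

```python
# Film runs at 24 frames per second, but a multi-blade shutter
# flashes each frame more than once to raise the flicker rate
# above the jerky-looking range.
FILM_FPS = 24

def flicker_rate(fps: int, blades: int) -> int:
    """Flicker frequency the eye sees: each frame is flashed `blades` times."""
    return fps * blades

print(flicker_rate(FILM_FPS, 2))  # two-blade shutter -> 48 Hz
print(flicker_rate(FILM_FPS, 3))  # three-blade shutter (esoteric theaters) -> 72 Hz
```

Same 24 pictures per second either way - only the flash count changes.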
THAT is what begins to give the appearance of smooth motion - and many of you can probably already report seeing that as kinda jerky looking in the theater. If you go back far enough in wrinkles, you may be among those who can recall computer monitors REALLY looking better when they went from 60 to 72 or even 96 Hz.
The flicker fusion threshold values were established in the 1800s using a spinning disk!!!
A lot of sports programming is 720p/60Hz - as in - 60 fps and progressive - full frame rather than field.
So - the whole "eye can see only 30 fps" thing - calling it a fallacy doesn't even begin to cover it - it's that wrong.
How then, are we able to enjoy Blu-ray, DVD, or HD TV today at 24, 30 or 60 Hz?
Strobed lighting.
For phosphor technologies such as CRT or plasma, the phosphor has a natural decay and it's refreshed - those "subfield refresh rates" on top line plasmas are all about the flicker. 480 Hz rates are not uncommon.
Despite specs to the contrary, LCDs do not refresh - the LCD panel updates, and by a convention that's now biting their butts, the manufacturers agreed years ago to just call the update rate the refresh rate. An LCD subpixel element is simply an electrically controlled VALVE - an aperture, or shutter - with a colored lens (red, green, or blue) in front. Behind it is a backlight, and regardless of whether it's the CCFL or LED type, the backlight on LCDs flickers to beat the band - just at a super high frequency (no, not 120 or 240 Hz, that's more marketing muffing up science - LCD backlights probably flicker in the 300 Hz or better range, depending on make).
AMOLED - I haven't bothered to check how, but it's probably driven by a control circuit much like the one used for an LCD's LED backlight.
So - the eye can only see 30 fps?
If you ever saw 24 or 30 fps it would look like a jerky old flick from Hollywood's silent era.
PS - ABC pioneered 720p/60 Hz in the pre-mandate, experimental HDTV days. (I watched their first national broadcasts - I'm an HDTV early adopter. Can you tell?)
They had the most marvelous website calmly explaining why - as a sports-oriented network - they first closely studied and THEN selected 720p/60 Hz.
As 1080i vs. 720p became a marketing debate, they removed that website, chock full of engineering details and perceptual study results - marketing can't defeat physics, but they can sure pull its budget!
I continue to search the web archive for it, but that's taken many, many hours and it was a cul-de-sac on their site. (Not so much babbling here as hoping that somebody reads this and will turn me on to the archive if they know about it.)
Thanks for reading this far, here's the bottom line of ABC's study:
720p vs 1080i at 60 Hz (US broadcast) approximately boils down to the same thing (hence, the encompassing ATSC standard): both are roughly broadcasting 60 megapixels per second.
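If you want to check that "roughly 60 megapixels per second" claim yourself, here's a back-of-envelope sketch (standard ATSC frame sizes assumed; the helper function is mine):

```python
# Back-of-envelope pixel throughput for the two ATSC HD broadcast formats.
def pixel_rate(width: int, height: int, rate: int, interlaced: bool = False) -> int:
    """Pixels delivered per second; an interlaced field carries half the lines."""
    lines = height // 2 if interlaced else height
    return width * lines * rate

p720 = pixel_rate(1280, 720, 60)           # 720p60: 60 full frames per second
i1080 = pixel_rate(1920, 1080, 60, True)   # 1080i60: 60 half-height fields per second

print(f"720p60:  {p720 / 1e6:.1f} Mpx/s")   # ~55.3
print(f"1080i60: {i1080 / 1e6:.1f} Mpx/s")  # ~62.2
```

Both land in the same ballpark of ~60 megapixels per second - 720p spends the budget on temporal resolution, 1080i on spatial.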
But the studies - based on physics, not focus groups! - clearly showed that viewers would prefer 1080i - with a maximum perceivable full-frame rate of ~30 fps - for highly detailed, slow-moving scenes.
For anything with fast motion, 60 fps was determined to be absolutely essential.
They also recognized that 720p would ultimately be lower resolution than 1080, no matter how you sliced it.
They therefore mandated, as a corporation down to their affiliates, that all such broadcasts be supported end-to-end by 720p machinery - from the cameras on down, no changes, no other conversions - and, most importantly, that all such components be certified for color accuracy and fully calibrated prior to each broadcast. In their own words, it was already well established that color and frame rate far outweigh raw resolution for fast motion and TV picture quality.
I can almost guarantee that was about 8 years ago.
Wonderful study.
I do have the physics investigations from Bell Labs on how the eye perceives resolution as a function of contrast for moving pictures - I caution that that's for those with a physics background only, but if interested, I'll provide links. It's from the '50s - and boy, was I surprised to find that none of it was old hat.
Here's the thing - you have to be an SMPTE member to easily access their guarded info, and that ain't cheap. Further, they speak in the language of their craft - TV engineering terms. It took me a month just to translate one of those monologues (a private gift) into physics-based language that I could follow - and I needed (yet another) TV engineer's help to even do that.
Those guys really know what they're talking about - I'm just doing my best to pass some of it on, because like so many engineers, they don't know how to talk to those of us who haven't studied their craft. Pity.
Final edit, I promise!
For me, 30 fps is just fine. I'll be happy with that and H.264 movies.
For many others, 30 fps is not fine, they'll be happy with 60 fps for animation and games.
We're different audiences and we're both right.
If there is any way that it's a performance trade-off, then I advocate and support a Settings option.
Neither group should suffer for the other - we're all in this together, we're all EVO consumers.
Unlike others, I do not associate the input/response lag with 30 fps - and yes, I've seen the vids. It's far worse at 30 fps, I agree, but it looked poor to me at both rates in the XDA videos on this.
This is a separate issue in my mind, entirely. Again, not being a gamer, I don't care - but I've read that many enjoy lower lag on their Dialer One app - so I'm hoping lag will someday be addressed as well.