
Camera, Camera, Camera

Supraman21

Well-Known Member
Nov 2, 2009
Chicago
Well, I'm a media freak, and when I heard the Droid was able to record D1 video I got verrrry excited, but then I heard it only records at 24 fps. Is this a hardware limitation or a software one? The iPhone has roughly the same hardware in terms of CPU and GPU, but it has a 3 MP camera and records 30 fps with the best audio I've ever heard coming from a cellphone video camera. I know the Droid records at 720x480 and the iPhone at 640x480, but that extra 80 pixels of width shouldn't lower the frame rate that much. The Instinct HD (5 MP) can record 720p video at 30 fps and D1 at 80 fps, so I'm pretty sure it's a software limitation. So my question is: can someone create a video camera app that can bump the video to 30 fps or more, if possible? Oh, and also, is the screen glass or plastic?

I'm posting this everywhere because I really would like an answer.
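
For what it's worth, the standard Android MediaRecorder API does let an app ask for a specific size and frame rate, so a third-party recorder could at least request 30 fps; whether the Droid's camera driver actually honors that request is exactly the hardware-vs-software question above. A minimal sketch, assuming a placeholder output path and skipping preview/permission setup:

Code:
import android.media.MediaRecorder;
import android.view.Surface;

// Minimal sketch only: a recorder that *requests* D1 at 30 fps.
// The driver/firmware is free to clamp the rate lower (e.g. ~24 fps).
public class ThirtyFpsRecorder {
    public MediaRecorder start(Surface previewSurface) throws Exception {
        MediaRecorder rec = new MediaRecorder();
        rec.setAudioSource(MediaRecorder.AudioSource.MIC);
        rec.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        rec.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        rec.setVideoSize(720, 480);                   // D1, as on the Droid
        rec.setVideoFrameRate(30);                    // the request, not a guarantee
        rec.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        rec.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
        rec.setOutputFile("/sdcard/test_30fps.mp4");  // placeholder path
        rec.setPreviewDisplay(previewSurface);
        rec.prepare();
        rec.start();
        return rec;
    }
}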
 
Given that the human eye can only detect up to about 25 frames per second, you'll only notice the difference when editing the video. Film (i.e. movies... but you already know this) is shot at 24 fps (23.976 for NTSC). Since it is only a phone with a built-in video camera, should it really make that much of a difference? Have you seen the test footage posted on the net that was taken with this camera? Personally, I was quite impressed considering it is a... phone.

I'm a media freak too... I own a small video production company, and am very heavily into videography and editing.

That said... I don't see much of an issue here.
 
I'd really have to disagree with the eye only seeing 25 fps. The human eye doesn't have a shutter, so we see continuous light. For example, why is there such a big difference between watching normal TV and turning on 120 Hz? I wasn't very impressed with the video, because I now judge all cell phone video against the iPhone 3GS: there is zero pixelation, the transition from light areas to dark is very fast, and the sound is absolutely amazing, not to mention the 30 fps.
 
And if you ask any gamer, many of them will say the same thing - with the exact same settings, it looks better at a higher frame rate.


That would be because frame rate is directly linked to video cards and lag on games. You shouldn't confuse the two. :) It is also much more noticeable due to the way it occurs. Frame rates are averaged in one second increments on a game display. That doesn't mean it is a steady frame rate. You are going by the assumption that the frames will be displayed like this:

|||||||||||||||||||||||

Each line representing a frame. In the world of high-res video games on laggy video cards, it is not. It would look more like this...

||| ||||| || |||| |||

It is the gaps between frames (the lag) that give the gamer the perception of freezing frames. It isn't the slower frame rate he sees... just the gaps from the lag.


When discussing video from movies and cameras, frame rates are a WHOOOOLE different beast. My games seldom drop below 200 fps, and when they near 150 fps, believe me... I notice. Do you think I could tell the difference between 200fps and 150fps on a video camera? I would argue to the death that no one can.
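
To make the "averaged frame rate vs. gaps" point concrete, here's a small illustrative sketch (made-up frame-time numbers, not from any real game or tool): two traces that both average 25 fps over one second, one evenly paced and one with a single long stall - the stall is what the player actually feels.

Code:
import java.util.Arrays;

// Illustrative only: same one-second average fps, very different worst-case gap.
public class FramePacing {
    static void report(String label, double[] frameTimesMs) {
        double total = 0, worst = 0;
        for (double t : frameTimesMs) {
            total += t;
            worst = Math.max(worst, t);
        }
        double avgFps = frameTimesMs.length / (total / 1000.0);
        System.out.printf("%s: avg %.1f fps, worst single frame %.0f ms%n", label, avgFps, worst);
    }

    public static void main(String[] args) {
        double[] even = new double[25];
        Arrays.fill(even, 40.0);            // 25 frames x 40 ms = a steady 25 fps
        double[] stutter = new double[25];
        Arrays.fill(stutter, 30.0);         // mostly quick frames...
        stutter[10] = 280.0;                // ...plus one long stall (the "gap")
        report("Even pacing ", even);       // avg 25.0 fps, worst 40 ms
        report("With a stall", stutter);    // avg 25.0 fps, worst 280 ms
    }
}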
 
That would be because frame rate is directly linked to video cards and lag on games. [...] It is the gaps between frames (the lag) that give the gamer the perception of freezing frames. [...] My games seldom drop below 200 fps, and when they near 150 fps, believe me... I notice. Do you think I could tell the difference between 200fps and 150fps on a video camera? I would argue to the death that no one can.

That explanation seems like BS because you assume the video cards are lagging... why assume that?

I think framerates below 30 are pretty obvious when panning quickly or viewing any fast motion in games. Also, what monitor / system / game are you using that even supports 150+ fps output?
 
That explanation seems like BS because you assume the video cards are lagging... why assume that?

Video/graphics lag is the second-largest cause of lag when playing video games. That is a fact. The first is a slow internet connection. In either event, the description of the lag above holds true. That is why it's so noticeable.


I think framerates below 30 are pretty obvious when panning quickly or viewing any fast motion in games.

Again, read a few posts above. The human eye can easily detect framerates around and below 30fps. You will get no argument from me on that statement at all.


Also, what monitor / system / game are you using that even supports 150+ fps output?

Most games "support" that high of a frame rate output. I play a lot of flight simulators and 1st person shooters such as Crysis and Call of Duty. If you run fraps while playing them, you can see what your frame rate is.. but the frame rate will drop due to CPU usage, but the amount it drops varies. In low end PCs, it will kill your frame rate (due to CPU issues), but the higher end computers don't usually take it as poorly.


My system specs:

Manufacturer: N/A (Custom built)
Processor: AMD Phenom(tm) II X3 720 (3 CPUs), ~3.9 GHz
Memory: 4096 MB Kingston Hyper-X 1066
Hard Drive: 1.75 TB
Video Card: NVIDIA GeForce GTX 260 (GPU #1), NVIDIA GeForce GTX 260 (GPU #2)
Monitor: Hanns-G 22" Widescreen (1680 x 1050)
Sound Card: M-Audio Audiophile 2496
Speakers/Headphones: Ultrasone PRO 750 Headphones
Keyboard: Logitech G-19
Mouse: Logitech MX-518
Operating System: Windows 7 Ultimate 64-bit (6.1, Build 7100)
Motherboard: Gigabyte GA-MA790x-UD4P
Computer Case: Antec P180





In any event, as I said before... lag in video games and a deliberately low FPS on a video camera are two very different frame-rate issues, so why are we discussing video games so in-depth here?
 
Your specifications are similar to mine:

Home built machine dubbed The Beast (IV):

eVGA 780i motherboard
Intel Core2Quad 6600 / 2.4 GHz OC'd to 3.21 GHz
OCZ ReaperX HTC PC2-8000 RAM
Dual Seagate 500 GB 7200 rpm 32MB Cache HDs
WD 250 GB 7200 RPM 8 MB Cache
eVGA GTX 260 (192) - Base clock, OC'd to SSC
eVGA GTX 260 (192) - Stock @ SSC clocks
Dual Acer X213w monitors @ 1680 x 1050 res / 59 Hz ref (each plugged into its own GPU)
SLI enabled 24 / 7 / 365

And you're right - we can easily discern at lower frames per second, but if I had a game that I was pumping past the 100 fps mark I doubt I could easily discern the difference between 150 and 200.
 
Your specifications are similar to mine: [...] eVGA GTX 260 (192) x2, SLI enabled 24 / 7 / 365 [...] And you're right - we can easily discern at lower frames per second, but if I had a game that I was pumping past the 100 fps mark I doubt I could easily discern the difference between 150 and 200.


Very nice system. My GTX cards are the 216 version instead of the 192. The Newegg salesman talked me into springing for the extra money. Personally, I still can't see the difference between the two, though.


And as for the ability to detect anything from 150-200 fps... I would say that depends on where the lag occurs and why. When you are dealing with frame drops due to internet connection (or server lag), there is no telling.

I can't wait until the day game servers and connections are so reliable that you can play with 30+ people on the same server and get zero lag. There was some country in Europe experimenting with 40 Gb/s fiber-optic internet lines, and having great success. I'd say it's only a matter of time. :)
 
I have to fully agree with Fadelight on this one. There is a difference between game FPS and video FPS. What you guys see on fast-moving objects or when panning is not a lack of frames, it's the lack of information that can be carried over from frame to frame due to the compression codec used. I argue that you will not be able to see a difference between a 25, 30, or 60 fps video IF THE VIDEO IS UNCOMPRESSED, meaning full frames.

What makes it look better on TV when they sell you 120 Hz tech is basically that they show each picture twice (on a 60-frame source, so that's 60+60). It makes the picture look smoother, but that is due to the display technology used (and the lag in displaying new data there) and, again, the compression codecs used to carry your picture information.

If you have the proper software and system, just recompress some video you shot at 30p so it uses a raw full-frame codec. Then recompress it at 23.976p as well (again raw full frame) and compare the two pictures on the same screen. You won't see ANY difference at all.

I always edit full frame. When you record that stuff, there is a difference between 24p and, let's say, 60p. But that is only because of the codec used to compress the pictures, so it is a codec issue rather than a 24p vs. 60p issue. If the video gets a higher bitrate, then the AVC codec can store more information between frames and you won't have the panning issue, even at 23.976 frames. 60p on a cell phone or portable video device will always "look" better because one second of video is sliced into 60 pieces rather than 24. However, that is due to the compression, not due to 30p or 60p being inherently better than 24p. That's just not correct, as Fadelight noted already.
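
A rough back-of-the-envelope illustration of that bitrate point (the numbers are made up, not the Droid's actual encoder settings): at a fixed bitrate, fewer frames per second simply leaves more bits for each frame, which is where the apparent quality difference comes from rather than the frame count itself.

Code:
// Toy arithmetic: bits available per frame at a fixed bitrate.
public class BitsPerFrame {
    public static void main(String[] args) {
        double bitrateKbps = 8000.0;                 // hypothetical encoder bitrate
        double[] frameRates = {23.976, 30.0, 60.0};
        for (double fps : frameRates) {
            double kbitPerFrame = bitrateKbps / fps;
            System.out.printf("%6.3f fps -> ~%.0f kbit per frame%n", fps, kbitPerFrame);
        }
        // 60p gets roughly half the bits per frame of 30p at the same bitrate,
        // so the codec and bitrate, not the frame rate by itself, drive apparent quality.
    }
}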
 
I bought my cards before the 216s were available for sale. I originally built the machine with an 8800GTS, also eVGA. I then purchased a brand new 260 from NewEgg, took out the 8800 and used the eVGA step-up to get an additional 260. I was pleasantly surprised to get the SSC clocked card from eVGA (the purchased card is a base model) and using eVGA's Precision software I have both synced at SSC stock clocks.

Since the card from eVGA was supposed to be a base model, and the serial # confirms it is a base model card, I think that they simply flashed the card's BIOS with the SSC BIOS for whatever reason - and thus I got a nice surprise when plugging it in.

Also, I don't play many games online at all - most of my games are single-player / story mode / arcade mode type games (GRID in solo, non-internet-based mode, Mass Effect, etc.), so I rarely ever get any sort of lag due to my internet connectivity (or lack thereof). Occasionally I do play online MMORPGs, but I expect a reasonable slowdown due to intermittent data outages and virtual re-routing. But I agree that internet connectivity and such will play a role in all online games.

In fact, those of you who are fans of Speedtest.net may not know this yet, but they have a new Pingtest.net site that checks your line quality - it is worth a look even though it is still in beta and the info is still basic at this point... see http://www.pingtest.net/

Anyway, getting back on topic, I really don't see the problem with the limited fps of the video footage from this phone, but I do have one question - is this a hardware limitation or a software limitation? I (and I bet most people who now own the DROID) would rather it be the latter... for that can be fixed at some point.

However, I fear it is the former, as I am pretty positive they would not make it software-dependent knowing we could easily bypass the restrictions once we have rooted the phone... or would they?
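
For anyone who wants to poke at the hardware-vs-software question from an app, here's a sketch against the Android 2.0-era Camera API that just dumps the frame rates the camera driver claims to support. (Caveat: these are preview rates as advertised by the driver, which isn't necessarily the same list the recording path uses.)

Code:
import android.hardware.Camera;
import java.util.List;

// Sketch: ask the camera driver which preview frame rates it advertises.
public class FrameRateProbe {
    public static void dumpSupportedRates() {
        Camera cam = Camera.open();
        try {
            Camera.Parameters params = cam.getParameters();
            List<Integer> rates = params.getSupportedPreviewFrameRates(); // may be null
            if (rates == null) {
                System.out.println("Driver does not report supported frame rates.");
            } else {
                System.out.println("Driver-reported preview frame rates: " + rates);
            }
        } finally {
            cam.release();
        }
    }
}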
 
If you have the proper software and system, just recompress some video you shot at 30p so it uses a raw full-frame codec. Then recompress it at 23.976p as well (again raw full frame) and compare the two pictures on the same screen. You won't see ANY difference at all.

Think you meant decompress the first time, right?
 
What makes it look better on TV when they sell you 120 Hz tech is basically that they show each picture twice (on a 60-frame source, so that's 60+60). It makes the picture look smoother, but that is due to the display technology used (and the lag in displaying new data there) and, again, the compression codecs used to carry your picture information.



Actually, you are mistaken. There has always been a lot of confusion between what you are now talking about and FPS, but they are different.

120 Hz means 120 hertz... 120 cycles per second. It is a pure electricity aspect of screens. I'll try my best to explain in simple terms so as not to confuse anyone.

Household electricity operates between 50 and 60 Hz. This means that in one second (the standard time measurement), the electricity changes direction 60 times. This is why you can sometimes see bulbs (such as fluorescent) "flicker". It is because... they are. At times, the frequency drops low enough that the human eye can start to detect when the bulb goes out. Think of it as the same basic principle as cartoon animation. Go fast enough, and you can no longer tell. Same thing.

Anyway, on TVs set at 120 Hz, this means the same thing... the electricity changes direction 120 times in one second. The TV's light flickers on and off 120 times in one second. That is why, while virtually unnoticeable, lower refresh rates will start to give you headaches on computer monitors.

It is definitely some interesting reading if you start searching around for information.

But it is also absolutely nothing related to frames per second.
 
I don't know if this has been brought up yet (didn't read everyone's posts), but from a biological standpoint the human eye can't really differentiate anything past 30 FPS.

One of the reasons it's good for a game to run at a higher frame rate is that if there is a bit of a hiccup and the frame rate drops briefly, you won't see the difference (e.g. running a game at 60 fps, you hit a lag spike and it drops to 45 fps - you won't notice). But if you're running at 30 fps and the same kind of lag spike knocks off 15 fps, you'll notice the difference.

That being said, I doubt there is much of a difference between 24FPS and 30FPS in terms of what you can see.
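
The reason the same hiccup hurts more at the low end is plain frame-time arithmetic; here's a quick worked example with illustrative numbers:

Code:
// Frame-time arithmetic: the same fps drop costs far more milliseconds per frame
// when you start from a low frame rate.
public class FrameTimeDelta {
    static double msPerFrame(double fps) {
        return 1000.0 / fps;
    }

    public static void main(String[] args) {
        System.out.printf("60 -> 45 fps: %.1f ms -> %.1f ms per frame (+%.1f ms)%n",
                msPerFrame(60), msPerFrame(45), msPerFrame(45) - msPerFrame(60));
        System.out.printf("30 -> 15 fps: %.1f ms -> %.1f ms per frame (+%.1f ms)%n",
                msPerFrame(30), msPerFrame(15), msPerFrame(15) - msPerFrame(30));
    }
}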
 
I don't know if this has been brought up yet (didn't read everyone's posts), but from a biological standpoint the human eye can't really differentiate anything past 30 FPS. [...] That being said, I doubt there is much of a difference between 24FPS and 30FPS in terms of what you can see.


You just 100% backed up everything I've been saying this whole time. :D


You'll notice a bit of difference between 24 and 30, though. That is one of the reasons why movies look like... well... movies, and videos do not. The frame rate on movies is just a hair slower than the maximum the human eye can detect. It gives it a more "surreal" feel.
 
Actually, you are mistaken. There has always been a lot of confusion between what you are now talking about and FPS, but they are different.

120 Hz means 120 hertz... 120 cycles per second. It is a pure electricity aspect of screens. I'll try my best to explain in simple terms so as not to confuse anyone.

Actually, you are not right about this, as there is no direct connection between an LCD display and the hertz of the electricity that is supplied (plasma, CRT, or even rear projection is a different story, though). The picture is built off its OWN clock and is thus totally independent of the grid's frequency.

This matters on interlaced pictures: 60 Hz basically gives you 60 half pictures (for a total of 30 pictures per second). On progressive scan, the whole picture is transmitted and not just half of it. So 60p (60 progressive) is certainly better than 60i (because 60i would theoretically be only 30 pictures/second, plus you get a jitter effect - that is where 120 Hz really makes a difference on interlaced material).

So what any LCD these days does is interpolate the picture (to reduce motion blur; some use smoothing as well) and, on 120 Hz technology, show each picture twice (twice as fast), which then results in a visually more appealing picture quality due to the LCD/plasma refresh times involved. On CRT technology, 120 Hz was a different story, as the electron beam was going TWICE as often from top to bottom, so the refresh rate, if you will, was twice as fast.

Hz on an LCD TV just means how often the picture is refreshed/updated per second, and in order to do that, right now ALL LCD systems just double the frames (frame copy through a frame buffer) and display them twice as fast. So instead of 60 frames, they display 120. HOWEVER, those 120 frames are still displayed within one second (just twice as often as the source material). As motion prediction and interpolation create a whole set of new issues, especially when it comes to hi-def video quality, right now most manufacturers display source material with buffered frames instead of interpolated frames (high-end sets apply more video processing for a better-looking picture). That goes for 120 or 240 Hz alike.

That is EXACTLY why current sets won't be able to display 3D Blu-ray material properly: 3D video needs 120 Hz or above, BUT with REAL 120 frames per second, not 60+60 (copies).

There is indeed a lot of information about this online, and I can point you in the right direction if you are interested. Keep in mind that the LIGHT source and the picture-creating source are the same on a CRT, but not on an LCD. Those are completely independent of each other, and when we talk about Hz on an LCD, we are not talking about the clock frequency of the light source; we are talking about the REFRESH RATE of the LCD.
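
A tiny worked example of the refresh-rate vs. source-frame-rate distinction being made here (illustrative numbers, not tied to any particular set):

Code:
// Refresh-rate arithmetic: how many panel refreshes each source frame occupies.
public class RefreshMath {
    public static void main(String[] args) {
        double panelHz = 120.0;                    // panel refresh rate
        double sourceFps = 60.0;                   // source material frame rate
        double refreshMs = 1000.0 / panelHz;       // ~8.33 ms per refresh
        double repeats = panelHz / sourceFps;      // each frame shown twice (60+60)
        System.out.printf("Refresh interval: %.2f ms%n", refreshMs);
        System.out.printf("Each source frame shown %.0f times%n", repeats);
        // Frame-sequential 3D needs 60 *distinct* frames per eye, i.e. 120 genuinely
        // different frames per second, not 60 frames shown twice.
    }
}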
 
Looks like this is the right crowd for my question! Basic stupidity, really. I did a nice video and held the phone sideways, so now the video is 90 degrees off. What programs do you experts (and I mean that literally) prefer to: rotate .3gp files, and rip DVDs to a compatible format/size for our Droids? Thx!

(And IMHO, the outdoor video and pics are awesome - interior shots suck a$$!!!)
 
Yes, I agree - when you get to those low-ish numbers, even small differences are discernible.

But yeah, while neurons can conduct signals at much higher rates than what would match 30 FPS, the actual mechanisms behind vision are slower. Even though we are getting continuous light (as someone said above), the cellular mechanisms behind sight put a limit on how quickly your vision can process things. Hence why shows such as Time Warp (Discovery Channel), where people use high-speed cameras to slow stuff down, are so cool... there is a ton of stuff our eyes miss. Most of it is unimportant though :D
 
