A Question About Frames Per Second


jimid2

I'm genuinely curious about this... I've been reading a lot of material on this site about various EMU developments and one of the things I've noticed is that 60 FPS is sort of bandied about as a standard for video speed in emulators. Now, I don't know anything about games programming, but I was just wondering why the bar is set so high? I know that films are generally projected at 24 FPS and I don't notice any "jerkiness" with them, as a rule. Is 60 FPS a standard display rate for console game systems in general? Could someone who understands the display issues bring me up to speed on what's what here? Please? :D
 
It's probably because of the NTSC standard, which has a refresh rate of about 60 hertz. But it's just a guess.
 
Films tend to have "motion blur", which makes 25 frames per second look fluid, but in games there is generally no motion blur, so for the animation to not look jerky, a higher frame rate is needed. As there is no kind of blending between the frames, the brain tends to pick each individual frame out more easily.

As far as I know, that is the main reason; I could of course be wrong...
 
Goobers posted on Mar 26 2007 at 09:31 AM said:
Films tend to have "motion blur", which makes 25 frames per second look fluid, but in games there is generally no motion blur, so for the animation to not look jerky, a higher frame rate is needed. As there is no kind of blending between the frames, the brain tends to pick each individual frame out more easily.

As far as I know, that is the main reason; I could of course be wrong...

You're asking about the difference between a game's update rate and the screen refresh rate.

ie: The screen refresh is usually 50 or 60Hz (or more nowadays), depending on the country and so on.

A game might be written to update its logic (say) 20-30 times per second, but the _hardware_ is refreshing the screen more often than that. Further, many games _do_ update 60 times per second :)

Remember, you don't emulate a _game_, you emulate _the hardware_, and the game merely runs on it; for many/most games and embedded apps, many of the timers or calculations are based on the refresh clock, since you know it's a constant. (ie: Many machines didn't have a "clock" in them per se. ie: Space Invaders will run faster or slower based on the screen refresh.. it just gets interrupted every so many clocks, and the game knows "move our guys over a pixel every XX clocks", that sort of thing.)

So we emulate the hardware, and the game runs on it. We want to hit 60 fps (or 50 fps), and the game can update inside of that however often it needs.
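
Something like this toy loop shows the shape of it (all the names here are hypothetical, made up for illustration -- not any real emulator's code):

Code:
/* toy emulator main loop -- made-up names, just the shape of the thing */
#include <stdint.h>

#define CPU_HZ           1789773              /* e.g. an NTSC NES CPU clock */
#define REFRESH_HZ       60
#define CYCLES_PER_FRAME (CPU_HZ / REFRESH_HZ)

static uint32_t cpu_step(void)         { return 4; } /* run one instruction, return cycles used */
static void     raise_vblank_irq(void) { }           /* the game's IRQ handler hangs off this */
static void     render_frame(void)     { }           /* blit emulated video memory to the screen */
static void     sync_to_host(void)     { }           /* sleep until the next 1/60 s tick */

int main(void) {
    for (;;) {
        uint32_t cycles = 0;
        while (cycles < CYCLES_PER_FRAME)
            cycles += cpu_step();      /* one frame's worth of emulated clocks */
        raise_vblank_irq();            /* the game counts these to move its guys */
        render_frame();
        sync_to_host();                /* this pacing is where "60 fps" comes from */
    }
}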

jeff
 
Goobers posted on Mar 26 2007 at 10:31 AM said:
Films tend to have "motion blur", which makes 25 frames per second look fluid, but in games there is generally no motion blur, so for the animation to not look jerky, a higher frame rate is needed. As there is no kind of blending between the frames, the brain tends to pick each individual frame out more easily.

As far as I know, that is the main reason; I could of course be wrong...

actually, i can spot 24fps pretty easily (that's film, i think 25 is for PAL video tapes or something? ugly american, sorry.) it's just one of those things that makes you feel like you're at the movies :)

motion blur has more to do with film speed than frame rate. but you are right in saying that where there is motion blur, it tends to lead into the next frame.

very often with older systems the sprites were drawn to the display at 60fps, or for a pal tv, 50fps. so the pixels are set to scroll based on that speed. lowering the frame rate in an emulator can lead to choppy movements, or animations looking weird. but it depends on the system and also on the game (platformers get unplayable quickly with frameskip, while rpgs you only need a frame every couple minutes to make sure your guys aren't dying...)
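
something like this toy sketch (names all made up by me, not from any real emulator) shows why frameskip can keep a game running right even while the drawing chops:

Code:
/* toy frameskip -- the emulated machine still runs every 1/60 s, so the
   game's timing stays correct; we only skip drawing some of the frames */
static void emulate_one_frame(void) { } /* cpu cycles + vblank irq */
static void render_frame(void)      { } /* blit video memory to the screen */
static void sync_to_host(void)      { } /* wait for the next 1/60 s tick */

void run_with_frameskip(int skip)       /* skip = 1 draws every 2nd frame */
{
    for (int frame = 0; ; frame++) {
        emulate_one_frame();            /* game logic is never skipped */
        if (frame % (skip + 1) == 0)
            render_frame();             /* drawing sometimes is */
        sync_to_host();
    }
}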
 
Thanks for the responses everyone... Motion blur in film vs. changing static images that are video game frames makes a lot of sense, and what skeezix said about emulating the HARDWARE turned on a light. It makes a lot more sense to me now... :)
 
i agree, most if not all games are playable at ~25fps, even ones that require a relatively fast screen update like mario kart

even playing fps games on my pc (which almost always has an aging video card) if i can get 30fps, i can't tell the difference between 30 and 130

50/60fps is more an emulation accuracy thing, for pal and ntsc standards
 
Super Jamie posted on Mar 27 2007 at 05:23 AM said:
i agree, most if not all games are playable at ~25fps, even ones that require a relatively fast screen update like mario kart

even playing fps games on my pc (which almost always has an aging video card) if i can get 30fps, i can't tell the difference between 30 and 130

50/60fps is more an emulation accuracy thing, for pal and ntsc standards

fps games (actually most games that have a 3d renderer) are a whole different animal. the engines are built to be frame-rate independent, so if your character needs to move 1 meter in 3 seconds, it doesn't matter if you see 180 frames in that span, or 90. if you check your guy in 3 seconds, he will be where he needs to be. even if the frame isn't drawn, collisions and other physics should still behave normally.
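
a toy example of what i mean (my own made-up code, not from any real engine):

Code:
/* frame-rate-independent movement: position advances by speed * elapsed
   time, so the character lands in the same place at 30 or 300 fps */
#include <stdio.h>

int main(void) {
    const double speed = 1.0 / 3.0;          /* 1 meter in 3 seconds */
    int rates[] = { 30, 300 };

    for (int r = 0; r < 2; r++) {
        int fps = rates[r];
        double pos = 0.0, dt = 1.0 / fps;    /* seconds per frame */
        for (int i = 0; i < 3 * fps; i++)
            pos += speed * dt;               /* per-frame step scales with dt */
        printf("%3d fps -> pos = %.3f m\n", fps, pos);  /* 1.000 both times */
    }
    return 0;
}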

but older 2d systems actually use the screen refresh to get things done, especially the NES. and even if they didn't, the artists & programmers for those games still designed animations and movements with the 50 or 60 fps in mind (often they'd do NTSC first, and then just tweak the speed of the whole game for PAL, with interesting results).

30 frames is usually acceptable to most people, most of the time. but for older game consoles, think about what happens when you get hit in, say, megaman: your sprite "flashes" (only gets drawn maybe every other frame) and if the frameskip is synchronized with that flashing, the effect is lost.

just to make things a little weirder: NTSC is actually 29.97 full frames a second. the refresh rate is about 59.94 Hz (that's 60 × 1000/1001), but each pass only draws half the lines, since it's interlaced. which is why small text looks like garbage on tv, it flickers like crazy.

anyone ever made video feedback?
 
Which makes me think, why on Earth haven't I ever seen a computer game that properly implemented motion blur? It's as simple as convolving what you want to blur with a friggin line! (yup, a simple line you made with a Bresenham algorithm will do). Some fake motion blur by using ghosting (think GTA:III), some even render more images and put them together, how wasteful! OK, convolution has a cost, but when 25 motion-blurred FPS look better than 60 plain old FPS you might wanna consider this.
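
Here's roughly what I mean, for the purely-horizontal case (a toy sketch of my own, grayscale for simplicity, all names made up); for an arbitrary direction you'd walk the Bresenham line instead of just stepping x:

Code:
/* toy horizontal motion blur: average `len` pixels along the motion
   direction, i.e. convolve with a 1 x len line-shaped kernel */
#include <stdint.h>

void motion_blur_h(const uint8_t *src, uint8_t *dst, int w, int h, int len)
{
    for (int y = 0; y < h; y++) {
        for (int x = 0; x < w; x++) {
            int sum = 0, n = 0;
            for (int k = 0; k < len; k++) {  /* walk along the blur line */
                int sx = x + k;
                if (sx < w) { sum += src[y * w + sx]; n++; }
            }
            dst[y * w + x] = (uint8_t)(sum / n);  /* box average = blur */
        }
    }
}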
 
hi jimid2!

That's because most consoles are NTSC systems (30 interlaced frames per second, which is the same as 60 fields expanded to progressive frames)

Hitting 60 frames per second means that the emulator is able to draw all the frames and that the host machine has enough power for that. It's not a "standard" measure for emulation; it's just emulating the reality of the machine. It makes no sense to emulate at 70 or 80 FPS, unless you want 120% speed for faster gameplay ;)

And it's not just 60 FPS as the "measure"... For example, on the CPC the real maximum FPS is 50 because it's PAL-based ;) (Well, the real CPC could be overclocked from 50Hz to 60Hz because there was a US model.. but that's another story...)
 
A_SN posted on Mar 28 2007 at 03:38 AM said:
Which makes me think, why on Earth haven't I ever seen a computer game that properly implemented motion blur? It's as simple as convolving what you want to blur with a friggin line! (yup, a simple line you made with a Bresenham algorithm will do). Some fake motion blur by using ghosting (think GTA:III), some even render more images and put them together, how wasteful! OK, convolution has a cost, but when 25 motion-blurred FPS look better than 60 plain old FPS you might wanna consider this.

convolution has a huge cost because most hardware and software are designed to render polygons and simple light sources. however, i do see effects like those creeping into some games and they are starting to look better. i'm a sucker for bloom lighting myself. image-based light maps as well. but doing it on polygon pushing hardware is like trying to hammer a screw.

actually motion blur is, imo, probably the biggest difference visually between 3d animated movies and real-time 3d like in video games. but give it a few more years, it can only get better. until then the closest we'll have is ghosting, unfortunately.

or get a PSP :D
 