Frame rates on console games...how can they tell

Discussion in 'Archived Threads 2001-2004' started by MikeAlletto, Apr 28, 2002.

  1. MikeAlletto

    MikeAlletto Cinematographer

    Joined:
    Mar 11, 2000
    Messages:
    2,369
    Likes Received:
    0
    Trophy Points:
    0
This is one thing that's been bugging me for a while. People say, "it's got a constant frame rate of 60fps" or "it drops below 30, then back up to 60, then back down again".

    How do they know? Is there some hidden code on every single game to display frame rate? How about those games where they say "it stays right at 60fps for the entire game".

Sounds like a lot of BS to me. I know that with PC games you can usually get the frame rate displayed, but I have never seen an option to display it on console games. So are all these people blowing hot air, like I suspect, just to make themselves sound high and mighty because they can't find anything else to talk about?
     
  2. Morgan Jolley

    Morgan Jolley Lead Actor

    Joined:
    Oct 16, 2000
    Messages:
    8,818
    Likes Received:
    122
    Trophy Points:
    9,110
Sometimes there actually is a way to see the fps on a debug console (I think), but generally you can tell. If the game runs super smooth, then on consoles it's generally 60fps. If it drops down to a slightly choppier framerate, then it's probably at 30fps or less.
     
  3. Steve Deacon

    Steve Deacon Stunt Coordinator

    Joined:
    Sep 15, 2000
    Messages:
    71
    Likes Received:
    0
    Trophy Points:
    0
It's easier to judge the frame rate of a console game because they're almost always locked to the vertical blank, so you get 60/30/20/15 frames per second. I've seen a few PSX titles that didn't sync (and I worked on some). The only one on PS2 that I've seen is MGS2, and only in certain spots, like outside in the rain on the tanker section.

It's pretty easy to spot when a game isn't synced to the vertical blank: usually you'll see slight ghosting or tearing on movement, e.g. in an FPS, if you look at a vertical edge and strafe left and right.

    That's the only thing wrong with Halo, well that and the fact that I can't use a mouse/keyboard to control it.
     
  4. Scott Varney

    Scott Varney Agent

    Joined:
    Dec 4, 2001
    Messages:
    46
    Likes Received:
    0
    Trophy Points:
    0
    What's always bothered me about this number was the fact that, at least before digital TV, no TV on the market could display more than 30 frames (60 fields) per second. So, on a standard TV, why would a framerate of more than 30 frames per second be an advantage?

    Thanks,

    Scott
     
  5. Gary King

    Gary King Second Unit

    Joined:
    Apr 13, 1999
    Messages:
    479
    Likes Received:
    0
    Trophy Points:
    0
Because the game would render in fields, so it actually would be updating the game state (and the rendered output) 60 times per second.
     
  6. Shayne Lebrun

    Shayne Lebrun Screenwriter

    Joined:
    Jun 17, 1999
    Messages:
    1,086
    Likes Received:
    0
    Trophy Points:
    0
The Dreamcast was the first console where frame rates really started to matter; on the PS, it was mainly 'is it too damn choppy?' The Dreamcast is actually designed to output to an SVGA monitor, but happens to include the circuitry to downconvert to NTSC/PAL. This is as opposed to the PS2, which is designed to output to NTSC but can be upsampled. This is why games like DoA2: Hardcore looked so horrible: the interlacing.

    As to how they know, it's what they're told to say. If they don't say what they're told to say, they don't get early product to review. If they don't get early product to review, they don't get viewers. If they don't get viewers, they don't get money. If they don't get money, they go out of business. Hence, you get PC Gamer syndrome, where they're giving 96% final scores to games that aren't even in internal beta.
     
  7. Dave F

    Dave F Cinematographer

    Joined:
    May 15, 1999
    Messages:
    2,885
    Likes Received:
    2
    Trophy Points:
    0
So they're told to say that a game sometimes drops to 30fps? It's a pretty big generalization to claim it's all just what they're told to say. It may be true sometimes, but it is pretty easy to tell the difference between 30 and 60fps games.

    -Dave
     
  8. Masood Ali

    Masood Ali Supporting Actor

    Joined:
    Jan 31, 2002
    Messages:
    921
    Likes Received:
    0
    Trophy Points:
    0
    It's extremely easy to tell the difference between 30 and 60 FPS.

    In fact, if you have Gran Turismo 1, unlock the 60FPS high resolution mode and check it out for yourself.
     
  9. MikeAlletto

    MikeAlletto Cinematographer

    Joined:
    Mar 11, 2000
    Messages:
    2,369
    Likes Received:
    0
    Trophy Points:
    0
     
  10. Camp

    Camp Cinematographer

    Joined:
    Dec 3, 1999
    Messages:
    2,301
    Likes Received:
    0
    Trophy Points:
    0
     
  11. Paul Richardson

    Paul Richardson Second Unit

    Joined:
    Jun 25, 2000
    Messages:
    412
    Likes Received:
    0
    Trophy Points:
    0
    What's funny is how people are pooping bricks if a game drops to 30 fps. Yet, movies play at 24 fps and I don't see anyone complaining about how "choppy" they are.
     
  12. Gary King

    Gary King Second Unit

    Joined:
    Apr 13, 1999
    Messages:
    479
    Likes Received:
    0
    Trophy Points:
    0
Movies also have motion blur applied liberally - a 24fps movie and a 24fps game are two entirely different beasts.

For console systems, saying a game runs at 30fps versus 60fps is a bit easier than on a PC, since most games are tied to vertical blank periods on the TV, like Steve said. There are some games whose redraw times hover so close to the full blank period that one frame will render in 2 periods and the next in 1 (averaging out to 40fps, neither a clean 30 nor 60); however, most games don't hover that close to the blanking period, so a sustained framerate change is noticeable.
     
  13. BrianB

    BrianB Producer

    Joined:
    Apr 29, 2000
    Messages:
    5,205
    Likes Received:
    1
    Trophy Points:
    0
     
  14. Michael St. Clair

    Joined:
    May 3, 1999
    Messages:
    6,001
    Likes Received:
    0
    Trophy Points:
    0
It is easy to tell by making a couple of VCR recordings and playing them back a frame at a time. It is easy to count fields this way and see how many are distinct across 30 frames. In 60 "fps" interlaced gameplay (example: VF2 for Saturn), the even and odd fields will never match: 60 unique fields per second.
The number of unique rendered frames can be anywhere between 0 and 60; it does not have to be an even divisor of 60 (60, 30, 15, etc.).
It is harder to count frames in 480p, but D-VHS will help.
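The field-counting trick can be sketched numerically. This is a toy model with a hypothetical helper name, assuming each captured field is labeled by which rendered image it shows:

```python
def unique_fields_per_second(fields):
    """Count distinct images in one second of captured video, stepping
    through the recording field by field as described above."""
    # A new unique image appears whenever a field differs from its predecessor.
    return 1 + sum(1 for a, b in zip(fields, fields[1:]) if a != b)

# 60fps gameplay: every one of the 60 fields shows a new image.
full_rate = list(range(60))
# 30fps gameplay: even and odd fields pair up into 30 repeated images.
half_rate = [n // 2 for n in range(60)]
print(unique_fields_per_second(full_rate), unique_fields_per_second(half_rate))  # 60 30
```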
     
  15. Bjoern Roy

    Bjoern Roy Second Unit

    Joined:
    Oct 15, 1998
    Messages:
    315
    Likes Received:
    0
    Trophy Points:
    0
    Mike, Camp,

it's not like that. As Steve mentioned above, games are locked to the vertical blank (VB).

Ideally, your engine is quick enough to handle the whole game code, including rendering the scene, in less than 1/60th of a second. The rendering is done in a 'back buffer' that is not visible on-screen. Once the vertical blank is reached (in signal/CRT terms: the electron beam is done with the current field/frame and moves back to the upper left), the front buffer (the one that is displayed) and the back buffer (the one that got rendered into) are 'swapped'.

This means the data that just got rendered will be displayed next, because the back buffer is now the front buffer (and vice versa). Once you have swapped the buffers in the blanking interval, you can start rendering the next frame into the new back buffer, which was the last frame's front buffer. And so on.

As long as the rendering takes less than 1/60th of a second, you will get 60fps. The fact that you wait for the VB means that you can't have more than 60fps: even if you finished your game code in 1/10000th of a second, you would still have to wait until the VB occurs to swap buffers (the synchronization is done to prevent tearing).

Now, if your engine is not quick enough to render the scene in one frame (1/60th sec), the VB passes and you can't swap, because you are not finished yet. So you need to wait for the next VB. But this means that even if your engine took only ever so slightly longer than 1/60th sec, it would still run at only 30fps, because it would have to wait for every other VB to swap buffers.

    So, if you need:

    - less than 1/60th sec to render, you get 60fps

    - 1/60 - 2/60 sec to render, you get 30fps

    - 2/60 - 3/60 sec to render, you get 20fps

    - 3/60 - 4/60 sec to render, you get 15fps

    And so on.
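The table above amounts to a simple quantization rule. Here's a minimal sketch of it (hypothetical helper name, assuming a 60Hz NTSC display):

```python
import math

REFRESH_HZ = 60
PERIOD = 1.0 / REFRESH_HZ  # one vertical blank period (1/60th sec)

def momentary_fps(render_time):
    """Effective frame rate when buffer swaps are locked to the vertical blank.

    A frame that misses a blank must wait for the next one, so the
    achievable momentary rates quantize to 60, 30, 20, 15, ... fps.
    """
    periods = max(1, math.ceil(render_time / PERIOD))  # whole VBs consumed
    return REFRESH_HZ / periods
```

So rendering in 0.010s (well under 1/60th) gives 60fps, while 0.020s (just over one period) drops straight to 30fps; nothing in between is reachable.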

So you see that a framerate of 45fps doesn't exist. Well, it does, but then you have to differentiate two terms:

    Momentary FPS and Average FPS.

    The fps discussed above are only 'momentary'. If you are done rendering in less than one frame, you get 60fps 'momentarily', at that point in time.

Let's assume you spend half your time rendering in one frame (60fps momentarily) and the other half needing two frames (30fps momentarily), back and forth: you will get an 'average' framerate of 45fps.
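That average can be checked with a quick worked example (hypothetical helper name, treating a cadence as the number of VB periods each rendered frame stays on screen):

```python
REFRESH_HZ = 60

def average_fps(cadence):
    """Average frame rate over a run of frames, where `cadence` lists how
    many vertical blank periods each rendered frame occupied."""
    return REFRESH_HZ * len(cadence) / sum(cadence)

# One second 1-framed (60 frames) followed by one second 2-framed (30 frames):
run = [1] * 60 + [2] * 30
print(average_fps(run))  # 45.0 -- even though no single frame ever ran "at 45fps"
```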

That's why I don't really like using FPS to describe the 'momentary' framerate at all. I use the term 'the game runs 1-framed' to describe the fact that the engine outputs a new image every frame. This term is thus independent of the actual refresh rate (e.g. 75Hz on a PC, 60Hz NTSC, 50Hz PAL).

Now, the next thing to differentiate is 'judder' versus 'stutter' (jerkiness).

A) If the game runs 1-framed constantly, it's smooth as butter; if it runs 2-framed (like Halo's 30fps) it has a slight 'judder' to it. But even running 2-framed, it's still flowing along constantly, only slightly less smooth. A constant 3-framed output (e.g. 20fps) isn't really considered smooth anymore.

B) A completely different issue is that of 'stutter' or 'jerkiness'. This occurs when the 'momentary' framerate changes, e.g. from 1-framed (60fps) to 2-framed (30fps) and back again. There is an unavoidable motion discontinuity when this happens, both when the momentary framerate drops 60->30 AND when it rises again 30->60.

This stutter is even worse when the momentary framerate drops to very low levels, of course. E.g. when you stream in new data from your medium (or swap textures, etc.), you might need several frames to do so (let's assume 1/4th of a second). So your render cadence would be something like 1-1-1-15-1-1-1. That drop from 60fps down to 4fps and back up is going to induce a SEVERE stutter effect.

The important part is that this framerate-change discontinuity is much more annoying than the effect described in A).

So, although a bit counter-intuitive, a constant 30fps will look a LOT better than an average framerate of 40fps on a 60Hz system, because the latter means the game runs 2-framed most of the time but keeps jumping to 1-framed in between, which means it's basically stuttering along all the way. At the same time, the constant 30fps will look a tad less smooth than 60fps, but it still has a constant fluidity that the other scenario is missing.

That means it's sometimes better to set an 'upper limit' on the framerate. E.g. if a game can't run 1-framed (60fps) for a considerable amount of time in a row, it would be better to limit it to 30fps. How is this accomplished? Quite easily, actually: you simply don't allow the code to swap buffers on two consecutive VBs. If you finish rendering in less than 1/60th of a second, you skip the next VB and wait for the one after.
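In cadence terms, the cap just means no frame is allowed to occupy fewer than two VB periods. A minimal sketch (hypothetical helper name, cadences in VB periods per frame):

```python
def cap_to_30fps(periods_needed):
    """Apply a 30fps cap to a render-time cadence: a frame that was ready
    after 1 vertical blank period still waits for a 2nd, so buffers never
    swap on two consecutive blanks."""
    return [max(n, 2) for n in periods_needed]

print(cap_to_30fps([1, 1, 2, 3, 1, 2]))  # [2, 2, 2, 3, 2, 2]
```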

This is a decision that the programmers have to make early on. It's reasonable to assume that Halo, for example, could easily hit a momentary 60fps from time to time. But due to the big render-time fluctuations in first-person shooters with long view distances and outdoor scenery, a typical render-time cadence might look like this: 2-2-3-2-1-2-2-1-2-2-1-4-1-2-1-2. That is 7 frames departing from the steady 2-frame cadence of a roughly 30fps scene, which means 7 'judders'. If you cap the game at 30fps, you get 2-2-3-2-2-2-2-2-2-2-2-4-2-2-2-2, with only 2 such departures.
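Counting the judders in those two cadences can be sketched as follows (hypothetical helper, treating each frame that departs from the dominant cadence as one judder):

```python
from collections import Counter

def judder_count(cadence):
    """Count frames whose VB-period count departs from the dominant
    (most common) cadence -- each departure is felt as a judder."""
    baseline, _ = Counter(cadence).most_common(1)[0]
    return sum(1 for n in cadence if n != baseline)

uncapped = [2, 2, 3, 2, 1, 2, 2, 1, 2, 2, 1, 4, 1, 2, 1, 2]
capped   = [2, 2, 3, 2, 2, 2, 2, 2, 2, 2, 2, 4, 2, 2, 2, 2]
print(judder_count(uncapped), judder_count(capped))  # 7 2
```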

    Oh well, me talking again.
     
  16. Camp

    Camp Cinematographer

    Joined:
    Dec 3, 1999
    Messages:
    2,301
    Likes Received:
    0
    Trophy Points:
    0
    Bjoern,

    Thanks for the informative post. I don't see how that's in response to my post though. ??
     
  17. Paul Richardson

    Paul Richardson Second Unit

    Joined:
    Jun 25, 2000
    Messages:
    412
    Likes Received:
    0
    Trophy Points:
    0
I think the bottom line is that it's easy to see when a game has a changing framerate... like when I train 200 monsters to the town gates in EverQuest and everything slows to a crawl. However, I imagine that in a double-blind controlled study, very few people could tell the difference between a game that runs consistently at 30fps and one that runs consistently at 60fps.
     
  18. MikeAlletto

    MikeAlletto Cinematographer

    Joined:
    Mar 11, 2000
    Messages:
    2,369
    Likes Received:
    0
    Trophy Points:
    0
Man Bjoern, I'm not gonna read all that. Someone want to give me the Cliffs Notes version?
Basically, what you're saying is that if it's smooth it's 60, and if it's not then it's 30 or less. No ifs, ands, or buts about it?
Who defines smooth, then? What is smooth to me may not be smooth to someone else.
     
  19. Dave Falasco

    Dave Falasco Screenwriter

    Joined:
    Oct 2, 2000
    Messages:
    1,185
    Likes Received:
    0
    Trophy Points:
    0
Thank you, Bjoern, for that post! Really, that's the best explanation of framerates I've ever gotten. This statement in particular:
     
  20. BrianB

    BrianB Producer

    Joined:
    Apr 29, 2000
    Messages:
    5,205
    Likes Received:
    1
    Trophy Points:
    0
     
