# how many fps can the human eye see?

Discussion in 'Archived Threads 2001-2004' started by Christ Reynolds, Feb 25, 2003.

1. ### Christ Reynolds Producer

well, the question is in the title. i remember hearing that the figure was around 30 fps. but if that is the case, how is it that i can tell the difference between 30fps and 45 fps? if i am wrong with the figure, does anyone know the real one?

CJ

2. ### Dave Poehlman Producer

I'm no optometrist (but I play one on the HTF), but there may be a couple of answers to your question. I think the human eye can detect a flicker under 16 times per second; above that, it begins to look like steady light. However, duping the brain into thinking it's seeing fluid movement requires a much faster frame rate. A good example is to wave your hand in front of your computer screen: the silhouette of your hand will look like you have several fingers.

Now, that's probably a 60Hz flicker you're looking at. It looks like a steady light when you're reading this post but fast movement still breaks up at that rate.

I have no idea how fast it needs to go before the movement looks completely fluid... it probably depends on the speed of the movement. A slow moving object may look fine at film's 24 fps. Maybe someone here can answer that question.
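The "several fingers" effect Dave describes is stroboscopic: each brief flash of the screen freezes the hand once, so the number of silhouettes you see is roughly the flash rate times how long the hand is in view. A rough sketch of that arithmetic (the numbers are illustrative, not measured, and it assumes the monitor emits a short flash each refresh, as a CRT does):

```python
def silhouette_count(refresh_hz: float, sweep_seconds: float) -> int:
    """Each refresh 'freezes' the hand once, so a sweep across the
    screen gets sampled refresh_hz * sweep_seconds times."""
    return round(refresh_hz * sweep_seconds)

# A quarter-second hand wave in front of a 60 Hz CRT:
print(silhouette_count(60, 0.25))  # -> 15 frozen images of the hand
```

At a higher refresh the images crowd closer together and start to blur into a smear, which is why the effect is less jarring on faster displays.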

3. ### MarkHastings Executive Producer

It really depends on the motion. Some say that 24 fps is a great "middle ground". To follow fast action from video sources like sports events, some say 50-60 fps should be used. I've even heard some say the brain can detect frames at 200-300 fps.

This site (I believe) was translated into English, so it sometimes gets a bit confusing, but the info seems to be pretty good:
http://www.100fps.com/how_many_frame...humans_see.htm

4. ### Mike Lenthol Second Unit


5. ### AaronMg Stunt Coordinator

I think a normal television is somewhere between 28 and 32.

6. ### Kris McLaughlin Stunt Coordinator

I think a normal hand is somewhere between 4 and 6.

7. ### Shayne Lebrun Screenwriter

The human eye doesn't work in terms of 'frames per second,' so you can't really give an answer.

Film gets away with a low fps count, by the way, due to motion blur. Video games don't do motion blur, and the brain notices that and finds it jarring. This is why 3dFX tried to hype up its T-Buffer technology instead of just blindly upping the frame rates; it would have let them make a 30 fps image look BETTER than a 200 fps one.
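The accumulation idea behind that kind of motion blur is simple: render several sub-frame samples per displayed frame and average them. A minimal sketch of the technique, where `render_at()` is a hypothetical stand-in for a real renderer (here it just draws a moving 1-pixel spot on a 5-pixel scanline):

```python
def render_at(t: float) -> list[float]:
    """Hypothetical renderer: a 1-pixel-wide bright spot moving across
    a 5-pixel scanline, one pixel per time unit."""
    frame = [0.0] * 5
    frame[int(t) % 5] = 1.0
    return frame

def motion_blurred_frame(t0: float, shutter: float, samples: int) -> list[float]:
    """Average `samples` renders spread across the shutter interval."""
    acc = [0.0] * 5
    for i in range(samples):
        sub = render_at(t0 + shutter * i / samples)
        acc = [a + s for a, s in zip(acc, sub)]
    return [a / samples for a in acc]

# With a shutter covering 4 time units, the moving spot smears across
# the pixels it passed through instead of snapping between them:
print(motion_blurred_frame(0.0, 4.0, 4))  # -> [0.25, 0.25, 0.25, 0.25, 0.0]
```

A film camera gets this for free because its shutter stays open for part of each frame; a game that renders instantaneous snapshots has to fake it by averaging sub-frames like this.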

8. ### Ben_Hud Stunt Coordinator

As an avid video game player I can assert that without motion blur, in games like Counter-Strike, I enjoy playing at 75-100 fps at least, and it becomes terrible to play at 50 fps or less once you get used to more.

9. ### Mark Fitzsimmons Supporting Actor

Wow, this thread really clears some things up. I always wondered how 24fps film looked smooth but the same framerate in a videogame looked terrible.

60 Hz really bothers my eyes; I like 80 Hz or above.

Here's a little test you can do. Look up at your ceiling while keeping the computer monitor in the bottom of your field of vision, and see if you can detect the flicker.

I can't on my monitor at this refresh rate, but if I dropped it I'd see it and it'd bother me a lot. Can anyone explain why I don't notice the refresh as much when looking directly at the monitor, but I do when looking away?

10. ### Scott L Producer

Several guys hit the nail on the head about motion blur; it's VERY important. It was pointed out earlier that the Final Fantasy movie, although 24 fps, looks great because the CGI artists used motion blur when going from one frame to the next. If they didn't, it would look choppier than GTA3 on the PS2.

One thing I noticed about the plasmas they show at Best Buy: I'm not sure what the response time is on those things, but although the picture isn't as clear as a CRT's, it looks 10x more fluid. I'm guessing a simulated motion blur occurs as the pixels light up.

Have any graphics card makers made any advancements in this area lately?

11. ### Eric Kahn Guest

Current film is shot at 24 fps but actually shown in a theater at 48 flashes per second; the projector shows each frame twice to reduce the perceived "flicker". Don't ask me why, I just remember this from a film class a long time ago.

Television is 30 frames per second but actually 60 fields per second, with 2 fields making up a frame. The TV "draws" the even lines in the picture, then comes back and "draws" the odd lines; that is why you hear the term "interlaced" for normal TV pictures.
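The two timings Eric describes work out like this (a toy sketch with an 8-line picture; the numbers are just the ones from his post, not a broadcast spec):

```python
# Film: 24 fps source, each frame flashed twice -> 48 flashes/second.
film_fps = 24
flashes_per_frame = 2
print(film_fps * flashes_per_frame)  # -> 48

# Interlaced TV: 60 fields/second, two fields per full frame.
fields_per_second = 60
fields_per_frame = 2
print(fields_per_second // fields_per_frame)  # -> 30 full frames/second

# Which scanlines each field draws, for a toy 8-line picture:
lines = list(range(8))
even_field = lines[0::2]  # drawn on one pass
odd_field = lines[1::2]   # drawn on the return pass
print(even_field, odd_field)  # -> [0, 2, 4, 6] [1, 3, 5, 7]
```

So the flicker rate the eye sees (48 flashes, 60 fields) is double the rate at which new pictures actually arrive (24 frames, 30 frames), which is the whole trick in both cases.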

12. ### Yee-Ming Producer


13. ### BrianW Cinematographer


14. ### Chris PC Producer

Films also cheat to get away with 24 frames per second by flashing each frame twice per pass. It's really 48 flashes per second, with every frame shown twice: 24 x 2, as it were.
