RE: [Algorithms] FPS Questions
From: Stephen J B. <sj...@li...> - 2000-08-01 17:33:40
On Mon, 31 Jul 2000, Graham S. Rhodes wrote:
> I like Jason's post. It is fairly consistent with my own observations, which
> I describe here. He may not agree with my whole discussion, though.
This stuff is well researched and understood.
Flight simulator people have been aware of the issue for 20 years.
> The old "fact" that 25-30 fps is enough for animation that appears
> continuous to the human eye was discovered when viewing movies of fairly
> slow moving objects. Slow compared to FPS games during attack sequences, for
> example. It just happens that 60 fps or 100 fps or 25 or 30 fps just works
> out nicely for most current games.
There is also a BIG difference between 24Hz movies and (say) 30Hz video.
In a 24Hz movie, each image is only drawn once - so there are 24 separate
still frames per second.
<apologies to people who've heard me explain this many times before>
In a 30Hz video animation on a 60Hz CRT, each image is drawn TWICE by
the CRT - so there are 60 separate still frames per second - with two
consecutive identical images being painted onto the phosphor between
successive swapbuffer calls.
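A minimal sketch of that in C++ (just the 60Hz refresh and 30Hz swap
rates from above - nothing about it is specific to any particular
hardware):

  /* Which rendered frame does each 60Hz CRT refresh scan out when the
     application only swaps buffers at 30Hz? */
  #include <cstdio>

  int main()
  {
      const int refresh_hz = 60;   /* CRT refresh rate          */
      const int update_hz  = 30;   /* buffer-swap (render) rate */

      for (int refresh = 0; refresh < 8; ++refresh)
      {
          /* Integer math: index of the rendered image on screen now. */
          int frame = refresh * update_hz / refresh_hz;
          std::printf("refresh %d shows rendered frame %d\n", refresh, frame);
      }
      /* Every frame index appears on two consecutive refreshes - the two
         identical stills painted between swapbuffer calls. */
      return 0;
  }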
That distinction might not seem to matter - after all, 30Hz graphics
paint more still frames per second than 24Hz movies, so you'd expect
them to look better - however, they don't - and here is why:
Our brains evolved for tasks like watching a small furry animal running
around in a forest - then letting us throw a rock at it and stand a good
chance of hitting it.
That means that when the cute bunny runs along, and tree trunks, bushes,
etc. get between it and us, we have to mentally interpolate its position
in order to fill in the gaps in the imagery coming from our eyes. If
your brain didn't do that, you'd think that you were seeing a set of
separate disconnected events - and throwing a rock would be impossible.
That hardwired interpolation is what screws up our perception of 30Hz
animation on 60Hz video.
Look at a graph of position against time for an object moving
at a constant speed - displayed using 30Hz graphics on a 60Hz video
screen:
      |
      |                           . .
  ^   |                       . .
  |   |                   . .
posn  |               . .
      |           . .
      |       . .
      |   . .
      |____________________________
                  time ->
Linear motion - but two consecutive images at each position - right?
Well, when your brain tries to interpolate between those still
images, it tries to make a straight line through the points,
but it can't - it's a stair-step function.
However, one way to view this graph is as TWO parallel straight
lines. You can easily draw two parallel lines through those
points - and they fit the data perfectly.
That's what your brain does. So you don't see ONE object moving
jerkily - you see TWO objects moving smoothly but flickering
at 30Hz. This means that all fast moving objects double-image
at 30Hz - and they flicker too.
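Here is a minimal sketch that generates the data behind that graph - a
constant-speed object, re-rendered at 30Hz, displayed at 60Hz; the
speed value is arbitrary:

  /* Positions actually shown on screen for an object moving at constant
     speed when it is only re-rendered at 30Hz on a 60Hz display. */
  #include <cstdio>

  int main()
  {
      const int    refresh_hz = 60;
      const int    update_hz  = 30;
      const double speed      = 120.0;   /* arbitrary units per second */

      for (int refresh = 0; refresh < 10; ++refresh)
      {
          double t_display = (double)refresh / refresh_hz;      /* scanout time         */
          int    frame     = refresh * update_hz / refresh_hz;  /* image on screen      */
          double t_render  = (double)frame / update_hz;         /* when it was computed */
          double pos       = speed * t_render;
          std::printf("t=%.4f  pos=%.1f\n", t_display, pos);
      }
      /* Even-numbered refreshes lie on the line pos = speed * t;
         odd-numbered ones lie on the parallel line
         pos = speed * (t - 1/refresh_hz).  Those are the two straight
         lines the brain can fit to the stair-step. */
      return 0;
  }

Plot pos against t and you get the stair-step graph above - and the two
parallel lines that fit it exactly.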
When the graphics are only updated at 20Hz, you get triple-imaging, as
you'd expect - each image is painted by three consecutive 60Hz refreshes.
But there comes a point, at poor enough frame rates, at which your brain
accepts that this is jerky motion and not multiple moving objects.
For me, that sometimes happens at 20Hz and sometimes at 15. It seems
to depend on the ambient lighting. Somewhere around that speed, I can
sometimes 'flip' my brain between seeing multiple images and jerkiness
by concentrating on that - just like you can make some optical illusions
flip between two states by staring hard at them.
Different people hit this effect at different speeds. One of my co-workers
can see quadruple-images at 15Hz - I've never seen that.
In a movie theater, each image is only painted once - so a perfect
straight line can be drawn through the points and no double-imaging
is apparent - although the flicker is bad and the heavy motion blur
can get ugly.
> Does anyone have any reference that states that the human eye *cannot*
> resolve *much* better than, say, 100 fps?
No - to the contrary. There are people who can resolve much better than
100Hz. If you run a CRT at 120Hz but update the graphics at 60, the
double imaging comes back. That proves that your eyes are still seeing
a non-continuous image - even if your higher cognitive centers don't
notice it. It would be interesting to do that with 200Hz video and 100Hz
rendering - but I don't have a CRT that'll go that fast.
I suspect the limit would be the persistence of the phosphor rather than
a limitation of the eye/brain. All those neurons are firing asynchronously,
so there will always be some that see the interruption to the light.
The idea that the eye/brain somehow doesn't see the black periods between
the redraws at over 20Hz is a fallacy. What happens is that the
interrupted image is reconstructed into smooth interpolated motion for
your higher level cognitive functions...but when (redraw rate != video
rate) your mental interpolator gets confused because it evolved to
throw rocks at rabbits running behind trees.
Obviously humanity may sometime evolve to be able to interpolate 30Hz
images - but that presumes that people who play a lot of video games
successfully will be more likely to pass on their genes to the next
generation. The reality is probably the opposite of that! :-)
> What happens in 5 years when we all have monitors running at 2000 pixels
> horizontally? Large immersive displays such as CAVE's already run at 96
> frames per second at that resolution... What happens when we are all using
> stereoscopic hardware all the time, so that each eye sees half the total
> frame rate. 100fps becomes 50fps. We will then need 200fps to match today's
> 100fps..... Food for thought.
No - that's not true. We've run 3500x2200 pixel screens, and 60Hz still
looks just as good there as it does on a 1000-pixel-wide screen.
Steve Baker (817)619-2657 (Vox/Vox-Mail)
L3Com/Link Simulation & Training (817)619-2466 (Fax)
Work: sj...@li... http://www.link.com
Home: sjb...@ai... http://web2.airmail.net/sjbaker1