Thread: Re: [Algorithms] FPS Questions (Page 2)
From: Kent Q. <ken...@co...> - 2000-07-31 21:57:54
Tom Forsyth wrote:
> Oh yes - on monitor refreshes, 60Hz hurts immediately, <70Hz hurts after a
> while, and 85Hz is nice.

Ditto.

> Incidentally, 24fps panning at the cinema does EVIL things to my eyes - can
> no-one else see it? It's really really awful and stuttery and blurry and yuk
> - ruins a good movie. Roll on digital projection....

I'm OK if I sit fairly far from the screen, mostly because of motion blur.
But put me in an Omni theater (the hemispherical movie theatres), and I have
a really hard time with the flickering on the periphery. It's painful.

And does anyone else notice when they use the high-speed digital camera on
the pitcher in baseball? I can always tell when they switch to that camera,
because the motion blur is gone. It looks WRONG.

--
Kent Quirk | CogniToy: Intelligent toys for intelligent minds.
ken...@co... | http://www.cognitoy.com/
From: Steven C. <sc...@ti...> - 2000-07-31 22:15:41
Kent Quirk wrote:
> And does anyone else notice when they use the high-speed digital camera
> on the pitcher in baseball? I can always tell when they switch to that
> camera, because the motion blur is gone. It looks WRONG.

What I really hate is when you see, for example, lamp posts flashing past.
They just jump from one location to the next. This is really obvious in many
sporting events where the camera tracks the competitors, such as running
etc... It is so obvious when digital video cameras with very high shutter
speeds are being used. There is a lot to be said for old technology :) I
hate it.

Anyhow, just thought I'd chip in :)

Regards,
Steve

--
Steve Clynes    sc...@ti...
"Vital papers demonstrate their vitality by moving from where you left
them to where you can't find them."
From: <sro...@te...> - 2000-07-31 20:58:34
You're right. In Quake 3, go in front of a mirror and you will see your fps
drop (at 1024x768x32) from 90 to 60; turn to look at a wall and the fps goes
back to 90. So getting as many fps as you can only helps your performance.

Corrosif,

"Ignore demands from the marketing department to release premature shots.
These people are for the most part clueless, and are only trying to justify
their job." - George Broussard, President of 3DRealms.

-----Original Message-----
From: Mats Lundberg
Sent: Monday, July 31, 2000 4:24 PM
Subject: Re: [Algorithms] FPS Questions

Ever played Classic Quake at 30 fps? I have, and you really Feel the
difference between 30 and 60 fps, or 30 and 100 fps... It's so much
smoother...

You also have to remember that the fps number is just the __average__
number of frames. There's a lot more work involved in rendering when you
see a whole room with furniture, plants etc. than just a single wall. So if
your average is 30 fps, you get 30+ when you see a wall and 30- when you
see a more complex scene. (Hope ye get the idea... I'm no good
storyteller...)

> > 60fps is the ideal target.
>
> I blindly follow the masses here, but I can't help wondering why...
> Anything above 24-25 fps will not be noticed by the human eye, 30 fps
> animations look _really_ smooth. So why are we all targeting 60 fps?
> Shouldn't we rather crank up the detail some more and all target 30 fps?
> What makes a 60 fps game more playable than a 30 fps game?
>
> Jim Offerman
>
> Innovade
> - designing the designer

_______________________________________________
GDAlgorithms-list mailing list
GDA...@li...
http://lists.sourceforge.net/mailman/listinfo/gdalgorithms-list
From: <Chr...@Pl...> - 2000-07-31 22:52:33
I was going to stay out of this as I don't have time to do it full justice,
but I feel I have to comment because a lot of incorrect stuff is being
passed around in this thread. There are two concepts here that people keep
getting mixed up:

1) Apparent motion
2) Flicker fusion

Apparent motion is the term used to describe the visual phenomenon where
the display of distinct static images is perceived as continuous motion.
This phenomenon, known as the 'phi phenomenon', first appears at rather low
frame rates, which is why movies are 24 fps and cartoons are typically half
of that.

Flicker fusion, however, happens at the frequency known as the critical
flicker fusion frequency (CFF), and this is where flicker disappears. The
CFF is much higher, around 60Hz, but varies from person to person, and
increases with brightness, viewing angle (the eye being more sensitive to
flicker in the periphery), etc. This is why movie frames are shuttered
*twice*, to get an effective rate of 48 images per second, which works fine
for a dark cinema. (If movie images were displayed at 24 flashes per
second, things would flicker horribly.) For a bright living room, around
60Hz is more appropriate, and for a large computer screen you want a much
higher refresh rate because the viewing angle is larger (not to mention
that a computer screen has a faster-decaying phosphor than a TV screen).

The link someone posted earlier is to a horribly incorrect page and should
be ignored. Instead, here are links to a few good pages:

http://www.futuretech.vuurwerk.nl/fps.html
http://www.xabcs.demon.co.uk/ergonite/flicfaq.htm
http://www.microsoft.com/hwdev/TVBROADCAST/TempRate.htm
http://www.search.eb.com/bol/topic?eu=119396&sctn=4#s_top
http://www.search.eb.com/bol/topic?eu=55424&sctn=1#160276
http://www.britannica.com/bcom/eb/article/6/0,5716,119396+3,00.html
http://www.britannica.com/bcom/eb/article/5/0,5716,117505+13,00.html

There's also a nice illustration of the phi phenomenon on this page:

http://www.eecs.tufts.edu/~kreisman/phi/index.html

Christer Ericson
SCEA, Santa Monica
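[The brightness dependence of the CFF described above is often approximated
by the Ferry-Porter law, which says the CFF rises roughly linearly with the
log of luminance. A minimal sketch follows; the fit constants are
illustrative values, not figures from this thread, and real ones vary by
observer and by retinal eccentricity:]

```python
import math

def critical_flicker_fusion(luminance_cd_m2, a=12.6, b=36.3):
    """Ferry-Porter law: CFF rises roughly linearly with log luminance.
    The constants a and b are illustrative fit values only; real ones
    depend on the observer and on retinal eccentricity."""
    return a * math.log10(luminance_cd_m2) + b

# Dim cinema screen vs. a bright living-room display:
print(round(critical_flicker_fusion(10.0), 1))    # ~48.9 Hz
print(round(critical_flicker_fusion(100.0), 1))   # ~61.5 Hz
```

[With these particular constants the numbers happen to line up with the
48-images-per-second cinema and ~60Hz living-room figures quoted above.]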
From: gl <gl...@nt...> - 2000-07-31 23:20:43
> Apparent motion is the term used to describe the visual phenomenon where
> the display of different distinct static images is perceived as continuous
> motion. This phenomenon known as the 'phi phenomenon', first happens at
> rather low frame rates, which is why movies are 24 fps and cartoons are
> typically half of that.

Yes, but motion in TV and film cannot be compared to motion in computer
graphics, due to motion blur (see my other post). The biggest mistake in
these arguments is justifying low graphics fps with TV/film fps - it's not
a valid comparison.
--
gl
From: jason w. <jas...@po...> - 2000-07-31 23:48:35
> rather low frame rates, which is why movies are 24 fps and cartoons are
> typically half of that.

Don't use traditional cartooning frame rates as an example... animators use
*lots* of tricks in order to fool human perception. More recently, I've
started to see a lot of web Flash animators rediscovering these
fundamentals of animation to compensate for low frame rates. Console games
often use the same tricks... especially fighting games.
From: <Chr...@Pl...> - 2000-08-01 02:25:34
Jason Watkins wrote:
> ohh.. btw, most cinema projectors actually flash each frame twice before
> advancing to the next. they only started doing this as theaters started
> getting larger and larger screens.. as the screen starts to dominate more
> and more of the viewer's fov, the viewer becomes more and more sensitive
> to flashing, aliasing and so on. good to keep in mind when the norm right
> now is a gamer sitting 2" away from a 19" monitor :)

No, they didn't start "doing this as theatres started getting larger and
larger screens" - I don't know where you got that from. They started double
shuttering at the same time they made the move from the silent era's ca. 16
fps (which was shown triple shuttered) to today's 24 fps (which is
typically double shuttered). Incidentally, 16 * 3 = 24 * 2 = 48 flashes per
second either way.

Christer Ericson
SCEA, Santa Monica
From: jason w. <jas...@po...> - 2000-08-01 03:03:59
> No, they didn't start "doing this as theatres started getting larger
> and larger screens" - I don't know where you got that from.

I got it from a projectionist friend, if you really care to know. Nice to
know the real deal, tho.
From: Mark A. <MA...@ac...> - 2000-08-01 08:35:36
Why not just ask the players what they want and then scale the engine to
that speed? If they want 'liquidity' at high frame rates, then scale back
and use lower level-of-detail models. If they can't tell the difference,
then they'll pull the frame rate down to 30fps and get better-looking
visuals.

> -----Original Message-----
> From: Steve Wood [mailto:Ste...@im...]
> Sent: 31 July 2000 21:22
> Subject: RE: [Algorithms] FPS Questions
>
> > -----Original Message-----
> > From: Jim Offerman [mailto:j.o...@in...]
> >
> > > 60fps is the ideal target.
> >
> > I blindly follow the masses here, but I can't help wondering why...
> > [...]
>
> I think the ideal target fps needs to be detailed further, since fps is
> relative to the bits per pixel and the window or screen size. I'm
> assuming that Keith's ideal 60fps is at 640x480x32 fullscreen... so that
> someone can play it at 800x600x32 and get about 30fps, or 1024x768 and
> get the 24-25fps.
>
> R&R
From: Jim O. <j.o...@in...> - 2000-08-01 15:09:27
I gave my original query some more thought, and I think I have found a
plausible explanation as to why 60 fps might be perceived as smoother than
30 fps.

The human eye in many ways works like a camera (note: actually, it is the
other way around): the retina is exposed to light for a small period of
time, and then the accumulated light signal is transmitted to the brain
before the retina is exposed again.

Let's assume for a while that the eyes record at a steady 30 fps. If our
game also runs at 30 fps, the eye sees one frame at a time. However, if the
game runs at 60 fps, the eye sees two frames at a time, which are blurred
together, resulting in a form of motion blur.

Another important aspect is that your eyes will _never_ be in sync with the
frame rate of your game, so it is possible that there exists a moment where
your eye records a frame but there is nothing to record (since the monitor
is doing a vblank). This will certainly be perceived as a discontinuity in
the ongoing motion on the screen. The higher the frame rate, the less
likely such situations are to occur.

Finally, I must agree with the lower-latency factor, since while our eyes
may be relatively slow, our responses (generally) are _really_ fast.
Especially if someone is trained in some response, it becomes a reflex...
An experienced FPS player might be using his brain as little as 25% of the
time; the rest of the time, his actions are merely reflexes. Hence the
phrase 'mindless killer' ;-).

> Why not just ask the player what they want and then scale the engine to
> that speed. If they want 'liquidity' at high frame rates then scale back
> and use lower level of detail models. If they can't tell the difference
> then they'll pull the frame rate down to 30fps and get better looking
> visuals.

We usually achieve this by offering the player some controls over detail.
Though it might be nice if your engine included a little util which finds
the optimum detail settings to get n fps on a given machine... but I can
tell you that _won't_ be very easy, so better put it on the 'things to do
when I have time left' list ;-).

Jim Offerman

Innovade
- designing the designer
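[The auto-tuning util Jim imagines can be sketched simply if you assume a
benchmark scene whose frame rate falls monotonically as a single detail
scalar rises. `benchmark_fps` here is a hypothetical stand-in for actually
rendering and timing the scene; a real tool would also have to cope with
noisy timings and multiple detail axes:]

```python
def tune_detail(benchmark_fps, target_fps, lo=0.0, hi=1.0, iters=12):
    """Binary-search a scalar detail level (0 = lowest, 1 = highest) so
    that the benchmark scene renders at roughly target_fps. Assumes
    benchmark_fps(detail) is monotonically decreasing in detail."""
    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if benchmark_fps(mid) >= target_fps:
            lo = mid          # still fast enough: we can afford more detail
        else:
            hi = mid          # too slow: back off
    return lo

# Hypothetical machine where fps falls linearly from 120 to 20 with detail:
fake_bench = lambda d: 120.0 - 100.0 * d
print(round(tune_detail(fake_bench, 60.0), 2))  # -> 0.6
```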
From: Tom F. <to...@mu...> - 2000-08-01 15:42:40
...except that eyes don't have a "shutter" or a "framerate" - they have a
response time, but it's continuous, not discrete like a movie camera. So
you can't say that the eyes are "out of sync" - they don't _have_ a sync.

As for finding the optimum detail settings to get "n" Hz on a machine, with
continuous LoD methods you can get quite close (+/- 5Hz is not too hard to
achieve 95% of the time). You have some sensible defaults for rendering
quality (i.e. how many passes), according to card type, allow the user to
override them if they feel like it, and then CLOD up or down to get the
right frame rate (within sane limits).

Tom Forsyth - Muckyfoot bloke.
Whizzing and pasting and pooting through the day.

> -----Original Message-----
> From: Jim Offerman [mailto:j.o...@in...]
> Sent: 01 August 2000 15:58
> Subject: Re: [Algorithms] FPS Questions
>
> I gave my original query some more thought, and I think I have found a
> plausible explanation as to why 60 fps might be perceived as being
> smoother than 30 fps.
> [rest of quoted message trimmed]
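[The "CLOD up or down to get the right frame rate" loop Tom describes is
essentially a feedback controller. A minimal sketch under stated
assumptions: a single global LoD bias, a made-up gain, and a toy frame-time
model; a real engine would tune the gain and clamps to its own content:]

```python
def update_lod_bias(bias, frame_ms, target_hz=60.0, gain=0.02,
                    lo=0.25, hi=4.0):
    """One step of a proportional controller for a global LoD bias.
    bias > 1 coarsens geometry, bias < 1 refines it. Called once per
    frame with the last frame's wall-clock time in milliseconds."""
    target_ms = 1000.0 / target_hz
    error = frame_ms - target_ms        # positive -> running too slow
    bias *= 1.0 + gain * error / target_ms
    return max(lo, min(hi, bias))       # clamp to "sane limits"

# Toy model: frame time inversely proportional to bias, 25 ms at bias 1.
bias = 1.0
for _ in range(400):
    bias = update_lod_bias(bias, 25.0 / bias)
print(round(bias, 2))   # -> 1.5  (25 ms / 1.5 ~= 16.7 ms ~= 60 Hz)
```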
From: Jim O. <j.o...@in...> - 2000-08-01 18:21:23
> ...except that eyes don't have a "shutter" or a "framerate" - they have a
> response time, but it's continuous, not discrete like a movie camera. So
> you can't say that the eyes are "out of sync" - they don't _have_ a sync.

Not entirely true: the cells in your retina do need some time to
'discharge' (can't remember the proper term from my biology classes...)
after they have received some light (and require some time to 'charge'
before a new signal is sent to the brain). This is related to the fact that
your eyes need time to adjust when you go from darkness into the light
(though that is primarily caused by your pupils being wide open in the
dark, so when you step into the light you experience a sort of overflow
effect).

I am pretty convinced that the eyes send discrete images to the brain,
though maybe not as discrete as the frames in a computer game. However, the
brain fills in the gaps. In fact, your brain does this all the time, with
all sensory input. Most of the time, when you see a certain object, you
only see it because the brain _expects_ to see that object.

Jim Offerman

Innovade
- designing the designer
From: Conor S. <cs...@tp...> - 2000-08-02 03:50:26
This is where I make evil confessions of things I've done :)

Well, the main evil thing is actually using a form of AI system to
calculate LoD levels on the fly to get a constant frame rate in specific
environments: a genetic algorithm which modified a neural-network-style
structure on the fly. The neural-network-style object used a kd-tree
structure with output vectors stored at the corner points of the bounding
hyper-boxes. The current input vector was pushed down the tree until it
found its node, and linear interpolation between the closest output values
was used. If the frame speed was outside an error metric, then a modifier
was used. The modifiers were themselves subject to genetic termination: if
a modifier didn't fix a problem, it was modified as well. The system worked
quite well. The kd-tree structure allowed additional non-linear information
to be accounted for.

'Twas a rather evil thing to do. Maybe one day I'll do a demo. The output
vectors simply allowed me to map LoD to priorities (e.g., the camera
distance) and complexity (e.g., how long one LoD element takes) to output a
fairly reasonable estimate in most circumstances.

I don't know why I feel I need to voice this here and now, but I do :) And
I will probably be laughed at for the sheer insanity of the idea ;)
Mwhuhahahahaha.

Conor Stokes

> As for finding the optimum details settings to get "n"Hz on a machine,
> with continuous LoD methods, you can get quite close (+/- 5Hz is not too
> hard to achieve 95% of the time). You have some sensible defaults for
> rendering quality (i.e. how many passes), according to card type, allow
> the user to override them if they feel like it, and then CLOD up or down
> to get the right frame rate (within sane limits).
>
> Tom Forsyth - Muckyfoot bloke.
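[Conor's full structure is more elaborate than anything worth reproducing
here, but the interpolating-lookup part can be sketched in one dimension.
This toy version just stores hypothetical (camera distance, LoD) samples
and lerps between the nearest pair; none of the kd-tree or genetic
machinery is attempted:]

```python
import bisect

class LodPredictor:
    """Toy 1-D stand-in for the interpolating lookup described above:
    store (camera_distance, lod_level) samples sorted by distance and
    linearly interpolate between the two nearest when queried."""
    def __init__(self, samples):
        self.xs, self.ys = zip(*sorted(samples))

    def __call__(self, x):
        i = bisect.bisect_left(self.xs, x)
        if i == 0:
            return self.ys[0]           # before first sample: clamp
        if i == len(self.xs):
            return self.ys[-1]          # past last sample: clamp
        x0, x1 = self.xs[i - 1], self.xs[i]
        t = (x - x0) / (x1 - x0)        # lerp between bracketing samples
        return self.ys[i - 1] * (1 - t) + self.ys[i] * t

predict = LodPredictor([(0.0, 4), (50.0, 2), (200.0, 0)])
print(predict(25.0))   # halfway between LoD 4 and LoD 2 -> 3.0
```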
From: Tom F. <to...@mu...> - 2000-08-01 17:24:36
Saccade - that was it - ta.

Except that if the graphics move fast, your eyes _will_ follow them (and
then "flick" back to the other side of the screen, inducing a saccade). If
they don't, that's a sign of neurological problems - it's one of the
low-level reflexes that keep your eyes tracking moving objects, and it's
very hard to override without defocussing your eyes. The things you learn
when both parents are medics... :-)

Tom Forsyth - Muckyfoot bloke.
Whizzing and pasting and pooting through the day.

> -----Original Message-----
> From: Stephen J Baker [mailto:sj...@li...]
> Sent: 01 August 2000 17:17
> Subject: RE: [Algorithms] FPS Questions
>
> On Mon, 31 Jul 2000, Tom Forsyth wrote:
>
> > Lower latency - that's what FPS players get from higher fps (excuse
> > the pun). They also say that you can spin faster and not have any gaps
> > in your visual coverage of a room. I'll have to trust them on that - I
> > have no problems scanning a room quickly on my fairly average setup,
> > which is probably going at around 25-30Hz. If you spin too fast, your
> > optical shutters kick in (I used to know what they were called -
> > anyone? - they stop you getting confused by rapid head movements).
>
> Saccade.
>
> But I don't think that spinning the graphics around fast will induce a
> saccade - your eyes have to move to make that happen.
>
> Steve Baker                        (817)619-2657 (Vox/Vox-Mail)
> L3Com/Link Simulation & Training   (817)619-2466 (Fax)
> Work: sj...@li...                   http://www.link.com
> Home: sjb...@ai...                  http://web2.airmail.net/sjbaker1
From: Tom F. <to...@mu...> - 2000-08-01 18:12:50
I like the theory - shame movie theaters show each image twice really,
otherwise it'd be a really good one. Though to be fair, I see double images
when movies pan (really bad flickery double images too - ugh) - so your
theory still holds.

"it evolved to throw rocks at rabbits running behind trees." How true. How
Quake :-)

Tom Forsyth - Muckyfoot bloke.
Whizzing and pasting and pooting through the day.

> -----Original Message-----
> From: Stephen J Baker [mailto:sj...@li...]
> Sent: 01 August 2000 18:09
> Subject: RE: [Algorithms] FPS Questions
>
> On Mon, 31 Jul 2000, Graham S. Rhodes wrote:
>
> > I like Jason's post. It is fairly consistent with my own observations,
> > which I describe here. He may not agree with my whole discussion,
> > though.
>
> This stuff is well researched and understood. Flight simulator people
> have been aware of the issue for 20 years.
>
> > The old "fact" that 25-30 fps is enough for animation that appears
> > continuous to the human eye was discovered when viewing movies of
> > fairly slow moving objects. Slow compared to FPS games during attack
> > sequences, for example. It just happens that 60 fps or 100 fps or 25
> > or 30 fps just works out nicely for most current games.
>
> There is also a BIG difference between 24Hz movies and (say) 30Hz video.
>
> In a 24Hz movie, each image is only drawn once - so there are 24
> separate still frames.
>
> <apologies to people who've heard me explain this many times before>
>
> In a 30Hz video animation on a 60Hz CRT, each image is drawn TWICE by
> the CRT - so there are 60 separate still frames - with two consecutive
> identical images being painted onto the phosphor between each swapbuffer
> call.
>
> That distinction might not seem to matter - so you'd expect 30Hz
> graphics to look better than 24Hz movies - however, they don't - and
> here is why:
>
> Our brains evolved for tasks like watching a small furry animal running
> around in a forest - then letting us throw a rock at it and stand a good
> chance of hitting it.
>
> That means that when the cute bunny runs along and tree trunks, bushes,
> etc. get between it and us, we have to mentally interpolate its position
> in order to fill in the gaps in the imagery coming from our eyes. If
> your brain didn't do that, you'd think that you were seeing a set of
> separate disconnected events - and throwing a rock would be impossible.
>
> That hardwired interpolation is what screws up our perception of 30Hz
> animation on 60Hz video.
>
> Look at a graph of position against time for an object moving at a
> constant speed - displayed using 30Hz graphics on a 60Hz video screen:
>
>        |
>        |                    . .
>   ^    |                . .
>   |    |            . .
>  posn  |        . .
>        |    . .
>        | . .
>        |____________________________
>                  time ->
>
> Linear motion - but two consecutive images at each position - right?
>
> Well, when your brain tries to interpolate between those still images,
> it tries to make a straight line through the points, but it can't - it's
> a stair-step function.
>
> However, one way to view this graph is as TWO parallel straight lines.
> You can easily draw two parallel lines through those points - and they
> fit the data perfectly.
>
> That's what your brain does. So you don't see ONE object moving jerkily
> - you see TWO objects moving smoothly but flickering at 30Hz. This means
> that all fast moving objects double-image at 30Hz - and they flicker
> too.
>
> When the graphics are only updated at 20Hz, you get triple-imaging, as
> you'd expect. But there comes a point at poor enough frame rates at
> which your brain accepts that this is jerky motion and not multiple
> moving objects.
>
> For me, that sometimes happens at 20Hz and sometimes at 15. It seems to
> depend on the ambient lighting. Somewhere around that speed, I can
> sometimes 'flip' my brain between seeing multiple images and jerkiness
> by concentrating on that - just like you can make some optical illusions
> flip between two states by staring hard at them.
>
> Different people hit this effect at different speeds. One of my
> co-workers can see quadruple-images at 15Hz - I've never seen that.
>
> In a movie theater, each image is only painted once - so a perfect
> straight line can be drawn through the points and no double-imaging is
> apparent - although the flicker is bad and the heavy motion blur can get
> ugly.
>
> > Does anyone have any reference that states that the human eye *cannot*
> > resolve *much* better than, say, 100 fps?
>
> No - to the contrary. There are people who can resolve much better than
> 100Hz. If you run a CRT at 120Hz but update the graphics at 60, the
> double imaging comes back. That proves that your eyes are still seeing a
> non-continuous image - even if your higher cognitive centers don't
> notice it. It would be interesting to do that with 200Hz video and 100Hz
> rendering - but I don't have a CRT that'll go that fast.
>
> I suspect the limit would be the persistence of the phosphor rather than
> a limitation of the eye/brain. All those neurons are firing
> asynchronously, so there will always be some that see the interruption
> to the light.
>
> The idea that the eye/brain somehow doesn't see the black periods
> between the redraws at over 20Hz is a fallacy. What happens is that the
> interrupted image is reconstructed into smooth interpolated motion for
> your higher-level cognitive functions... but when (redraw rate != video
> rate), your mental interpolator gets confused, because it evolved to
> throw rocks at rabbits running behind trees.
>
> Obviously humanity may sometime evolve to be able to interpolate 30Hz
> images - but that presumes that people who play a lot of video games
> successfully will be more likely to pass on their genes to the next
> generation. The reality is probably the opposite of that! :-)
>
> > What happens in 5 years when we all have monitors running at 2000
> > pixels horizontally? Large immersive displays such as CAVEs already
> > run at 96 frames per second at that resolution... What happens when we
> > are all using stereoscopic hardware all the time, so that each eye
> > sees half the total frame rate? 100fps becomes 50fps. We will then
> > need 200fps to match today's 100fps..... Food for thought.
>
> No - that's not true. We've run 3500x2200 pixel screens and 60Hz still
> looks just as good as it does on a 1000 pixel screen.
>
> Steve Baker
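[Steve's position-against-time graph is easy to reproduce numerically. For
an object that moves 10 pixels per game update but is updated at only half
the refresh rate, each position appears on two consecutive refreshes, and
the even- and odd-numbered refreshes each lie on a perfectly straight line:
his "two parallel lines". A tiny illustration:]

```python
# Position sampled at each 60 Hz refresh, with game updates at 30 Hz:
# refresh t shows the position computed at update t // 2.
positions = [(t // 2) * 10 for t in range(8)]
print(positions)            # [0, 0, 10, 10, 20, 20, 30, 30]

# Split by refresh parity: each sub-sequence advances at a constant rate,
# so the brain can fit two smooth parallel trajectories through the data.
even_refreshes = positions[0::2]   # [0, 10, 20, 30]
odd_refreshes = positions[1::2]    # [0, 10, 20, 30]
```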
From: Keith Z.L. <ke...@dr...> - 2000-08-01 18:22:42
Actually, if we are going to be nit-picky, the ~60Hz of TV is interlaced,
so you are seeing half-frames, with an actual "frame" rate of 29.97Hz.

Are movies also interlaced?? I thought I read somewhere that movies had
moved to a 48Hz refresh.

Also, none of this really matters to the computer frame rate discussion:
the end result is that people can see the difference between 30 and 60 fps,
and the change is significant.

Keith
From: Stephen J B. <sj...@li...> - 2000-08-01 22:53:25
On Tue, 1 Aug 2000, Keith Z. Leonard wrote:

> Actually, if we are going to be nit-picky the ~60Hz of TV is interlaced,
> so you are seeing half-frames, with an actual "frame" rate of 29.97Hz.

Well, each scanline is painted ~30 times a second - but it draws all the
odd-numbered lines in one 1/60th second period and all the even-numbered
scanlines in the next 1/60th second. Hence you see changes in the position
of objects 60 times a second - so double imaging shouldn't happen as it
would in a true 30Hz setup.

...and for some countries, it's 25Hz and 50Hz rather than the 30 and 60 we
have here in the USA.

What you get with TV is interlace flicker, where one-pixel-high lines only
show up every alternate field - at 30Hz. Fortunately, TV cameras and TV
graphics systems know about this and are very careful to deal with it.

There are also some very peculiar psychophysical effects when an object is
moving vertically up or down the screen at more or less exactly one
scanline per field - this is called 'entrainment' - and I don't understand
it. It makes some people very ill and makes other people misjudge the speed
of things.

So what do console games do about interlace artifacts? Are they actually
running at half the vertical resolution of broadcast TV - or do they have
fancy antialiasing filters to deal with it?

> Are movies also interlaced??

No - you can't interlace a film - it doesn't have a raster structure.

> I thought that I read somewhere that movies had moved to a 48Hz refresh.

So we are reliably informed (I didn't know that before) - they display the
same entire image twice, though - similar to PCs running at 30Hz.

> Also, none of this really matters to computer frame rate discussion, the
> end result is that people can see the difference between 30 and 60 fps,
> and the change is significant.

For fast-moving images - yes.

Steve Baker
From: <Lea...@en...> - 2000-08-02 00:10:16
> So what do console games do about interlace artifacts? Are they actually
> running at half the vertical resolution of broadcast TV - or do they have
> fancy antialiasing filters to deal with it?

It's just in the hardware that passes the image out to your TV... as far as
the console is concerned, it's just a set resolution, say 320x240. The
image is split by the hardware just before the video cable and after the
internal rasterization etc. hardware... Same as the ol' Amiga, Atari ST,
etc...

Leathal.
From: Stephen J B. <sj...@li...> - 2000-08-02 13:48:43
On Wed, 2 Aug 2000, Leath Muller wrote:

> > So what do console games do about interlace artifacts? Are they
> > actually running at half the vertical resolution of broadcast TV - or
> > do they have fancy antialiasing filters to deal with it?
>
> It's just in the HW to pass the image out to your TV... as far as the
> console is concerned, it's just a set resolution, say 320x240. The image
> is split at the HW just before the video cable and after the internal
> rasterization etc HW... Same as the ol' Amiga, Atari ST, etc...

Ah - that explains it. Since TVs have ~480 visible scanlines (NTSC) or ~576
(PAL), the console must be repeating each scanline twice at 320x240, so
there is no interlace issue and it can effectively pretend to be a 60Hz
non-interlaced display.

Steve Baker
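[If the console really does scan a 240-line framebuffer out over a
~480-line interlaced signal, the effect Steve infers can be sketched as
each source row feeding the matching line of both fields, so the two fields
carry identical pictures and the display behaves like a 60Hz
non-interlaced one. The function name is just for illustration:]

```python
def scanout_fields(rows_240):
    """Line-double a 240-row framebuffer onto a ~480-line interlaced
    display: each source row is emitted twice, so the odd field and the
    even field end up showing the same picture."""
    doubled = [row for row in rows_240 for _ in (0, 1)]   # 480 lines
    odd_field, even_field = doubled[0::2], doubled[1::2]
    return odd_field, even_field

odd, even = scanout_fields(list(range(240)))
print(odd == even)   # -> True: the two fields agree, no interlace twitter
print(len(odd))      # -> 240 lines per field
```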
From: Stephen J B. <sj...@li...> - 2000-08-01 22:39:31
On Tue, 1 Aug 2000, Tom Forsyth wrote:

> I like the theory - shame movie theaters show each image twice really,
> otherwise it'd be a really good one.

Yes - I didn't know they did that - but if you see double imaging in
movies, then I'm still right about the 30/60Hz issue in games, which is
what matters.

> Though to be fair I see double images when movies pan (really bad
> flickery double images too - ugh) - so your theory still holds.

Yes - I guess I've never noticed the double imaging in movies because:

a) Even 48Hz stinks.
b) The excessive motion blur used to compensate for 48Hz stinks too.

> "it evolved to throw rocks at rabbits running behind trees." How true.
> How Quake :-)

:-)

Steve Baker
From: Pallister, K. <kim...@in...> - 2000-08-01 20:43:36
|
I too, can boldly propegate the never ending thread! Tom's point is correct, but not as simple as that. The eye doesn't just have a 'response time' (time from when light hits the retina until the rods & cones on the retina do their thing and call up the brain to say they've seen something). The eye actually has a response curve, and it changes for different wavelengths (color) of light and amplitude. After a similarly never-ending thread on this subject (eye response, maximum frame rates, etc) about a year ago I went out and looked for research on it. Turns out there has been quite a bit, most by NASA, the airforce, and various optical/vision societies. A couple I did find were: Window of Visibility: a psychophysical theory of fidelity in time sampled visual motion displays (Watson et al, 85] The Optimal Motion Simulus (Watson et al, 94) Anyhow, if I remember correctly, what it comes down to is that the frame rate at which people can no longer perceive discrete frames is somewhere between 40 and 90 frames per second (yeah, I know some of you disagree and will say it's higher), and this depends on: - speed of objects moving - size of objects on screen - color of objects, background, contrast between - lighting - etc - etc and oh yeah, everyone's response curves are different. Which leads me to a question: do you folks think that the eyes & brain can be trained to better perceive such errors? The movies never used to bother me until I became a graphics programmer. Now it bugs me. Also, PAL didn't used to bother me, but now I find the lower frame rate intolerable, which is why I make a point of not watching TV when I am in the UK, but spend the time in pubs instead! Kim Pallister We will find a way or we will make one. - Hannibal > -----Original Message----- > From: Tom Forsyth [mailto:to...@mu...] > Sent: Tuesday, August 01, 2000 8:38 AM > To: gda...@li... 
> Subject: RE: [Algorithms] FPS Questions
>
> ...except that eyes don't have a "shutter" or a "framerate" - they have a
> response time, but it's continuous, not discrete like a movie camera. So
> you can't say that the eyes are "out of sync" - they don't _have_ a sync.
>
> As for finding the optimum detail settings to get "n"Hz on a machine, with
> continuous LoD methods, you can get quite close (+/- 5Hz is not too hard
> to achieve 95% of the time). You have some sensible defaults for rendering
> quality (i.e. how many passes), according to card type, allow the user to
> override them if they feel like it, and then CLOD up or down to get the
> right frame rate (within sane limits).
>
> Tom Forsyth - Muckyfoot bloke.
> Whizzing and pasting and pooting through the day.
>
> > -----Original Message-----
> > From: Jim Offerman [mailto:j.o...@in...]
> > Sent: 01 August 2000 15:58
> > To: Algorithms List
> > Subject: Re: [Algorithms] FPS Questions
> >
> > I gave my original query some more thought, and I think I have found a
> > plausible explanation as to why 60 fps might be perceived as being
> > smoother than 30 fps.
> >
> > The human eye in many ways works like a camera (note: actually, it is
> > the other way around): the retina is exposed to light for a small period
> > of time and then the accumulated light signal is transmitted to the
> > brain before the retina is exposed again.
> >
> > Let's assume for a while that the eyes record at a steady 30 fps. If our
> > game also runs at 30 fps, the eye sees one frame at a time. However, if
> > the game runs at 60 fps, the eye sees two frames at a time, which are
> > blurred together, resulting in a form of motion blur.
> >
> > Another important aspect is that your eyes will _never_ be in sync with
> > the frame rate of your game, so it is possible that there exists a
> > moment where your eye records a frame, but there is nothing to record
> > (since the monitor is doing a vblank). This will certainly be perceived
> > as a discontinuity of the ongoing motion on the screen. The higher the
> > framerate, the less likely that such situations occur.
> >
> > Finally, I must agree with the lower latency factor, since while our
> > eyes may be relatively slow, our responses (generally) are _really_
> > fast. Especially if someone is trained in some response, it becomes a
> > reflex... An experienced FPS player might be using his brain as little
> > as 25% of the time; the rest of the time, his actions are merely
> > reflexes. Hence the phrase 'mindless killer' ;-).
> >
> > > Why not just ask the player what they want and then scale the engine
> > > to that speed. If they want 'liquidity' at high frame rates then scale
> > > back and use lower level of detail models. If they can't tell the
> > > difference then they'll pull the frame rate down to 30fps and get
> > > better looking visuals.
> >
> > We usually achieve this by offering the player some control over detail.
> > Though it might be nice if your engine included a little util which
> > finds the optimum detail settings to get n fps on a given machine... but
> > I can tell you that _won't_ be very easy, so better put that on the
> > 'things to do when I have time left' list ;-).
> >
> > Jim Offerman
> >
> > Innovade
> > - designing the designer
>
> _______________________________________________
> GDAlgorithms-list mailing list
> GDA...@li...
> http://lists.sourceforge.net/mailman/listinfo/gdalgorithms-list
|
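[Editorial note: Tom's "+/- 5Hz" CLOD approach above amounts to a simple feedback loop on a global detail scalar. A minimal sketch under that assumption - the function name, step size, and limits are illustrative, not from the thread:]

```python
def update_detail(detail, measured_fps, target_fps=60.0,
                  tolerance=5.0, step=0.05,
                  min_detail=0.1, max_detail=1.0):
    """Nudge a global detail level toward the target frame rate.

    Called once per frame (or once per few frames, to smooth noise):
    shed geometry when we run slow, add it back when there's headroom,
    and leave it alone inside the +/- tolerance band (Tom's "+/- 5Hz").
    """
    if measured_fps < target_fps - tolerance:
        detail -= step          # too slow: reduce tessellation/passes
    elif measured_fps > target_fps + tolerance:
        detail += step          # headroom: restore quality
    return max(min_detail, min(max_detail, detail))

# e.g. running at 40 fps with full detail: back off one step.
d = update_detail(1.0, measured_fps=40.0)
```

In practice the measured fps should be a moving average, or the loop will oscillate on frame-time spikes; the "sane limits" Tom mentions are the clamp at the end.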
From: gl <gl...@nt...> - 2000-08-10 15:46:20
|
> Which leads me to a question: do you folks think that the eyes & brain can
> be trained to better perceive such errors?

Of course. True of anything you study (in the loosest sense of the word).

> The movies never used to bother me until I became a graphics programmer.
> Now it bugs me. Also, PAL didn't used to bother me, but now I find the
> lower frame rate intolerable, which is why I make a point of not watching
> TV when I am in the UK, but spend the time in pubs instead!

Smile.
--
gl
|
From: Pai-Hung C. <pa...@ac...> - 2000-08-10 17:17:51
|
Hi,

Could someone briefly explain what "rotoscoping" is, as used in animation?

Thanks in advance,

Pai-Hung Chen
|
From: Steve B. <st...@li...> - 2000-08-10 17:46:56
|
On Thu, 10 Aug 2000, Pai-Hung Chen wrote:

> Could someone briefly explain what "rotoscoping" is, as used in animation?

I believe it refers to manually tracing over live-action movie frames in
order to get realistic movement in a cartoon. Presumably, a 'Rotoscope' is
some kind of optical instrument that projects the movie frames onto the
tracing surface... dunno about that, though.

Motion-capture in a pre-computer age!

Steve Baker                      (817)619-2657 (Vox/Vox-Mail)
L3Com/Link Simulation & Training (817)619-2466 (Fax)
Work: sj...@li...  http://www.link.com
Home: sjb...@ai... http://web2.airmail.net/sjbaker1
|
From: Bass, G. T. <gt...@ut...> - 2000-08-01 21:37:14
|
Algorithms folks,

I'm surprised no one has mentioned thus far the most significant difference
between movie frames and real-time gfx frames. I know I drag you all through
this each time this discussion comes up, but I'll rehash once more.

If you look at a single frame from a movie, you will notice that moving
objects are blurred, since the film is exposed for a short period of time
and records light reflected from moving objects continuously during that
time. This is the well-known "motion blur", which, incidentally, looks
nothing at all like what 3dfx achieves with their T-buffer. Motion blur has
the effect of connecting the multiple discrete images into a more complete
recreation of motion as the eye is sequentially exposed to each frame,
essentially creating a near overlap between each frame.

Examination of a single frame rendered by a real-time gfx system will show
something quite different. Objects in the single frame appear to be totally
static, and there is no hint of motion. For this reason, higher framerates
become important as 3D objects move more quickly across the screen, as these
fast objects will appear more disconnected if allowed to move very far (in
screenspace) before they are rendered again.

A good example of this would be roadside objects in a driving game. As you
pass lightpoles and other such goodies, they begin to cover screenspace very
quickly near the outside edges of the screen, and may even appear to move
backwards at certain speeds, when the next pole is closer to the previous
pole's on-screen location than to its own from one frame to the next.

Regards,
Garett Bass
gt...@ut...
|
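[Editorial note: Garett's lightpole example can be made concrete. With no motion blur connecting positions, the entire per-frame angular displacement of a roadside object is visible as a discrete hop, and it grows rapidly as the object draws level with the viewer. A quick sketch of the geometry - all names and parameter values are illustrative:]

```python
import math

def angular_step_deg(lateral_m, speed_mps, distance_m, fps):
    """Per-frame angular jump (degrees) of a roadside object.

    Simplified geometry: the object sits lateral_m to the side of a
    straight road; the viewer is distance_m short of drawing level with
    it and closes at speed_mps. Without motion blur, this whole jump
    appears as a discrete hop between renders.
    """
    a0 = math.atan2(lateral_m, distance_m)
    a1 = math.atan2(lateral_m, distance_m - speed_mps / fps)
    return math.degrees(a1 - a0)

# Pole 3 m off the road, 10 m ahead, at 30 m/s (~108 km/h):
step_30fps = angular_step_deg(3.0, 30.0, 10.0, 30.0)  # ~1.7 degree hop
step_60fps = angular_step_deg(3.0, 30.0, 10.0, 60.0)  # roughly half that
```

Doubling the frame rate roughly halves the hop, which is why fast-moving peripheral objects are where 30 vs 60 fps is most visible - and when regularly spaced poles hop by nearly their own spacing per frame, the backwards-motion (wagon-wheel) aliasing Garett describes appears.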