Re: [GD-Linux] Algorithm help
From: Eero P. <epa...@ko...> - 2001-11-26 18:08:29
Jan Ekholm wrote:

> I have a 3D scene in which a "camera" glides over the terrain in normal
> FPS mode. The terrain is a normal 2D matrix of y-values, with the x- and
> z-coordinates spread evenly, i.e. a perfectly regular matrix. This works
> fine, and I take the y-coordinate for the camera (height above the
> terrain) from the matrix. So, if the camera is at (x,z) I get the camera
> height from something like this:
>
> height = map[int(x)][int(z)]
> ...........
> So if the camera is at X, the idea above gives me that the camera should
> have the height of point A in the map as long as the camera is in that
> tile. If it moves one tile to the right it gets the height of point B.
> This gives a camera that is jerky, especially if the heights at A, B, C
> and D differ even slightly. The correct way would thus be to interpolate
> the heights of all the points somehow, so that the camera would "flow"
> smoothly over the tile. Any ideas on how to do this? If the tiles were
> much smaller the problem would not exist, but my tiles are quite big.
>
> I've tried to STFW, but I don't really know what to look for. Anyone got
> some pointers to some nice math that I could use, or has anyone done this
> before?

If I understood your question correctly (and also the FLA)... the
simplest solution, I guess, is to use bilinear interpolation; a sketch
follows below. Google can help with the details.

Eero
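
A minimal sketch of what that bilinear interpolation can look like, in
C. The grid size, the sample heightmap, and the function name are
placeholders for illustration, not anything from the thread, and the
code assumes the camera stays at least one tile away from the far edges
of the map:

    /* bilinear interpolation over a regular heightmap */
    #include <math.h>
    #include <stdio.h>

    #define MAP_W 4
    #define MAP_H 4

    /* y-values on the regular (x,z) grid, placeholder data */
    static const float map[MAP_W][MAP_H] = {
        {0.0f, 1.0f, 2.0f, 3.0f},
        {1.0f, 2.0f, 3.0f, 4.0f},
        {2.0f, 3.0f, 4.0f, 5.0f},
        {3.0f, 4.0f, 5.0f, 6.0f},
    };

    /* assumes 0 <= x < MAP_W-1 and 0 <= z < MAP_H-1 */
    static float terrain_height(float x, float z)
    {
        int ix = (int)floorf(x);      /* tile containing the camera */
        int iz = (int)floorf(z);
        float fx = x - (float)ix;     /* fraction across the tile, 0..1 */
        float fz = z - (float)iz;

        /* heights at the four corners of the tile (A, B, C, D) */
        float a = map[ix][iz];
        float b = map[ix + 1][iz];
        float c = map[ix][iz + 1];
        float d = map[ix + 1][iz + 1];

        /* interpolate along x on both edges, then blend along z */
        float h0 = a + (b - a) * fx;
        float h1 = c + (d - c) * fx;
        return h0 + (h1 - h0) * fz;
    }

    int main(void)
    {
        /* the result now changes smoothly as the camera crosses a tile */
        printf("%f\n", terrain_height(1.5f, 0.25f));  /* prints 1.750000 */
        return 0;
    }

At the grid points this reduces to the original per-tile lookup, so the
camera height still matches A, B, C and D exactly there, but it varies
continuously in between instead of jumping at tile boundaries.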