[GD-Linux] Algorithm help
From: Jan E. <ch...@in...> - 2001-11-26 15:05:06
Hi all,
I have a small problem that my brain just can't work out. Being pretty
useless at maths, I thought I'd ask here; maybe someone has done
something similar.
I have a 3D scene in which a "camera" glides over the terrain in normal
FPS mode. The terrain is a normal 2D matrix of y-values, with the x- and
z-coordinates spread evenly, i.e. a perfectly regular matrix. This works
fine, and I take the y-coordinate for the camera (height above the
terrain) from the matrix. So, if the camera is at (x,z) I get the camera
height from something like this:
	height = map[int(x)][int(z)]
This of course gets me the height of the lower corner of the current
"tile". Some helpful ascii art:
  |D       |C       |
--+--------+--------+--
  |        |        |
  |   X    |        |
  |        |        |
  |A       |B       |
--+--------+--------+--
  |        |        |
  |        |        |
  |        |        |
  |        |        |
--+--------+--------+--
  |        |        |
So if the camera is at X, the idea above means the camera gets the height
of point A in the map for as long as it stays in that tile. If it moves one
tile to the right it gets the height of point B. This gives a camera that
is jerky, especially if the heights at A, B, C and D differ even slightly.
The correct way would thus be to interpolate between the heights of the
four corner points somehow, so that the camera would "flow" smoothly over
the tile. Any ideas on how to do this? If the tiles were much smaller the
problem would not exist, but my tiles are quite big.
I've tried to STFW, but I don't really know what to look for. Has anyone
got pointers to some nice maths I could use, or done something like this
before?
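(Editorial note, not part of the original post: what is being asked for is
usually done with bilinear interpolation: blend A-B and D-C along x, then
blend those two results along z, using the camera's fractional position
inside the tile. A minimal sketch in C follows, assuming the height matrix
is called map, indexed as [x][z] with dimensions MAP_W and MAP_H, and that
A = (int(x), int(z)), B is one step in +x, D one step in +z, and C is
diagonally opposite A; all of these names and conventions are placeholders,
not taken from the post.)

	/* Minimal bilinear-interpolation sketch; names are assumptions. */
	#include <math.h>

	#define MAP_W 256               /* assumed grid size in x */
	#define MAP_H 256               /* assumed grid size in z */

	extern float map[MAP_W][MAP_H]; /* y-values, indexed [x][z] */

	float terrain_height(float x, float z)
	{
	    int   ix = (int)floorf(x);  /* tile the camera is in          */
	    int   iz = (int)floorf(z);
	    float fx = x - ix;          /* position inside the tile, 0..1 */
	    float fz = z - iz;

	    float hA = map[ix    ][iz    ];  /* corner A */
	    float hB = map[ix + 1][iz    ];  /* corner B */
	    float hD = map[ix    ][iz + 1];  /* corner D */
	    float hC = map[ix + 1][iz + 1];  /* corner C */

	    /* blend A-B and D-C along x, then blend the results along z */
	    float lower = hA + (hB - hA) * fx;
	    float upper = hD + (hC - hD) * fx;
	    return lower + (upper - lower) * fz;
	}

At the far edges of the map you would also want to clamp ix + 1 and iz + 1
so the lookups stay inside the matrix.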
Regards,
Chakie
-- 
"Students?" barked the Archchancellor.
"Yes, Master. You know? They're the thinner ones with the pale faces?
Because we're a university? They come with the whole thing, like rats --"
-- Terry Pratchett, Moving Pictures