RE: [Algorithms] adaptive radiosity and lightmaps
From: Tom F. <tom...@ee...> - 2004-03-22 21:42:08
We re-traced the lighting with the new sampling, but using multiple (adaptive) samples over the area of the new texels. I was predicting cases where the original sampling wasn't good enough to spot a feature that would prevent shrinking, and then the lower-density sampling blew this tiny artefact up, and other pathological cases like that - but it never seemed to happen in practice. Yay!

One of the things we did was make the initial sample res pretty small, and then pieces would be enlarged as well as shrunk before being retraced. This chopped a lot off the tracing time! Where a whole triangle was entirely lit or entirely shadowed, it would shrink to a very small size and there would be very few traces, which meant we could devote a lot more traces and precision to the parts with shadows (especially sharp-edged shadows).

("trace" = raytrace to see if a light is visible from a texel or not - this was a pure direct-lit lighting model - the artists preferred this to radiosity-style indirect-bounce stuff - any time they needed indirect lighting they'd fake it with other lights, which meant the realtime lighting matched better as well)

The shrink metric was done just by mipmapping that bit of the texture, blowing it back up with a bilinear filter, and asking what the greatest error was. There's some fiddly stuff to ignore texels that aren't used by any tris.

TomF.

> What technique did you use for shrinking the triangles after you had
> generated the lightmap? Did you recompute lighting with the new
> parameterisation, or just resample the old bitmap?
>
> --
> Chris Butcher
> Networking & Simulation Lead
> Halo 2 | Bungie Studios
> bu...@bu...
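[Editor's sketch] The adaptive per-texel sampling described above - trace at the texel's corners, and only subdivide where the visibility results disagree - could look roughly like this. This is a hypothetical illustration, not Tom's actual code: `visible` stands in for whatever "trace to see if a light is visible" routine the tool used, and corner placement, recursion depth, and the disagreement test are all assumptions.

```python
def adaptive_shade(visible, x0, y0, size, max_depth=3):
    """Estimate a light's average visibility over a texel's footprint.

    'visible' is the trace callback: visible(x, y) -> 1.0 if the light
    is unoccluded from that point, else 0.0. Corners that all agree are
    assumed to cover a uniformly lit/shadowed texel; where they disagree
    (a shadow edge crosses the texel), subdivide and spend more traces.
    """
    corners = [visible(x0, y0), visible(x0 + size, y0),
               visible(x0, y0 + size), visible(x0 + size, y0 + size)]
    if max_depth == 0 or all(c == corners[0] for c in corners):
        # Uniform (or out of budget): average of the corner traces.
        return sum(corners) / 4.0
    # Mixed visibility: recurse into the four quadrants.
    half = size / 2.0
    return sum(adaptive_shade(visible, x0 + dx, y0 + dy, half, max_depth - 1)
               for dx in (0, half) for dy in (0, half)) / 4.0
```

Note how this matches the cost profile in the mail: fully lit or fully shadowed texels terminate after four traces, and the trace budget concentrates on texels crossed by sharp shadow edges.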
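[Editor's sketch] The shrink metric - mip the tile down, blow it back up with a bilinear filter, and take the greatest per-texel error - might be sketched as below. Hypothetical code, assuming a 2x2 box filter for the mip step, even tile dimensions, and an optional `used` mask for the "ignore texels not used by any tris" fiddliness.

```python
def downsample(tile):
    """One mip level down: box-filter each 2x2 block to one texel."""
    h, w = len(tile), len(tile[0])
    return [[(tile[y][x] + tile[y][x + 1] +
              tile[y + 1][x] + tile[y + 1][x + 1]) / 4.0
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

def bilinear_upsample(tile, h, w):
    """Blow a small tile back up to h x w with bilinear filtering."""
    sh, sw = len(tile), len(tile[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        fy = max(0.0, min(sh - 1.0, (y + 0.5) * sh / h - 0.5))
        y0 = int(fy); y1 = min(sh - 1, y0 + 1); ty = fy - y0
        for x in range(w):
            fx = max(0.0, min(sw - 1.0, (x + 0.5) * sw / w - 0.5))
            x0 = int(fx); x1 = min(sw - 1, x0 + 1); tx = fx - x0
            a = tile[y0][x0] * (1 - tx) + tile[y0][x1] * tx
            b = tile[y1][x0] * (1 - tx) + tile[y1][x1] * tx
            out[y][x] = a * (1 - ty) + b * ty
    return out

def shrink_error(tile, used=None):
    """Greatest error if this tile dropped one mip level.

    'used' optionally masks out texels not covered by any triangle,
    so unused padding never blocks a shrink. If the result is under
    some threshold, the tile can shrink and be retraced at lower res.
    """
    h, w = len(tile), len(tile[0])
    rebuilt = bilinear_upsample(downsample(tile), h, w)
    return max(abs(tile[y][x] - rebuilt[y][x])
               for y in range(h) for x in range(w)
               if used is None or used[y][x])
```

A flat (entirely lit or entirely shadowed) tile reconstructs exactly, so its error is zero and it shrinks aggressively; a tile containing a sharp shadow edge reconstructs poorly and keeps its resolution.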