From: Ignacio Castano <i6298@ho...> - 2001-07-16 22:25:32

yep, you are right! sorry!!

Ignacio Castano
castanyo@...

Simon O'Connor wrote:
> Flipped normals is only one of the problems, which is easily solved either by your method of using the original normal or comparing the S x T vector with the normal and adjusting its sign.
>
> The greater problem is shown in the following:
>
>        c
>       / \
>      / A \
>   a /_____\ b
>     \     /
>      \ B /
>       \ /
>        d
>
> Polygons A and B are mapped with the same texture; vertices a and b are shared by both polygons.
>
> Now imagine the texture v coordinate at vertex b is 0, and the v at vertex c is 1 and at vertex d is 1...
>
> ...Now calculate the texture space basis for vertex b... the problem is that for polygon A the v mapping direction b->c points in the opposite direction to b->d of polygon B.
>
> The easiest solution seems to be to spot the problem when you're indexing the mesh (say at export time), and don't share the vertices which are affected.
From: Vincent Caron <v.caron@ze...> - 2001-07-16 19:52:25

I like the documents from IASIG (http://www.iasig.org), especially the "3D Audio Rendering and Evaluation Guideline" level 1 and 2 docs. This might be a little off topic in your case, but the level 2 doc should give some useful information:

level 1: http://www.iasig.org/wg/3DWG/3dl1v1.pdf
level 2: http://www.iasig.org/wg/3DWG/3dl2v1a.pdf

These are the reference docs for Creative's EAX and OpenAL.
From: Lucas Ackerman <ackerman7@ll...> - 2001-07-16 19:01:54

you might want to look at the SIGGRAPH 01 sound synthesis paper by James F. O'Brien (http://www.cs.berkeley.edu/~job/). He gave a dry-run talk here last Monday which I gather was interesting (I just missed it, my desk was full of ants!), as it was discussed with him quite a bit at lunch.

Lucas

Brian Sharp wrote:
> Hey, everyone,
>
> I'm looking for some references on sound modeling, specifically on physical modeling of sound propagation through materials and the like. There's lots on physical modeling as it pertains to, say, physically modeling a bassoon, but I'm not so much looking for stuff on instrument physics (as that's more on synthesis than on filtering and modification). So far I've grabbed the A3D 3.0 SDK with the wavetracing stuff, and I have yet to see whether they document their algorithms well (since it is / was implemented as a hardware solution, I don't have terribly high hopes, though.)
>
> Working with graphics has spoiled me - so easy to find good resources on the net and in books. Sound seems a little less popular as far as documentation goes. :)
>
> TIA!
>
> .b
>
> _______________________________________________
> GDAlgorithms-list mailing list
> GDAlgorithms-list@...
> http://lists.sourceforge.net/lists/listinfo/gdalgorithmslist
From: Daniel Vogel <vogel@ep...> - 2001-07-16 18:51:28

> on synthesis than on filtering and modification). So far I've grabbed the A3D 3.0 SDK with the wavetracing stuff, and I have yet to see whether they document their algorithms well (since it is / was implemented as a hardware solution, I don't have terribly high hopes, though.)

I guess you can call it a HW solution if having the CPU do all the work counts ;)

> Working with graphics has spoiled me - so easy to find good resources on the net and in books. Sound seems a little less popular as far as documentation goes. :)

Search the web for documents from Jean M. Jot and Keith Charley (both Creative). Sadly I can't find their slideshows/presentations on developer.creative.com.

-- Daniel Vogel, Programmer, Epic Games Inc.
From: Brian Sharp <bsharp@io...> - 2001-07-16 18:40:43

Hey, everyone,

I'm looking for some references on sound modeling, specifically on physical modeling of sound propagation through materials and the like. There's lots on physical modeling as it pertains to, say, physically modeling a bassoon, but I'm not so much looking for stuff on instrument physics (as that's more on synthesis than on filtering and modification). So far I've grabbed the A3D 3.0 SDK with the wavetracing stuff, and I have yet to see whether they document their algorithms well (since it is / was implemented as a hardware solution, I don't have terribly high hopes, though.)

Working with graphics has spoiled me - so easy to find good resources on the net and in books. Sound seems a little less popular as far as documentation goes. :)

TIA!

.b
From: Yee, Hector <hyee@we...> - 2001-07-16 17:55:28

What kind of acceleration structure are you using? If none, try a regular grid. Are you using just shadow rays? If you are also doing indirect illumination you might want to try irradiance gradient interpolation (Ward and Heckbert, Eurographics 92): http://www.ri.cmu.edu/pubs/pub_1576_text.html
From: Stephen J Baker <sjbaker@li...> - 2001-07-16 17:43:15

On Mon, 16 Jul 2001, Kamil Burzynski wrote:
> NTSC? Never The Same Colors (again)? :)

"Never Twice Same Colours".

...referring to the fact that the phase of the colour signal is encoded consistently on every scanline. PAL stands for Phase Alternating Lines (or something like that) - so the phase of the colour burst signal is inverted on every alternate line.

This means that if your TV's colour adjustment drifts a little, in NTSC, the colour of the entire screen drifts uniformly. With PAL, the colour will drift - but in opposite directions on alternate lines - so on average, the screen stays more or less the same colour. This is probably irrelevant with modern TV sets. Hence PAL's colours are generally considered better than NTSC. (But then we Brits go and screw it up by running it at only 50Hz).

--
Steve Baker                      (817)619-2657 (Vox/VoxMail)
L3Com/Link Simulation & Training (817)619-2466 (Fax)
Work: sjbaker@...   http://www.link.com
Home: sjbaker1@...  http://web2.airmail.net/sjbaker1
From: Emmanuel Astier <emmanuel.astier@wi...> - 2001-07-16 17:24:45

Perhaps you can cache the last occluding polygon for each light: when one lumel wasn't lit by a light L1 because of a polygon P, you should try the intersection test lumel <-> light against P first when checking if the next lumel is lit.

Hope it helps,

Emmanuel

-----Original Message-----
From: Steven Eckles [mailto:steve@...]
Sent: Monday 16 July 2001 17:35
To: Graphics Algorithms
Subject: [Algorithms] How can I speed up my lightmap generator?

I have written a utility to generate lightmaps for my 3D engine, but, although it works, it's way too slow. E.g. a scene with ~6000 polygons (mainly quads) and ~300 lights takes about 1.5 hours to generate the lightmaps on a PIII 500 (at 32x32 resolution). The exact same scene takes about 3 mins to compile from Worldcraft (via QRAD), so I must be doing something wrong!

I have determined that it's the light <-> pixel (lumel?) visibility calculation that's the real killer, as if I turn it off it takes < 1 min to compile the above level! I can speed it up by reducing the size of the lightmaps and/or only calculating every n pixels & interpolating, but both methods give noticeably poorer results.

Can anybody suggest any other ways of speeding it up, or a better method in the first place? The way it works at the moment is that every light has a list of polygons within its range and every polygon has a list of lights that could potentially light it. Then for each lumel within each polygon, do a triangle <-> ray intersection test for each polygon in the list of each light potentially visible to that polygon to determine if that lumel is lit by that light.

I thought that using a BSP tree would speed things up but so far I've only managed to slow things down! Is this the right way to go? if so, how? any other ideas?

Steve.

_______________________________________________
GDAlgorithms-list mailing list
GDAlgorithms-list@...
http://lists.sourceforge.net/lists/listinfo/gdalgorithmslist
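Emmanuel's occluder-caching idea can be sketched as follows. This is an illustrative reconstruction, not code from the thread: the `blocks` predicate is a hypothetical stand-in for a real ray-triangle intersection test, and the interval occluders exist only to make the sketch runnable.

```python
def lit_lumels(lumels, light, occluders, blocks):
    """For each lumel, decide whether `light` reaches it.

    Shadow cache: the polygon that occluded the previous lumel is
    tested first, since adjacent lumels tend to share occluders,
    which usually avoids scanning the whole occluder list.
    """
    cached = None          # last known occluder for this light
    lit = []
    for lumel in lumels:
        blocker = None
        # 1) try the cached occluder first (usually a hit in shadowed runs)
        if cached is not None and blocks(cached, light, lumel):
            blocker = cached
        else:
            # 2) fall back to scanning the light's occluder list
            for occ in occluders:
                if occ is not cached and blocks(occ, light, lumel):
                    blocker = occ
                    break
        cached = blocker
        lit.append(blocker is None)
    return lit


# Toy stand-in for a ray-triangle test: an occluder is an x-interval
# that shadows any lumel whose x coordinate falls inside it.
def blocks(occ, light, lumel):
    lo, hi = occ
    return lo <= lumel <= hi
```

For example, `lit_lumels(range(10), None, [(2, 5)], blocks)` marks lumels 2 through 5 as shadowed and the rest as lit, with the cached interval answering most of the shadowed queries directly.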
From: Lee Sandberg <lee@ab...> - 2001-07-16 17:09:50

Well, you have found a very good place to start to optimize, I'd say. Can you optimize some part of the "light <-> pixel (lumel?) visibility calculation"? (Find out what in the calculation takes the power.) Maybe you can test just some pixels, and if they are hit by the light, then do the rest? (Of course some lights might not fit into the "sample".)

Lee Sandberg
AB Colrod Media

----- Original Message -----
From: "Steven Eckles" <steve@...>
To: "Graphics Algorithms" <gdalgorithmslist@...>
Sent: Monday, July 16, 2001 5:35 PM
Subject: [Algorithms] How can I speed up my lightmap generator?

> I have written a utility to generate lightmaps for my 3D engine, but, although it works, it's way too slow. E.g. a scene with ~6000 polygons (mainly quads) and ~300 lights takes about 1.5 hours to generate the lightmaps on a PIII 500 (at 32x32 resolution). The exact same scene takes about 3 mins to compile from Worldcraft (via QRAD), so I must be doing something wrong!
>
> I have determined that it's the light <-> pixel (lumel?) visibility calculation that's the real killer, as if I turn it off it takes < 1 min to compile the above level! I can speed it up by reducing the size of the lightmaps and/or only calculating every n pixels & interpolating, but both methods give noticeably poorer results.
>
> Can anybody suggest any other ways of speeding it up, or a better method in the first place? The way it works at the moment is that every light has a list of polygons within its range and every polygon has a list of lights that could potentially light it. Then for each lumel within each polygon, do a triangle <-> ray intersection test for each polygon in the list of each light potentially visible to that polygon to determine if that lumel is lit by that light.
>
> I thought that using a BSP tree would speed things up but so far I've only managed to slow things down! Is this the right way to go? if so, how? any other ideas?
>
> Steve.
>
> _______________________________________________
> GDAlgorithms-list mailing list
> GDAlgorithms-list@...
> http://lists.sourceforge.net/lists/listinfo/gdalgorithmslist
From: Martin Fuller <mfuller@ac...> - 2001-07-16 16:59:43

Managing / optimising seems to me to be a problem specific to the platform and the function. Some of the questions I'd ask myself would be:

1. How much input data do I need? Do I need to read and write a vertex, write the vertex, or render and throw away?
2. What's the per-vertex cost? Can I perform any caching, and will it be a win? Do I have enough memory? Lookup tables?
3. Can I utilise LOD? Not only in the amount of geometry generated but the quality of the function / input data.
4. Does the geometry need to be recalculated every frame? Is it faster to recalculate every frame anyway ;)
5. Is there any custom hardware on this platform that can help me out?
6. Do I want to store my results for next frame, and what memory / bus usage / processor selection implications does that have?
7. Is there a way to calculate the BV / store the maximum BV before generating the geometry, and does this really matter?

Cheers,
Martin

-----Original Message-----
From: Kardamone [mailto:kardamone2@...]
Sent: Monday, July 16, 2001 4:41 PM
To: gdalgorithmslist@...
Subject: Re: [Algorithms] Parametric Geometry

I think I got the point; maybe something like this: whatever geometrical law/eq/shape, it is called parametric when its result may vary according to some input data (keeping it general, we end up defining "parametric" :). What's bugging me now is the "variation" of its result (I guess that is mainly what makes parametric stuff delicate to use); the question would be: how to manage/optimize this...

> I like Martin's description best.
>
> You could consider a "cube" to be a parametrically defined geometry. You generate the triangles representing its faces given two parameters: 1) a center coordinate; and 2) an edge length. (I don't consider a transformation matrix a parameter.)
>
> Graham

_________________________________________________________
Do You Yahoo!?
Get your free @yahoo.com address at http://mail.yahoo.com

_______________________________________________
GDAlgorithms-list mailing list
GDAlgorithms-list@...
http://lists.sourceforge.net/lists/listinfo/gdalgorithmslist
From: Daniel Vogel <vogel@ep...> - 2001-07-16 16:57:05

> 2) The vegetation can be generated on-the-fly on entering the frustum, and discarded when leaving it. A better way is to start generating it, say N frames before it's expected to become visible. That way the cost of generation is distributed over those N frames.

The runtime overhead of placing stuff is negligible for terrain. All you have to do is get the normal at a point on the terrain and then create a rotation matrix if you want to align to the terrain. We simply have the artists create a density map for the terrain and then randomly (with a seed) place/render vegetation depending on the density map. Then between min and max distance we either shrink objects to size 0 or fade them out using alpha. This approach works fine for small stuff like grass or ferns.

-- Daniel Vogel, Programmer, Epic Games Inc.
From: Pierre Terdiman <p.terdiman@wa...> - 2001-07-16 16:43:10

> 1 "Additionally the Planetside engine generates a great deal of parametric geometry on the fly, effectively creating an infinite level of detail."

For Planetside, I guess there are two ways to understand this:

1) The detail mesh surrounding the player, adding details on-the-fly on the closest parts of the terrain. The way I see it, this is the same as a detail texture, except the details are real vertices & faces. For a terrain, this doesn't seem too difficult to do. For example, you could put a sphere around the player/camera, gather faces colliding with it, do some displacement mapping on them - with the height radially decreasing until being null on the surface of the sphere, etc. The actual displacement map would simply be a fractal noise or a Perlin turbulence; this has no importance as long as it gives the impression of a great deal of detail.

2) The vegetation can be generated on-the-fly on entering the frustum, and discarded when leaving it. A better way is to start generating it, say N frames before it's expected to become visible. That way the cost of generation is distributed over those N frames.

Of course I don't really know what's been done in Planetside, and John will have the final word about this.

> 2 "Use static LOD solutions, or parametric geometry only if the 3d card directly accelerates it."

Say, DX8 N-patches. Not exactly compatible with what I wrote above, until some card can do displacement mapping in hardware. Note you can do a very fast displacement mapper for heightfield-based terrains, since their topology is always the same - only the geometry changes. So precompute the subdivision once and for all, feed it with a position-indexed fractal turbulence, add this to the underlying base geometry, and it should work like the proverbial charm. A bit like what Charles Bloom wrote about precomputed subdivision surfaces [wasn't it patented (..) under the name "Valente subdivision"? Not sure, maybe I'm mixing things up here].

> 3 "Use parametric geometry wisely. There's nothing wrong with generating parametric geometry, I encourage it, but don't try to regenerate it every single frame."

cf the vegetation above, probably. LRU-based caches are simple & good as well.

Pierre
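Pierre's "sphere of detail around the camera" idea in (1) can be sketched as a single displacement step; the noise function is passed in as a parameter and the linear falloff shape is an assumption made for illustration.

```python
import math

def displace(vertex, camera, radius, noise):
    """Displace a terrain vertex along +z by a noise value, scaled by a
    radial falloff that reaches zero at the detail sphere's surface, so
    added detail fades smoothly instead of popping at the boundary."""
    x, y, z = vertex
    cx, cy, cz = camera
    dist = math.sqrt((x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2)
    falloff = max(0.0, 1.0 - dist / radius)   # 1 at camera, 0 at radius
    return (x, y, z + noise(x, y) * falloff)
```

Any fractal noise or Perlin turbulence can be plugged in as `noise`; only the falloff matters for hiding the sphere's edge.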
From: Martin Fuller <mfuller@ac...> - 2001-07-16 16:17:48

I think with parametric geometry you are sampling some function which can itself require data to produce a geometric approximation of that function's form. (Whether that involves projection or whatever; it may be a 3D sample obtained from a 4D function, quaternion fractals etc.) I don't think variation is permitted without altering the parameters. Post-processing of the data returned by the function aside for now.

I'm still not convinced in my own mind that anything involving noise can be called parametric unless a seed for a random number generator is specified, but then that's not really noise; it's known static input which happens to be noise.

A quick rummage through those texts which happen to be at my desk and not at home doesn't reveal any references to the phrase "parametric geometry", so this is all just as I understand it.

Cheers,
Martin

-----Original Message-----
From: Kardamone [mailto:kardamone2@...]
Sent: Monday, July 16, 2001 4:41 PM
To: gdalgorithmslist@...
Subject: Re: [Algorithms] Parametric Geometry

I think I got the point; maybe something like this: whatever geometrical law/eq/shape, it is called parametric when its result may vary according to some input data (keeping it general, we end up defining "parametric" :). What's bugging me now is the "variation" of its result (I guess that is mainly what makes parametric stuff delicate to use); the question would be: how to manage/optimize this...

> I like Martin's description best.
>
> You could consider a "cube" to be a parametrically defined geometry. You generate the triangles representing its faces given two parameters: 1) a center coordinate; and 2) an edge length. (I don't consider a transformation matrix a parameter.)
>
> Graham

_________________________________________________________
Do You Yahoo!?
Get your free @yahoo.com address at http://mail.yahoo.com

_______________________________________________
GDAlgorithms-list mailing list
GDAlgorithms-list@...
http://lists.sourceforge.net/lists/listinfo/gdalgorithmslist
From: Kamil Burzynski <K.Burzynski@ad...> - 2001-07-16 16:05:47

----- Original Message -----
From: "Stephen J Baker" <sjbaker@...>
To: <gdalgorithmslist@...>
Sent: Monday, July 16, 2001 3:20 PM
Subject: RE: [Algorithms] Illegal NTSC colours

> On Fri, 13 Jul 2001, James Sutherland wrote:
> > > Just out of interest, how many people's tool chains detect illegal NTSC colours? Do you force artists to generate "NTSC legal" textures?
> >
> > If this is for a console game, then don't forget that people in non-NTSC regions can view the game in PAL or even true RGB, so ideally you wouldn't want to have your original artwork restricted to NTSC colours, as this washes out the picture needlessly for the rest of us.
>
> There are illegal colours in PAL and SECAM too. All colour TV systems use colour difference signals - so they all end up being prone to illegal colours.
>
> What I'm not sure of is whether the set of illegal colours is the same in PAL as it is in NTSC and SECAM. Probably not.

NTSC? Never The Same Colors (again)? :)

Best regards from Kamil Burzynski
ADB Poland, LTD.
--
"God, root, what's the difference?"
From: Steven Eckles <steve@in...> - 2001-07-16 15:36:53

I have written a utility to generate lightmaps for my 3D engine, but, although it works, it's way too slow. E.g. a scene with ~6000 polygons (mainly quads) and ~300 lights takes about 1.5 hours to generate the lightmaps on a PIII 500 (at 32x32 resolution). The exact same scene takes about 3 mins to compile from Worldcraft (via QRAD), so I must be doing something wrong!

I have determined that it's the light <-> pixel (lumel?) visibility calculation that's the real killer, as if I turn it off it takes < 1 min to compile the above level! I can speed it up by reducing the size of the lightmaps and/or only calculating every n pixels & interpolating, but both methods give noticeably poorer results.

Can anybody suggest any other ways of speeding it up, or a better method in the first place? The way it works at the moment is that every light has a list of polygons within its range and every polygon has a list of lights that could potentially light it. Then for each lumel within each polygon, do a triangle <-> ray intersection test for each polygon in the list of each light potentially visible to that polygon to determine if that lumel is lit by that light.

I thought that using a BSP tree would speed things up but so far I've only managed to slow things down! Is this the right way to go? if so, how? any other ideas?

Steve.
From: Kardamone <kardamone2@ya...> - 2001-07-16 15:27:37

I think I got the point; maybe something like this: whatever geometrical law/eq/shape, it is called parametric when its result may vary according to some input data (keeping it general, we end up defining "parametric" :). What's bugging me now is the "variation" of its result (I guess that is mainly what makes parametric stuff delicate to use); the question would be: how to manage/optimize this...

> I like Martin's description best.
>
> You could consider a "cube" to be a parametrically defined geometry. You generate the triangles representing its faces given two parameters: 1) a center coordinate; and 2) an edge length. (I don't consider a transformation matrix a parameter.)
>
> Graham

_________________________________________________________
Do You Yahoo!?
Get your free @yahoo.com address at http://mail.yahoo.com
From: Graham Rhodes <grhodes@se...> - 2001-07-16 14:46:51

I like Martin's description best.

You could consider a "cube" to be a parametrically defined geometry. You generate the triangles representing its faces given two parameters: 1) a center coordinate; and 2) an edge length. (I don't consider a transformation matrix a parameter.)

Graham

> -----Original Message-----
> From: gdalgorithms-list-admin@...
> [mailto:gdalgorithms-list-admin@...] On Behalf Of Martin Fuller
> Sent: Monday, July 16, 2001 10:05 AM
> To: 'gdalgorithmslist@...'
> Subject: RE: [Algorithms] Parametric Geometry
>
> My understanding of the phrase "parametric geometry" would be any geometry for which a position (and possibly more attributes - normal, colour, whatever) can be obtained by passing parameters into a function.
>
> In this way subdivision surfaces and surface equations such as Bezier, B-spline, NURBS can be thought of as parametric geometry. Typically the parameters, when plugged into the equation or function, describe a position on a curve or surface or a position in a volume.
>
> Fractal geometry can also be considered parametric since it takes a number of parameters and describes a value which is translated into some geometric space.
>
> Stochastic methods could be considered parametric as long as a seed was one of the parameters and the random number sequence was constant for any given seed???
>
> Cheers,
> Martin
>
> -----Original Message-----
> From: Kardamone [mailto:kardamone2@...]
> Sent: Monday, July 16, 2001 1:58 PM
> To: gdalgorithmslist@...
> Subject: [Algorithms] Parametric Geometry
>
> In "Optimization Techniques for HW T&L pipelines" J. Ratcliff says:
>
> 1 "Additionally the Planetside engine generates a great deal of parametric geometry on the fly, effectively creating an infinite level of detail."
>
> 2 "Use static LOD solutions, or parametric geometry only if the 3d card directly accelerates it."
>
> 3 "Use parametric geometry wisely. There's nothing wrong with generating parametric geometry, I encourage it, but don't try to regenerate it every single frame."
>
> I'm afraid I don't understand what exactly "Parametric Geometry" is - someone has a definition in this case?
> Maybe John?
>
> Thanks.
> Kard.
>
> _________________________________________________________
> Do You Yahoo!?
> Get your free @yahoo.com address at http://mail.yahoo.com
>
> _______________________________________________
> GDAlgorithms-list mailing list
> GDAlgorithms-list@...
> http://lists.sourceforge.net/lists/listinfo/gdalgorithmslist
>
> _______________________________________________
> GDAlgorithms-list mailing list
> GDAlgorithms-list@...
> http://lists.sourceforge.net/lists/listinfo/gdalgorithmslist
From: Martin Fuller <mfuller@ac...> - 2001-07-16 14:05:14

My understanding of the phrase "parametric geometry" would be any geometry for which a position (and possibly more attributes - normal, colour, whatever) can be obtained by passing parameters into a function.

In this way subdivision surfaces and surface equations such as Bezier, B-spline, NURBS can be thought of as parametric geometry. Typically the parameters, when plugged into the equation or function, describe a position on a curve or surface or a position in a volume.

Fractal geometry can also be considered parametric since it takes a number of parameters and describes a value which is translated into some geometric space.

Stochastic methods could be considered parametric as long as a seed was one of the parameters and the random number sequence was constant for any given seed???

Cheers,
Martin

-----Original Message-----
From: Kardamone [mailto:kardamone2@...]
Sent: Monday, July 16, 2001 1:58 PM
To: gdalgorithmslist@...
Subject: [Algorithms] Parametric Geometry

In "Optimization Techniques for HW T&L pipelines" J. Ratcliff says:

1 "Additionally the Planetside engine generates a great deal of parametric geometry on the fly, effectively creating an infinite level of detail."

2 "Use static LOD solutions, or parametric geometry only if the 3d card directly accelerates it."

3 "Use parametric geometry wisely. There's nothing wrong with generating parametric geometry, I encourage it, but don't try to regenerate it every single frame."

I'm afraid I don't understand what exactly "Parametric Geometry" is - someone has a definition in this case?
Maybe John?

Thanks.
Kard.

_________________________________________________________
Do You Yahoo!?
Get your free @yahoo.com address at http://mail.yahoo.com

_______________________________________________
GDAlgorithms-list mailing list
GDAlgorithms-list@...
http://lists.sourceforge.net/lists/listinfo/gdalgorithmslist
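Martin's definition - a position obtained by plugging parameters into a function - can be illustrated with a tiny (u, v) tessellator. The unit sphere below is just one example of such a parametric equation; any Bezier patch or NURBS evaluator with the same signature would fit.

```python
import math

def tessellate(f, nu, nv):
    """Sample a parametric surface f(u, v) -> (x, y, z) on a regular
    (nu+1) x (nv+1) grid over the parameter square [0,1]^2, returning
    the vertex list. Triangles would then connect neighbouring samples."""
    return [f(i / nu, j / nv) for j in range(nv + 1) for i in range(nu + 1)]

def sphere(u, v):
    """Unit sphere: u sweeps longitude, v sweeps latitude."""
    theta, phi = 2.0 * math.pi * u, math.pi * v
    return (math.cos(theta) * math.sin(phi),
            math.sin(theta) * math.sin(phi),
            math.cos(phi))
```

Increasing `nu` and `nv` gives an arbitrarily fine mesh from the same two-parameter function, which is the "infinite level of detail" point in the quoted text.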
From: <mpointie@ed...> - 2001-07-16 13:28:43

> I'm afraid I don't understand what exactly "Parametric Geometry" is - someone has a definition in this case?
> Maybe John?

I'm perhaps wrong on this one, but I usually use this kind of vocabulary to describe all the things that can be generated using algorithms that produce various kinds of things depending on which parameters you give as entry data.

Typically it's the case for vegetation, like trees or ferns that can be described using recursive generation. The parameters in that case are the max level of recursion, the "randomness", the "density", or even the building rules (like "every two trunk parts, add a leaf at left", ... which can be described with some DNA-looking format). You can also dynamically instantiate things (generated or hand-made) using rules defining the probability you get some grass on rocky ground, and so on.

Basically, with a simple pseudo-random generator and some basic construction rules you should be able to generate a whole universe including planets, stars, desert areas, atmosphere composition and so on :) [Now, making an _interesting_ game with that is another matter.]

Hope I'm not wrong on that one :))

Mickael Pointier
From: Kardamone <kardamone2@ya...> - 2001-07-16 13:27:29

Could this be some interface geometry (even if the function is not "math")? Are there some special optimisation techniques devoted to those entities?

> Subdivision surfaces, bezier patches, NURB surfaces - you name it: everything that generates geometry by function.
> _________________
>
> In "Optimization Techniques for HW T&L pipelines" J. Ratcliff says:
>
> 1 "Additionally the Planetside engine generates a great deal of parametric geometry on the fly, effectively creating an infinite level of detail."
>
> 2 "Use static LOD solutions, or parametric geometry only if the 3d card directly accelerates it."
>
> 3 "Use parametric geometry wisely. There's nothing wrong with generating parametric geometry, I encourage it, but don't try to regenerate it every single frame."
>
> I'm afraid I don't understand what exactly "Parametric Geometry" is - someone has a definition in this case?
> Maybe John?
>
> Thanks.
> Kard.

_________________________________________________________
Do You Yahoo!?
Get your free @yahoo.com address at http://mail.yahoo.com
From: Stephen J Baker <sjbaker@li...> - 2001-07-16 13:17:31

On Fri, 13 Jul 2001, James Sutherland wrote:
> > Just out of interest, how many people's tool chains detect illegal NTSC colours? Do you force artists to generate "NTSC legal" textures?
>
> If this is for a console game, then don't forget that people in non-NTSC regions can view the game in PAL or even true RGB, so ideally you wouldn't want to have your original artwork restricted to NTSC colours, as this washes out the picture needlessly for the rest of us.

There are illegal colours in PAL and SECAM too. All colour TV systems use colour difference signals - so they all end up being prone to illegal colours.

What I'm not sure of is whether the set of illegal colours is the same in PAL as it is in NTSC and SECAM. Probably not.

--
Steve Baker                      (817)619-2657 (Vox/VoxMail)
L3Com/Link Simulation & Training (817)619-2466 (Fax)
Work: sjbaker@...   http://www.link.com
Home: sjbaker1@...  http://web2.airmail.net/sjbaker1
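A tool-chain check of the kind being discussed might look as follows. This is a common heuristic, not a broadcast-accurate signal model: it converts RGB to YIQ and flags colours whose composite excursion (luma plus chroma subcarrier amplitude) leaves an assumed legal window; the `hi`/`lo` thresholds are illustrative.

```python
import math

def ntsc_illegal(r, g, b, hi=1.2, lo=-0.2):
    """Flag an RGB colour (components in [0, 1]) whose NTSC composite
    signal would over- or undershoot. The composite swings between
    Y + C and Y - C, where C is the chroma subcarrier amplitude."""
    y = 0.299 * r + 0.587 * g + 0.114 * b       # luma
    i = 0.596 * r - 0.274 * g - 0.322 * b       # in-phase chroma
    q = 0.211 * r - 0.523 * g + 0.312 * b       # quadrature chroma
    chroma = math.sqrt(i * i + q * q)
    return (y + chroma) > hi or (y - chroma) < lo
```

White and greys carry no chroma and always pass, while fully saturated primaries like pure red undershoot and get flagged - which is exactly why artists are asked to desaturate such colours.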
From: Timur Davidenko <Timur@en...> - 2001-07-16 12:58:47

Subdivision surfaces, bezier patches, NURB surfaces - you name it: everything that generates geometry by function.

_________________
Timur Davidenko.
http://www.enbaya.com
timur@...

-----Original Message-----
From: Kardamone [mailto:kardamone2@...]
Sent: Monday, July 16, 2001 2:58 PM
To: gdalgorithmslist@...
Subject: [Algorithms] Parametric Geometry

In "Optimization Techniques for HW T&L pipelines" J. Ratcliff says:

1 "Additionally the Planetside engine generates a great deal of parametric geometry on the fly, effectively creating an infinite level of detail."

2 "Use static LOD solutions, or parametric geometry only if the 3d card directly accelerates it."

3 "Use parametric geometry wisely. There's nothing wrong with generating parametric geometry, I encourage it, but don't try to regenerate it every single frame."

I'm afraid I don't understand what exactly "Parametric Geometry" is - someone has a definition in this case?
Maybe John?

Thanks.
Kard.

_________________________________________________________
Do You Yahoo!?
Get your free @yahoo.com address at http://mail.yahoo.com

_______________________________________________
GDAlgorithms-list mailing list
GDAlgorithms-list@...
http://lists.sourceforge.net/lists/listinfo/gdalgorithmslist
From: Kardamone <kardamone2@ya...> - 2001-07-16 12:45:03

In "Optimization Techniques for HW T&L pipelines" J. Ratcliff says:

1 "Additionally the Planetside engine generates a great deal of parametric geometry on the fly, effectively creating an infinite level of detail."

2 "Use static LOD solutions, or parametric geometry only if the 3d card directly accelerates it."

3 "Use parametric geometry wisely. There's nothing wrong with generating parametric geometry, I encourage it, but don't try to regenerate it every single frame."

I'm afraid I don't understand what exactly "Parametric Geometry" is - someone has a definition in this case?
Maybe John?

Thanks.
Kard.

_________________________________________________________
Do You Yahoo!?
Get your free @yahoo.com address at http://mail.yahoo.com
From: Simon O'Connor <simon@cr...> - 2001-07-16 11:43:21

Flipped normals is only one of the problems, which is easily solved either by your method of using the original normal or comparing the S x T vector with the normal and adjusting its sign.

The greater problem is shown in the following:

       c
      / \
     / A \
  a /_____\ b
    \     /
     \ B /
      \ /
       d

Polygons A and B are mapped with the same texture; vertices a and b are shared by both polygons.

Now imagine the texture v coordinate at vertex b is 0, and the v at vertex c is 1 and at vertex d is 1...

...Now calculate the texture space basis for vertex b... the problem is that for polygon A the v mapping direction b->c points in the opposite direction to b->d of polygon B.

The easiest solution seems to be to spot the problem when you're indexing the mesh (say at export time), and don't share the vertices which are affected.

> -----Original Message-----
> From: gdalgorithms-list-admin@...
> [mailto:gdalgorithms-list-admin@...] On Behalf Of Ignacio Castano
> Sent: 15 July 2001 00:30
> To: gdalgorithmslist@...
> Subject: RE: [Algorithms] texture space bump and mirrored textures
>
> no, don't use nvidia's method to generate the entire basis. Use it to generate only the tangent and binormal vectors, take the original normal, and orthonormalize those three vectors. That should give you the appropriate basis.
>
> Ignacio Castano
> castanyo@...
>
> Jonathan Garrett wrote:
> > are there any solutions to the issue of artists having used mirrored textures on a mesh which is to have bump mapping applied?
> >
> > the Nvidia docs simply say not to allow texture mirroring but this is somewhat restrictive
> >
> > thanks
> > Jonathan
>
> _______________________________________________
> GDAlgorithms-list mailing list
> GDAlgorithms-list@...
> http://lists.sourceforge.net/lists/listinfo/gdalgorithmslist
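The per-triangle texture-space basis and the S x T sign test that Simon and Ignacio discuss can be sketched as below. This is an illustrative reconstruction, not code from either poster: it solves for the tangent (S) and binormal (T) directions from positions and UVs, then detects mirroring from the handedness of (S, T, N).

```python
def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def texture_basis(p0, p1, p2, uv0, uv1, uv2):
    """Solve for the S (tangent) and T (binormal) directions of a
    triangle from its positions and texture coordinates."""
    e1, e2 = sub(p1, p0), sub(p2, p0)
    du1, dv1 = uv1[0] - uv0[0], uv1[1] - uv0[1]
    du2, dv2 = uv2[0] - uv0[0], uv2[1] - uv0[1]
    r = 1.0 / (du1 * dv2 - du2 * dv1)   # UV-area determinant
    s = tuple(r * (dv2 * e1[i] - dv1 * e2[i]) for i in range(3))
    t = tuple(r * (du1 * e2[i] - du2 * e1[i]) for i in range(3))
    return s, t

def is_mirrored(s, t, normal):
    """Mirrored UVs flip the handedness of (S, T, N): S x T then points
    against the geometric normal - Simon's sign test."""
    return dot(cross(s, t), normal) < 0.0
```

At export time, a triangle flagged by `is_mirrored` would get its shared vertices duplicated (or its basis sign-corrected against the original normal, per Ignacio's suggestion) rather than sharing tangents across the mirror seam.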