From: Charles Bloom <cb2@cb...>  2004-08-23 22:39:18

On further thought, it seems there are more axes needed, but I don't really have a grasp on how many. For example, it seems there's a test needed between the poly verts and the LSS end verts. 99% of the time these will be ruled out by some other test, but there are rare cases where you actually have to test along the axis between the closest poly vert and the LSS endcap vert.

At 12:27 PM 8/23/2004 -0700, Charles Bloom wrote:
>First of all, is there any prescription for finding the minimum set of
>axes needed in a SAT test? It seems very ad hoc.
>
>Secondly, I'm trying to do a SAT test for convex-planar-poly (in 3d) vs. LSS
>(Line-Swept-Sphere, or Capsule). I think that the axes needed are:
>
>poly normal
>poly edge perpendicular normals (edge direction cross poly normal)
>LSS axis
>LSS axis cross poly normal
>
>The other obvious axes are:
>
>poly edges cross LSS axis
>poly edge perpendicular normals cross LSS axis
>
>but I don't think these are actually needed (?), but I have no idea how to
>prove that.
>
>Charles Bloom email "cb" http://www.cbloom.com
>
>-------------------------------------------------------
>SF.Net email is sponsored by Shop4tech.com - Lowest price on Blank Media
>100pk Sonic DVD-R 4x for only $29 - 100pk Sonic DVD+R for only $33
>Save 50% off Retail on Ink & Toner - Free Shipping and Free Gift.
>http://www.shop4tech.com/z/Inkjet_Cartridges/9_108_r285
>_______________________________________________
>GDAlgorithms-list mailing list
>GDAlgorithms-list@...
>https://lists.sourceforge.net/lists/listinfo/gdalgorithms-list
>Archives:
>http://sourceforge.net/mailarchive/forum.php?forum_id=6188

Charles Bloom email "cb" http://www.cbloom.com
From: Per Vognsen <Per.Vognsen@ep...>  2004-08-23 21:39:52

> -----Original Message-----
> From: gdalgorithms-list-admin@... [mailto:gdalgorithms-list-admin@...]
> On Behalf Of Charles Bloom
> Sent: Monday, August 23, 2004 3:28 PM
> To: gdalgorithms-list@...
> Subject: [Algorithms] SAT for 3d convex-poly vs. LSS ?
>
> First of all, is there any prescription for finding the minimum set of
> axes needed in a SAT test? It seems very ad hoc.

Start with a separating axis for every combination of two features and trim the list from there. For a line-swept sphere the features would be the main body (without boundary), corresponding to the interior of the LSS segment, and two hemispherical caps (without boundary), corresponding to the segment end points. For a convex polygon, the features are the face interior, the edge interiors and the vertices. For each combination of features, the separating axis in this initial set should be the shortest axis joining the two features.

I was very careful about specifying which of the above features included their boundary and which didn't. This is extremely important when finding the shortest axis joining features. For instance, suppose we consider the combination of a vertex and an edge interior. The edge interior may fail to contain a point that is closest to this vertex. This happens exactly when the projection of the vertex onto the infinite line containing the edge interior does not fall within the edge interior itself. But *if* there is a closest point, it will be the projection onto the infinite line. So the separating axis for this pair should be the axis that goes through the vertex and is orthogonal to the infinite line containing the edge interior.

With that said, let's proceed to finding the initial set of separating axes for your case.

Vertex vs cap: the axis through the cap center and the vertex.
Vertex vs body: the axis that goes through the vertex and is orthogonal to the infinite line through the LSS line segment.
Edge vs cap: like vertex vs body but with the roles reversed.
Edge vs body: the axis is the cross product of the edge vector and the LSS segment vector.
Face vs anything: the face normal. (Since the face itself determines the separating axis uniquely.)

Now we can think about trimming down this list. This is where some ingenuity is required. As a heuristic, good candidates for trimming often include the vertex and edge cases. I can't really see how it is possible to collapse the vertex cases to the corresponding edge cases for this problem, but you can at least do some early-outs. For instance, let's try to do that for checking an edge and its end vertices against a cap. First project the cap center onto the infinite line containing the edge. If it falls inside the edge, we know that the distance from the cap center to the end vertices must be greater than the distance from the cap center to the projected cap center, so we only have to test the edge-vs-cap axis, not the vertex-vs-cap axes.

I guess I didn't really answer your question, Charles, but maybe this gives you some basic ideas of how to proceed. Maybe other people on the list (Alen? Paul?) know of more systematic procedures? (For what it's worth, I started using my approach a couple of years ago when I was thinking of how to prove the correctness of the separating axis method. A natural way to do this is to perform a case analysis on the possible closest feature pairs, which motivates the described procedure.)

Cheers,
Per
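Per's feature-pair enumeration translates directly into an interval-overlap test: project the polygon's vertices and the capsule's (radius-inflated) segment onto each candidate axis, and report separation if any pair of intervals is disjoint. The sketch below is my own illustration, not code from the thread; it uses the face normal, the edge-cross-segment axes, and the vertex-to-closest-segment-point axes. Note that an incomplete axis set can only produce false overlaps, never false separations, since genuinely intersecting convex shapes overlap on every axis.

```python
import math

def sub(a, b): return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
def cross(a, b):
    return (a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0])

def closest_on_segment(p, a, b):
    """Closest point to p on segment ab (clamped projection)."""
    ab = sub(b, a)
    t = dot(sub(p, a), ab) / max(dot(ab, ab), 1e-12)
    t = min(1.0, max(0.0, t))
    return (a[0]+ab[0]*t, a[1]+ab[1]*t, a[2]+ab[2]*t)

def separated(poly, normal, c0, c1, radius):
    """True if some candidate axis separates polygon and capsule.

    Candidate axes follow the feature-pair enumeration: face normal
    (face vs anything), edge x segment (edge vs body), and
    vertex-to-closest-segment-point axes (vertex vs body/cap).
    """
    seg = sub(c1, c0)
    axes = [normal]
    for i in range(len(poly)):
        edge = sub(poly[(i + 1) % len(poly)], poly[i])
        axes.append(cross(edge, seg))
        axes.append(sub(poly[i], closest_on_segment(poly[i], c0, c1)))
    for axis in axes:
        n = math.sqrt(dot(axis, axis))
        if n < 1e-9:           # degenerate axis (parallel features): skip
            continue
        axis = (axis[0]/n, axis[1]/n, axis[2]/n)
        pmin = min(dot(v, axis) for v in poly)
        pmax = max(dot(v, axis) for v in poly)
        cmin = min(dot(c0, axis), dot(c1, axis)) - radius
        cmax = max(dot(c0, axis), dot(c1, axis)) + radius
        if pmax < cmin or cmax < pmin:
            return True
    return False
```

For example, a unit triangle in the z=0 plane is separated from a capsule hovering five units above it, but not from one skewered through its interior.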
From: aick <inderau@gm...>  2004-08-23 21:39:37

Well, I think that without further constraints your problem will not be solvable analytically. One solution could be:

vy = (h1 - (g/2)*(s1/vx)^2) * (vx/s1)

where s1 is the horizontal distance at which your ball hits the ground. We could assume that s1 depends on the restitution k and on s (I'm not sure if this is physically correct):

s = s1 + k*s1

from which we get:

s1 = s/(1+k)

OK, let's go over how we got there. Constant-acceleration motion is:

s = (a/2)*t^2 + v0*t + s0   (1)

Constant-velocity motion is:

s = v*t   (2)

For the downward motion we use (1), for the forward motion (2), so we get:

0 = (g/2)*t^2 + vy*t - h1

Solving for vy and using (2) to calculate t leads to the result formula. If we try to calculate s1 directly we would get two formulas, since we solve a quadratic polynomial (one for where the ball goes up and one for where the ball comes down):

s1 = s - 1/2*(2*k*vy + 2*sqrt(k^2*vy^2 + 2*g*h2))*vx/g   (3)
s1 = s - 1/2*(2*k*vy - 2*sqrt(k^2*vy^2 + 2*g*h2))*vx/g   (4)

And here we see the problem: in both, s1 depends on vy! If we inserted our original formula into (3) or (4) we could not solve it without complex numbers and so on. So I think our assumption for s1 should be just fine. Maybe someone else finds a better solution; I'm not a math expert.

Cheers,
aick
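Since the closed form above rests on the s1 = s/(1+k) approximation, a numerical alternative is to root-find vy directly on the exact piecewise trajectory. The sketch below is my own, not from the thread; it uses an upward-positive vy convention (unlike aick's derivation), assumes the ground is at y = 0 and at most one bounce before reaching s, and the example numbers are made up.

```python
import math

def height_at_s(vy, vx, h1, s, k, g=9.81):
    """Height of the ball at horizontal distance s, with one bounce.

    Upward-positive vy; the ball starts at height h1, flies a parabola,
    bounces once with restitution k, then continues on a second parabola.
    """
    # Time and distance of the first ground impact (y = 0).
    t1 = (vy + math.sqrt(vy * vy + 2.0 * g * h1)) / g
    s1 = vx * t1
    if s1 >= s:  # reaches s before bouncing: still on the first parabola
        t = s / vx
        return h1 + vy * t - 0.5 * g * t * t
    # Vertical speed just after the bounce (impact speed scaled by k).
    vy2 = k * (g * t1 - vy)
    t2 = (s - s1) / vx
    return vy2 * t2 - 0.5 * g * t2 * t2

def solve_vy(vx, h1, h2, s, k, lo=0.0, hi=50.0, g=9.81, iters=80):
    """Bisect for a vy whose trajectory passes through (s, h2)."""
    f = lambda vy: height_at_s(vy, vx, h1, s, k, g) - h2
    assert f(lo) < 0.0 < f(hi), "bracket does not straddle a solution"
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if f(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Under example numbers like vx=5, h1=2, h2=1, s=10, k=0.8 this converges to a single-bounce solution; for other inputs the bracket may need adjusting, and multi-bounce trajectories are not handled.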
From: Andrew Finkenstadt <andyf@si...>  2004-08-23 20:55:05

Is there any accounting for drag through the medium that would impede forward progress (delta Vx, delta Vy)? Must there be a bounce? Can there be more than one bounce to reach h2 at x? Is the collision perfectly elastic, or does the 'k' coefficient account for its inelasticity? Is the ball treated as a point, or does it have actual width with which to collide with the floor?

-----Original Message-----
From: gdalgorithms-list-admin@... [mailto:gdalgorithms-list-admin@...] On Behalf Of Zafar Qamar
Sent: Monday, August 23, 2004 12:07 PM
To: gdalgorithms-list@...
Subject: [Algorithms] Simple Projectile Motion

Dear All,

Can anyone solve this? It seemed easy at first, but maybe I'm going wrong somewhere in my approach.

Referring to the small attached pic: I wish to throw a ball from height h1 with velocity (Vx,Vy) (Vx being a given constant) such that the Y-velocity bounces with a coefficient of restitution k (around 0.8) and manages to reach height h2 after travelling a total distance of s.

So, the constants are Vx, s, h1, h2 and g (gravity), and the only variable is Vy. Can anyone calculate Vy?

Many thanks in advance,
Zafar Qamar
Swordfish Studios

__________________________________________________________
This mail has been scanned for all known viruses by UUNET delivered through the MessageLabs Virus Control Centre.
From: Charles Bloom <cb2@cb...>  2004-08-23 19:33:29

First of all, is there any prescription for finding the minimum set of axes needed in a SAT test? It seems very ad hoc.

Secondly, I'm trying to do a SAT test for convex-planar-poly (in 3d) vs. LSS (Line-Swept-Sphere, or Capsule). I think that the axes needed are:

poly normal
poly edge perpendicular normals (edge direction cross poly normal)
LSS axis
LSS axis cross poly normal

The other obvious axes are:

poly edges cross LSS axis
poly edge perpendicular normals cross LSS axis

but I don't think these are actually needed (?), but I have no idea how to prove that.

Charles Bloom email "cb" http://www.cbloom.com
From: Greg Bakker <GregB@cs...>  2004-08-23 19:17:25

It depends on the situation. For a toolchain or PC application, Judy arrays make a lot of sense. In a limited-memory environment you need to be careful about when and from where you do your allocations. It simplifies matters to statically engineer them such that the index tables may live in a fixed-size scope. So in that case the single 'malloc' inside the Judy library could trip you up, and a fixed-size hash table may work out better.

(Disclaimer: I lack Richard's sense of adventure, and haven't tried implementing my own version of Judy for fun. Good to see the technique validated independently, though.)

Greg

> -----Original Message-----
> From: Richard Fabian [mailto:algorithms@...]
> Sent: 23 August 2004 16:39
> To: gdalgorithms-list@...
> Subject: RE: [Algorithms] Hashing arbitary data/pointers
>
> Judy arrays.
>
> http://judy.sourceforge.net/
>
> After having read the long doc on this tech, I took it upon myself as a
> programming exercise to write this functionality into a template...
>
> After I got it running, I was deadly impressed, especially as it was
> faster than a hash table for the worst case, and was about the same as
> hash for the best case.
>
> My implementation does not handle null or single nodes, nor !=4-deep
> trees, so it uses more memory for small-size tables... but I have used
> it in lots of places (it's my current hammer). When I needed to know what
> values were associated with certain pointers (large platform-dependent
> range) I just use one of these. It's VERY fast... doesn't cache-miss much
> either. It's quite small (for a completely open 32-bit key pair system),
> and it's quite quick to write... (only took me a week to get my dodgy
> version running)... just implement the parts you particularly like about
> it.
>
> --
> On the other hand
> I may be talking
> complete drivel.
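The fixed-size, allocation-free alternative Greg describes can be sketched as an open-addressing table whose storage is all allocated up front. This is my own toy illustration (class name, hash constant, and API are arbitrary, not from the thread); pointer values tend to share zeroed low bits, which is why a multiplicative hash is used before probing.

```python
class FixedPtrHash:
    """Fixed-capacity open-addressing hash: integer key -> value.

    All storage is allocated at construction, so insert/get never
    allocate afterwards (the property wanted on a console).
    """
    def __init__(self, capacity):
        self.keys = [None] * capacity
        self.vals = [None] * capacity
        self.capacity = capacity

    def _probe(self, key):
        # Multiplicative hash (Knuth's 32-bit golden-ratio constant)
        # scatters pointer-like keys, then linear probing resolves
        # collisions.
        i = (key * 2654435761) % self.capacity
        for _ in range(self.capacity):
            if self.keys[i] is None or self.keys[i] == key:
                return i
            i = (i + 1) % self.capacity
        raise RuntimeError("table full")

    def insert(self, key, value):
        i = self._probe(key)
        self.keys[i] = key
        self.vals[i] = value

    def get(self, key):
        i = self._probe(key)
        return self.vals[i] if self.keys[i] == key else None
```

Keeping the table well under capacity keeps probe chains short; a real version would also need deletion (e.g. tombstones), which this sketch omits.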
From: Zafar Qamar <zafar.qamar@sw...>  2004-08-23 17:03:30

Dear All,

Can anyone solve this? It seemed easy at first, but maybe I'm going wrong somewhere in my approach.

Referring to the small attached pic: I wish to throw a ball from height h1 with velocity (Vx,Vy) (Vx being a given constant) such that the Y-velocity bounces with a coefficient of restitution k (around 0.8) and manages to reach height h2 after travelling a total distance of s.

So, the constants are Vx, s, h1, h2 and g (gravity), and the only variable is Vy. Can anyone calculate Vy?

Many thanks in advance,
Zafar Qamar
Swordfish Studios
From: Jonathan Blow <jon@nu...>  2004-08-23 16:29:48

> Indeed - this is essentially a version of Jon Blow's predicate
> database from GD mag, extended for general property storage. It's
> turned out to be immensely handy for organising all sorts of things in
> the perl mockups I've been playing with, so I'm trying to see if I
> can work it into our game engine.

What I tend to do for this stuff is essentially what Jon Watte posted. Except that I don't like C++ inheritance all that much, so if there's a small set of well-defined types that I won't be expanding very often, I will do the hash generation in a more compact and hardcoded way. But it's basically the same thing.
From: Richard Fabian <algorithms@th...>  2004-08-23 15:39:43

Judy arrays.

http://judy.sourceforge.net/

After having read the long doc on this tech, I took it upon myself as a programming exercise to write this functionality into a template...

After I got it running, I was deadly impressed, especially as it was faster than a hash table for the worst case, and was about the same as hash for the best case.

My implementation does not handle null or single nodes, nor !=4-deep trees, so it uses more memory for small-size tables... but I have used it in lots of places (it's my current hammer). When I needed to know what values were associated with certain pointers (large platform-dependent range) I just use one of these. It's VERY fast... doesn't cache-miss much either. It's quite small (for a completely open 32-bit key pair system), and it's quite quick to write... (only took me a week to get my dodgy version running)... just implement the parts you particularly like about it.

--
On the other hand
I may be talking
complete drivel.

> -----Original Message-----
> From: gdalgorithms-list-admin@...
> [mailto:gdalgorithms-list-admin@...] On Behalf Of cruise
> Sent: 20 August 2004 04:10 PM
> To: gdalgorithms-list@...
> Subject: [Algorithms] Hashing arbitary data/pointers
>
> I have a collection of key/value pairs, and naturally it makes sense to
> have some kind of hash (or other kind of quick lookup) on the key - but
> the key can be any kind of data: string, int, float, pointer to a class,
> etc. This lookup will be used /a lot/, so the faster the better. My first
> thought is to store the data as a void*, and do some kind of octree or
> hash on the pointer value. Does this sound tractable, or am I attempting
> something completely stupid with this?
>
> --
> [ cruise / casualtempest.net / transference.org ]
> "quantam sufficit"
>
> ---
> Incoming mail is certified Virus Free.
> Checked by AVG anti-virus system (http://www.grisoft.com).
> Version: 6.0.742 / Virus Database: 495 - Release Date: 19/08/2004

---
Outgoing mail is certified Virus Free.
Checked by AVG anti-virus system (http://www.grisoft.com).
Version: 6.0.742 / Virus Database: 495 - Release Date: 19/08/2004
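The fixed-depth special case Richard describes (always four levels, no compressed node types) can be sketched in a few lines. This toy version is my own, not his template: it splits a 32-bit key into four bytes and walks one 256-entry table per byte, trading memory for predictable, hash-free lookups with at most four indexed loads.

```python
class RadixMap32:
    """Fixed-depth radix map: 32-bit key -> value, one byte per level.

    A toy analogue of a 4-deep, 256-way Judy-style trie with no node
    compression; sparse small tables waste memory, as noted above.
    """
    def __init__(self):
        self.root = [None] * 256

    def insert(self, key, value):
        node = self.root
        for shift in (24, 16, 8):       # walk/create three inner levels
            b = (key >> shift) & 0xFF
            if node[b] is None:
                node[b] = [None] * 256
            node = node[b]
        node[key & 0xFF] = value        # leaf level holds the value

    def get(self, key):
        node = self.root
        for shift in (24, 16, 8):
            node = node[(key >> shift) & 0xFF]
            if node is None:            # prefix not present
                return None
        return node[key & 0xFF]
```

Lookups touch at most four arrays regardless of how many keys are stored, which is why this kind of structure behaves well for worst-case queries where chained hash tables degrade.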
From: <Simon_Brown@sc...>  2004-08-23 08:22:42

Reducing to linear segments can sometimes save memory (for the same error tolerance), so having a scheme that can choose between the two at export time may well be worth implementing. If you already have a curve-fitting implementation then it should be straightforward to swap out the cubic solver for a linear one... ;)

Otherwise it's all compression over the top of your existing data (for which zlib over quantised data is probably very sensible). As Mark mentions, if you're using any sort of funky polynomial coefficients you will want to change to something like Bezier ones where you're expressing local values and tangents, since they will quantise much more effectively due to the range constraints.

Ta,
Simon

Simon Brown
Sony Computer Entertainment Europe
http://www.scee.com

gdalgorithms-list-admin@... wrote on 20/08/2004 22:51:55:

> Any ideas on animation compression?
>
> We currently use segmented polynomial approximations (I think it's cubic
> at the moment) for each channel of animation. But still it's not enough;
> we have massive numbers of animations (500+ for the main character's
> 'normal' movement), so even with fairly good curve fitting it's taking up
> much more space than I'm happy with (currently over 20+Mb for the main
> character's animations).
>
> There are some obvious things we can do (using simpler polynomial
> approximations (e.g. linear) for some animation segments) and perhaps
> keeping a zlib-based cache of recently used ones, but I'm hoping there is
> something else.
>
> We don't yet quantise the polynomial coefficients, simply because the
> only thing I remember from approximation theory class is that polynomial
> approximation is extremely ill-conditioned in its coefficients. And I'm
> too much of a coward to try :(
>
> Any ideas gratefully received, even if it's just "buy a bigger stick to
> bash the animators and designers with".
> Bye,
> Deano

**********************************************************************
This email and any files transmitted with it are confidential and intended solely for the use of the individual or entity to whom they are addressed. If you have received this email in error please notify postmaster@...
This footnote also confirms that this email message has been checked for all known viruses.
**********************************************************************
SCEE 2004
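Simon's point about Bezier coefficients quantising better can be illustrated with the standard power-basis to Bernstein-basis conversion. The sketch below is my own (not from the thread) and assumes each animation segment is a cubic over t in [0, 1]; the resulting control points lie near the curve's actual values, so their ranges are bounded by the data rather than blowing up the way raw polynomial coefficients can.

```python
def cubic_to_bezier(a0, a1, a2, a3):
    """Convert p(t) = a0 + a1*t + a2*t^2 + a3*t^3 (t in [0, 1]) into
    the four control points of the identical cubic Bezier curve."""
    p0 = a0
    p1 = a0 + a1 / 3.0
    p2 = a0 + 2.0 * a1 / 3.0 + a2 / 3.0
    p3 = a0 + a1 + a2 + a3
    return p0, p1, p2, p3

def eval_bezier(p0, p1, p2, p3, t):
    """De Casteljau evaluation of a cubic Bezier at parameter t."""
    q0 = p0 + (p1 - p0) * t
    q1 = p1 + (p2 - p1) * t
    q2 = p2 + (p3 - p2) * t
    r0 = q0 + (q1 - q0) * t
    r1 = q1 + (q2 - q1) * t
    return r0 + (r1 - r0) * t
```

Because p0 and p3 are the segment's end values and p1, p2 encode the end tangents locally, quantising the control points perturbs the curve only locally and proportionally, which is the range-constraint advantage mentioned above.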