From: Jason Zisk <ziskj@n...>  2000-11-02 19:05:52

To be absolutely clear. :) I use the same 4x4 matrix I send to D3D, the
"world matrix". I would call this the local->world matrix. Transpose that,
multiply my direction vector by it, and it works great. Terminology
sucks. :)

- Jason Zisk
- nFusion Interactive LLC

----- Original Message -----
From: "Zhang Zhong Shan" <ZhangZhongShan@...>
To: <gdalgorithmslist@...>
Sent: Wednesday, November 01, 2000 9:03 PM
Subject: RE: [Algorithms] Correct Way to transform direction vector

> Are you using the transpose of world->local, or the transpose of
> local->world?
>
> I think it's the latter that you are using. The local->world matrix is
> the reverse of world->local, so you are actually applying the rule
> without noticing.
>
> This can't be wrong, I can bet my left hand on this. ^_^
>
> Zhongshan
>
> ----- Original Message -----
> From: Jason Zisk [mailto:ziskj@...]
> Sent: Thursday, November 02, 2000 1:26 AM
> To: gdalgorithmslist@...
> Subject: Re: [Algorithms] Correct Way to transform direction vector
>
> This doesn't work, at least it didn't for me. I had to use the transpose
> of the matrix that transforms the vertex, no inverse at all.
>
> If I use the inverse transpose (or transpose of the inverse) of the
> world->local matrix, it does position and scaling correctly but the
> rotation comes out all wrong. By not inverting the matrix, everything
> works.
>
> I'm working with full 4x4 transformation matrices, the same ones you'd
> send to D3D or OpenGL to tell it what to render.
>
> - Jason Zisk
> - nFusion Interactive LLC
>
> ----- Original Message -----
> From: "Zhang Zhong Shan" <ZhangZhongShan@...>
> To: <gdalgorithmslist@...>
> Sent: Wednesday, November 01, 2000 3:05 AM
> Subject: RE: [Algorithms] Correct Way to transform direction vector
>
> > Hi Jason,
> >
> > The rule is: to transform direction vectors, use the INVERSE TRANSPOSE
> > OF THE MATRIX THAT TRANSFORMS THE VERTEX.
> >
> > In your case, you should use the transpose of the inverted
> > world->local matrix, which is exactly the transpose of your
> > local->world matrix.
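[The rule above can be sanity-checked numerically. The sketch below is not
part of the original thread; it assumes column vectors and a purely diagonal
local->world matrix M, so world->local is M^-1 and every inverse is trivial
to write down. Vertices go world->local by M^-1, so per the rule the normal
should go by the inverse transpose of M^-1, which is M^T, i.e. the transpose
of the local->world matrix.]

```python
# Sketch (not from the thread): check the inverse-transpose rule with a
# non-uniform scale. Assumes column vectors and a diagonal local->world
# matrix M, so world->local is M^-1 and all inverses are trivial.

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def normalize(v):
    length = sum(c*c for c in v) ** 0.5
    return tuple(c / length for c in v)

def diag_mul(d, v):                      # diagonal matrix times vector
    return tuple(s*c for s, c in zip(d, v))

M     = (2.0, 1.0, 1.0)                  # local->world: non-uniform scale
M_inv = (0.5, 1.0, 1.0)                  # world->local: transforms vertices

# A world-space triangle and its face normal from the edge cross product
p0, p1, p2 = (0.0, 0.0, 0.0), (0.0, 1.0, 0.0), (1.0, 0.0, -1.0)
n_world = cross((0.0, 1.0, 0.0), (1.0, 0.0, -1.0))     # (-1, 0, -1)

# Ground truth: transform the vertices world->local, recompute the normal
q0, q1, q2 = (diag_mul(M_inv, p) for p in (p0, p1, p2))
e1 = tuple(b - a for a, b in zip(q0, q1))
e2 = tuple(b - a for a, b in zip(q0, q2))
n_true = normalize(cross(e1, e2))

# The rule: use the inverse transpose of the vertex transform M^-1,
# which is M^T; for a diagonal matrix that is just M again.
n_rule  = normalize(diag_mul(M, n_world))
# Transforming the normal like a point (by M^-1 itself) is wrong:
n_wrong = normalize(diag_mul(M_inv, n_world))

assert all(abs(a - b) < 1e-9 for a, b in zip(n_true, n_rule))
assert any(abs(a - b) > 1e-3 for a, b in zip(n_true, n_wrong))
```

[The asserts pass: under the non-uniform scale, only the inverse-transpose
result matches the normal recomputed from the transformed geometry.]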
> >
> > btw, getting normals by first transforming the 2 vertices does not
> > work when there is a reflection in the transformation.
> >
> > ----- Original Message -----
> > From: Jason Zisk [mailto:ziskj@...]
> > Sent: Wednesday, November 01, 2000 3:41 AM
> > To: Algorithms List
> > Subject: Re: [Algorithms] Correct Way to transform direction vector
> >
> > Well, I know for sure that transforming 2 points doesn't work if the
> > matrix contains scaling. I'm not entirely sure why; I thought I had it
> > figured out, but you are right, the change in direction is what you
> > want. All I know is that when I do this, if the matrix has scaling the
> > vector points in the wrong direction.
> >
> > I found out that to go from world space to object space you don't want
> > to multiply the direction vector by transpose(inverse); you just want
> > to use transpose(matrix).
> >
> > This makes sense, since I believe transforming normals was discussed
> > in reference to going from object space to world space.
> >
> > So the final outcome is: if you want to transform a direction vector
> > from one space to another, just multiply it by the transpose of the
> > transformation matrix. Now if only I could remember enough math to
> > know why. :)
> >
> > - Jason Zisk
> > - nFusion Interactive LLC
> >
> > ----- Original Message -----
> > From: "Peter Warden" <Peter.Warden@...>
> > To: <gdalgorithmslist@...>
> > Sent: Tuesday, October 31, 2000 9:21 AM
> > Subject: RE: [Algorithms] Correct Way to transform direction vector
> >
> > > Jason Zisk wrote:
> > > > Hey everyone. I'm having a problem transforming a direction vector
> > > > from one coordinate space to another. I need to transform the
> > > > direction of the ray from world space to object space so I can do
> > > > an intersection test on the triangles in a mesh.
> > > >
> > > > I've tried two things.
> > > > The first was taking two points on the line that the direction
> > > > vector forms, transforming those by the inverse matrix of the
> > > > object I'm trying to intersect with, then recreating the vector
> > > > from those two transformed points. That has problems if you have
> > > > scaling in the matrix, though: the direction of the vector can
> > > > change. So this solution is no good in my situation.
> > >
> > > Surely in this case you _want_ the direction to change if there's
> > > scaling. As a thought experiment in 2D, imagine you had a square
> > > centred on the origin with corners at (1,1), (1,-1), (-1,-1) and
> > > (-1,1). Now apply a local->world transform to take this shape into
> > > world space: a scaling of x = x*2. This leaves the corners at
> > > (2,1), (2,-1), (-2,-1) and (-2,1). Now put a ray into world space
> > > that starts at (0,2) and has a direction of (2,-1). This ray will
> > > touch the (2,1) corner of the square in world space. The inverse of
> > > the local->world transform for the square is x = x/2, and if we
> > > apply this to both the ray's origin and its direction vector, we
> > > end up with a ray at (0,2) with a direction of (1,-1) in the
> > > square's local space. This ray still kisses the same corner of the
> > > square, at local-space coordinates (1,1). If the direction _hadn't_
> > > been affected by the scaling, the ray in local space would miss the
> > > square, which isn't what you want!
> > >
> > > I'd say transforming the two points by the inverse matrix was the
> > > right way to tackle this; the alteration of the direction by
> > > scaling is needed in this case.
> > >
> > > > I looked back at the archives of this list and I noticed a
> > > > discussion of transforming normals. It seems that using the
> > > > transpose of the inverted transformation matrix is the right way
> > > > to transform a normal.
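[The 2D thought experiment above is easy to check directly. A small sketch,
not part of the thread, using the coordinates as given:]

```python
# Sketch of the 2D thought experiment: a square scaled by 2 in x, and a
# ray whose direction must be scaled along with its origin by the
# inverse transform, or the intersection is lost.

ox, oy = 0.0, 2.0        # ray origin, world space
dx, dy = 2.0, -1.0       # ray direction, world space

# In world space the ray reaches the scaled corner (2, 1) at t = 1
t = 1.0
assert (ox + t*dx, oy + t*dy) == (2.0, 1.0)

# The inverse of the local->world transform is x' = x/2; apply it to
# BOTH the origin and the direction
lox, loy = ox / 2, oy    # origin stays (0, 2)
ldx, ldy = dx / 2, dy    # direction becomes (1, -1)

# The transformed ray still kisses the original corner (1, 1)
assert (lox + t*ldx, loy + t*ldy) == (1.0, 1.0)

# If the direction were left unscaled, the ray would miss the square:
# while x = 2t is inside [-1, 1] (|t| <= 0.5), y = 2 - t stays above 1
for t10 in range(-5, 6):
    t = t10 / 10.0
    assert oy + t*dy > 1.0   # never dips down to the square's top edge
```

[With the unscaled direction the ray stays above y = 1 for every x inside
the square, so the local-space test would wrongly report a miss.]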
> > > > By doing this I actually solved some problems (with scaling) but
> > > > caused others with rotation.
> > >
> > > Surface normals are a different case, but I can't come up with an
> > > explanation as to why that I'm happy with! The closest I've come is
> > > that the normal is part of a plane definition, and so needs the
> > > rules of plane transformation applied to it, whereas a ray's
> > > direction vector isn't and doesn't. Implicit versus explicit
> > > representations? An authoritative answer from a maths bod would be
> > > nice...
> > >
> > > I take it you've seen the 'Abnormal Normals' article by Eric Haines
> > > at http://www.acm.org/tog/resources/RTNews/html/rtnews1a.html, and
> > > the response from David Rogers a few issues later?
> > >
> > > Peter Warden
> > >
> > > _______________________________________________
> > > GDAlgorithmslist mailing list
> > > GDAlgorithmslist@...
> > > http://lists.sourceforge.net/mailman/listinfo/gdalgorithmslist
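[Peter's "part of a plane definition" point, and Zhang's earlier caveat
about reflections, can both be made concrete. The sketch below is not from
the thread; it assumes column vectors and diagonal matrices so the inverses
stay trivial. The plane equation n . x = 0 survives x' = M x only when the
normal transforms by the inverse transpose of M, and under a reflection a
normal recomputed from transformed vertices picks up the sign of det(M).]

```python
# Sketch (not from the thread): two properties of normals.
# 1) The plane equation n . x = 0 is preserved under x' = M x only if
#    the normal transforms by the inverse transpose of M.
# 2) Under a reflection, recomputing the normal from transformed
#    vertices flips its orientation; the inverse transpose does not.
# Diagonal matrices are used throughout so inverses are trivial.

def dot(a, b):
    return sum(x*y for x, y in zip(a, b))

def diag_mul(d, v):                       # diagonal matrix times vector
    return tuple(s*c for s, c in zip(d, v))

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

# --- 1) plane equation under a non-uniform scale ---------------------
M, M_inv = (2.0, 1.0, 1.0), (0.5, 1.0, 1.0)
n = (1.0, 0.0, 1.0)                       # normal of a plane through origin
x = (1.0, 0.0, -1.0)                      # point on that plane: n . x == 0
xp = diag_mul(M, x)                       # transformed point
n_good = diag_mul(M_inv, n)               # inverse transpose (diag: = inverse)
n_bad  = diag_mul(M, n)                   # transformed like a point
assert dot(n, x) == 0.0
assert dot(n_good, xp) == 0.0             # still on the plane
assert dot(n_bad, xp) != 0.0              # plane equation broken

# --- 2) a reflection flips a vertex-recomputed normal ----------------
R = (-1.0, 1.0, 1.0)                      # reflection across the yz plane
p0, p1, p2 = (0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)
n0 = cross((1.0, 0.0, 0.0), (0.0, 1.0, 0.0))   # (0, 0, 1)
q0, q1, q2 = (diag_mul(R, p) for p in (p0, p1, p2))
e1 = tuple(b - a for a, b in zip(q0, q1))
e2 = tuple(b - a for a, b in zip(q0, q2))
n_from_verts = cross(e1, e2)              # flipped to (0, 0, -1)
n_inv_t = diag_mul(R, n0)                 # R is its own inverse transpose
assert n_from_verts[2] == -1.0
assert n_inv_t[2] == 1.0
```

[Part 2 is exactly the "getting normals by first transforming the 2
vertices does not work when there is reflection" caveat: the edge cross
product picks up the sign of the determinant, while the inverse transpose
keeps the orientation.]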