## RE: [Algorithms] Correct Way to transform direction vector


RE: [Algorithms] Correct Way to transform direction vector
From: Peter Warden - 2000-10-31 14:17:12
```
Jason Zisk wrote:
> Hey everyone. I'm having a problem transforming a direction vector
> from one coordinate space to another. I need to transform the
> direction of the ray from world space to object space so I can do an
> intersection on the triangles in a mesh.
>
> I've tried two things. The first was taking two points on the line
> that the direction vector forms, transforming those by the inverse
> matrix of the object I'm trying to intersect with, then recreating
> the vector from those two transformed points. That has problems if
> you have scaling in the matrix, though: the direction of the vector
> could change. So this solution is no good in my situation.

Surely in this case you _want_ the direction to change if there's
scaling. As a thought experiment in 2D, imagine you had a square
centred on the origin with corners at (-1,-1), (1,-1), (1,1) and
(-1,1). Now apply a local-to-world transform to take this shape into
world-space, applying a scaling of x = x*2. This leaves the corners at
(-2,-1), (2,-1), (2,1) and (-2,1). Now put a ray into world space that
starts at (0,2) and has a direction of (-2,-1). This ray will touch
the (-2,1) corner of the square in world-space. The inverse of the
local-to-world transform for the square is x = x/2, and if we apply
this to both the ray's origin and to its direction vector, we end up
with a ray at (0,2) with a direction of (-1,-1) in the square's local
space. This ray still kisses the same corner of the square, at local
space coordinates (-1,1). If the direction _hadn't_ been affected by
the scaling, the ray in local space would miss the square, which isn't
what you want!

I'd say transforming the two points by the inverse matrix was the
right way to tackle this; the alteration of the direction by scaling
is needed in this case.

> I looked back at the archives of this list and I noticed a
> discussion of transforming normals. It seems that using the
> transpose of the inverted transformation matrix is the right way to
> transform a normal. By doing this I actually solved some problems
> (with scaling) but caused others with rotation.

Surface normals are a different case, but I can't come up with an
explanation as to why that I'm happy with! The closest I've come is
that the normal is part of a plane definition, and so needs the rules
of plane transformation applied to it, whereas a ray's direction
vector isn't and doesn't. Implicit versus explicit representations?
An authoritative answer from a maths bod would be nice...

I take it you've seen the 'Abnormal Normals' article by Eric Haines at
http://www.acm.org/tog/resources/RTNews/html/rtnews1a.html, and the
response from David Rogers a few issues later?

Peter Warden
```
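Peter's 2D thought experiment checks out numerically. Here is a minimal sketch (Python; the helper names are ours, not from the thread) showing that transforming *both* the ray origin and the ray direction by the inverse matrix keeps the ray touching the same corner in local space:

```python
# Verify Peter Warden's 2D example: a ray that grazes the scaled square's
# corner in world space still grazes the matching corner in local space
# when origin AND direction are transformed by the inverse matrix.

def apply(m, v):
    # 2x2 matrix times 2-vector
    return (m[0][0]*v[0] + m[0][1]*v[1],
            m[1][0]*v[0] + m[1][1]*v[1])

local_to_world = [[2.0, 0.0], [0.0, 1.0]]   # scale x by 2
world_to_local = [[0.5, 0.0], [0.0, 1.0]]   # its inverse: x = x/2

# Ray in world space: origin (0,2), direction (-2,-1); hits corner (-2,1) at t=1.
origin_w, dir_w = (0.0, 2.0), (-2.0, -1.0)
hit_w = (origin_w[0] + dir_w[0], origin_w[1] + dir_w[1])
assert hit_w == (-2.0, 1.0)

# Transform BOTH origin and direction by the inverse matrix.
origin_l = apply(world_to_local, origin_w)   # (0, 2)
dir_l    = apply(world_to_local, dir_w)      # (-1, -1)
hit_l = (origin_l[0] + dir_l[0], origin_l[1] + dir_l[1])
assert hit_l == (-1.0, 1.0)   # the same corner, in local coordinates
```

Note that the direction *does* change under the scaling, exactly as Peter argues it should.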
RE: [Algorithms] Correct Way to transform direction vector
From: Zhang Zhong Shan - 2000-11-01 08:04:29
```
hi jason,

The rule is: To transform direction vectors, use the INVERSE TRANSPOSE
OF THE MATRIX THAT TRANSFORMS THE VERTEX.

In your case, you should use the transpose of the inverted world-local
matrix, which is exactly the transpose of your local-world matrix.

btw, getting normals by first transforming the 2 vertices does not
work when there is a reflection in the transformation.

-----Original Message-----
From: Jason Zisk [mailto:ziskj@...]
Sent: Wednesday, November 01, 2000 3:41 AM
To: Algorithms List
Subject: Re: [Algorithms] Correct Way to transform direction vector

Well I know for sure that transforming 2 points doesn't work if the
matrix contains scaling. I'm not entirely sure why; I thought I had it
figured out, but you are right, the change in direction is what you
want. All I know is that when I do this, if the matrix has scaling
it's pointing in the wrong direction.

I found out that to go from world space to object space you don't want
to multiply the direction vector by transpose(inverse); you just want
to use transpose(matrix).

This makes sense, since I believe transforming normals was in
reference to going from object space to world space.

So the final outcome is: if you want to transform a direction vector
from one space to another, just multiply it by the transpose of the
transformation matrix. Now if I could remember enough math to know
why. :)

- Jason Zisk
- nFusion Interactive LLC
```
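The inverse-transpose rule Zhang states can be verified with a small numeric sketch (Python; 2D for brevity, names are illustrative): under a non-uniform scale, transforming a normal by the matrix itself breaks perpendicularity to the surface, while the inverse transpose preserves it.

```python
# Why normals need the inverse transpose: keep the normal perpendicular
# to the (transformed) surface tangent under non-uniform scaling.

def apply(m, v):
    return (m[0][0]*v[0] + m[0][1]*v[1],
            m[1][0]*v[0] + m[1][1]*v[1])

def dot(a, b):
    return a[0]*b[0] + a[1]*b[1]

M       = [[2.0, 0.0], [0.0, 1.0]]   # local-to-world: scale x by 2
M_inv_T = [[0.5, 0.0], [0.0, 1.0]]   # inverse transpose (diagonal, so
                                     # transposing is a no-op here)

tangent = (1.0, 1.0)                 # a surface direction in local space
normal  = (-1.0, 1.0)                # perpendicular to it
assert dot(tangent, normal) == 0.0

tangent_w = apply(M, tangent)        # (2, 1): tangents transform by M itself

naive = apply(M, normal)             # (-2, 1): no longer perpendicular!
assert dot(tangent_w, naive) != 0.0

correct = apply(M_inv_T, normal)     # (-0.5, 1): perpendicularity preserved
assert dot(tangent_w, correct) == 0.0
```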
Re: [Algorithms] Correct Way to transform direction vector
From: Neal Tringham - 2000-11-01 11:28:05
```
From: Zhang Zhong Shan
> The rule is: To transform direction vectors, use the INVERSE
> TRANSPOSE OF THE MATRIX THAT TRANSFORMS VERTEX.
>
> In your case, you should use the transpose of inversed world-local
> matrix, which is exactly the transpose of your local-world matrix.

Presumably this is only valid when the transpose operation is
equivalent to the inverse, i.e. when the matrix does not contain
scaling / mirroring information?

Although I suspect it's obvious, I suppose it might also be worth
checking that we're talking about the 3x3 matrices here, i.e. the
translation component of the 4x4 is not necessary.

Neal Tringham (Sick Puppies / Empire Interactive)

neal@...
neal@...
```
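Neal's 3x3-versus-4x4 point can be made concrete: in homogeneous coordinates a direction has w = 0, so the translation column of a 4x4 matrix never contributes, and only the upper-left 3x3 acts on it. A small sketch (Python; illustrative names):

```python
# With w = 0, a 4x4 transform ignores its translation column, so using
# the full 4x4 on a direction is equivalent to using just the 3x3 part.

def apply4(m, v):
    return tuple(sum(m[r][c] * v[c] for c in range(4)) for r in range(4))

# Rotation of 90 degrees about z in the 3x3 part, plus a large translation.
M = [[0.0, -1.0, 0.0, 10.0],
     [1.0,  0.0, 0.0, 20.0],
     [0.0,  0.0, 1.0, 30.0],
     [0.0,  0.0, 0.0,  1.0]]

point     = (1.0, 0.0, 0.0, 1.0)   # w = 1: translation applies
direction = (1.0, 0.0, 0.0, 0.0)   # w = 0: translation drops out

assert apply4(M, point)     == (10.0, 21.0, 30.0, 1.0)
assert apply4(M, direction) == (0.0, 1.0, 0.0, 0.0)   # rotated only
```

Neal's other caveat also holds: transpose(M) only equals inverse(M) for a pure rotation (an orthonormal 3x3); with scaling or mirroring present they differ.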
Re: [Algorithms] Correct Way to transform direction vector
From: Jason Zisk - 2000-11-01 17:25:13
```
This doesn't work, at least it didn't for me. I had to use the
transpose of the matrix that transforms the vertex, no inverse at all.

If I use the inverse transpose (or transpose inverse) of the
world-local matrix it would do position and scaling correctly, but the
rotation would be all wrong. By not inverting the matrix, all would
work.

I'm working with full 4x4 transformation matrices, the same ones you'd
send to D3D or OGL to tell it what to render.

- Jason Zisk
- nFusion Interactive LLC

----- Original Message -----
From: "Zhang Zhong Shan"
Sent: Wednesday, November 01, 2000 3:05 AM
Subject: RE: [Algorithms] Correct Way to transform direction vector

> hi jason,
>
> The rule is: To transform direction vectors, use the INVERSE
> TRANSPOSE OF THE MATRIX THAT TRANSFORMS VERTEX.
>
> In your case, you should use the transpose of inversed world-local
> matrix, which is exactly the transpose of your local-world matrix.
>
> btw, getting normals by first transforming the 2 vertices does not
> work when there is reflection in transformation.
```
RE: [Algorithms] Correct Way to transform direction vector
From: Zhang Zhong Shan - 2000-11-02 02:02:09
```
Are you using the transpose of world-local or the transpose of
local-world?

I think it's the latter that you are using. The local-world is the
reverse of world-local, so you are actually applying the rule without
noticing.

This can't be wrong; I can bet my left hand on this. ^_^

zhongshan

-----Original Message-----
From: Jason Zisk [mailto:ziskj@...]
Sent: Thursday, November 02, 2000 1:26 AM
To: gdalgorithms-list@...
Subject: Re: [Algorithms] Correct Way to transform direction vector

This doesn't work, at least it didn't for me. I had to use the
transpose of the matrix that transforms the vertex, no inverse at all.

If I use the inverse transpose (or transpose inverse) of the
world-local matrix it would do position and scaling correctly but the
rotation would be all wrong. By not inverting the matrix all would
work.

I'm working with full 4x4 transformation matrices, the same ones you'd
send to D3D or OGL to tell it what to render.

- Jason Zisk
- nFusion Interactive LLC
```
Re: [Algorithms] Correct Way to transform direction vector
From: Jason Zisk - 2000-11-02 19:05:52
```
To be absolutely clear. :) I use the same 4x4 matrix I send to D3D,
the "world matrix". I would call this the local-to-world matrix.
Transpose that, multiply my direction vector by it, and it works
great.

Terminology sucks. :)

- Jason Zisk
- nFusion Interactive LLC

----- Original Message -----
From: "Zhang Zhong Shan"
Sent: Wednesday, November 01, 2000 9:03 PM
Subject: RE: [Algorithms] Correct Way to transform direction vector

> are you using the transpose of world-local or the transpose of
> local-world?
>
> i think its the later that you are using, the local-world is the
> revsers of world-local, so you are accturely applying the rule
> without noticing.
>
> this cant be wrong, I can bet my left hand on this. ^_^
```
RE: [Algorithms] Correct Way to transform direction vector
From: Zhang Zhong Shan - 2000-11-03 03:02:32

```I will try to be absolutely clear too. :-)

You are right in using the transpose of the local2world matrix, because it is the inverse transpose of the world2local matrix. There is nothing wrong with the inverse transpose rule. Maybe my English sucks. :-)

zhongshan

-----Original Message-----
From: Jason Zisk [mailto:ziskj@...]
Sent: Friday, November 03, 2000 3:07 AM
To: gdalgorithms-list@...
Subject: Re: [Algorithms] Correct Way to transform direction vector

To be absolutely clear. :)

I use the same 4x4 matrix I send D3D, the "world matrix". I would call this the local-to-world matrix. Transpose that, multiply my direction vector by it, and it works great.

Terminology sucks. :)

- Jason Zisk
- nFusion Interactive LLC

----- Original Message -----
From: "Zhang Zhong Shan"
To:
Sent: Wednesday, November 01, 2000 9:03 PM
Subject: RE: [Algorithms] Correct Way to transform direction vector

> are you using the transpose of world-local or the transpose of local-world?
>
> I think it's the latter that you are using; local-world is the reverse of world-local, so you are actually applying the rule without noticing.
>
> This can't be wrong, I can bet my left hand on this. ^_^
>
> zhongshan
>
> -----Original Message-----
> From: Jason Zisk [mailto:ziskj@...]
> Sent: Thursday, November 02, 2000 1:26 AM
> To: gdalgorithms-list@...
> Subject: Re: [Algorithms] Correct Way to transform direction vector
>
> This doesn't work, at least it didn't for me. I had to use the transpose of the matrix that transforms the vertex, no inverse at all.
>
> If I use the inverse transpose (or transpose of the inverse) of the world-local matrix, it would do position and scaling correctly, but the rotation would be all wrong. By not inverting the matrix, all would work.
>
> I'm working with full 4x4 transformation matrices, the same ones you'd send to D3D or OGL to tell it what to render.
>
> - Jason Zisk
> - nFusion Interactive LLC
_______________________________________________
GDAlgorithms-list mailing list
GDAlgorithms-list@...
http://lists.sourceforge.net/mailman/listinfo/gdalgorithms-list ```
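The identity the thread settles on, that the transpose of local2world is exactly the inverse transpose of world2local, is easy to check numerically. A minimal sketch (Python; the matrix helpers and the example rotation/scale are mine, column-vector convention):

```python
import math

def matmul(a, b):
    """Multiply two 3x3 row-major matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def transpose(m):
    return [[m[j][i] for j in range(3)] for i in range(3)]

def cofactor(m, i, j):
    # Cyclic indexing folds the (-1)^(i+j) sign into the term order.
    return (m[(i + 1) % 3][(j + 1) % 3] * m[(i + 2) % 3][(j + 2) % 3]
          - m[(i + 1) % 3][(j + 2) % 3] * m[(i + 2) % 3][(j + 1) % 3])

def inverse(m):
    """3x3 inverse via the adjugate (transposed cofactor matrix)."""
    det = sum(m[0][j] * cofactor(m, 0, j) for j in range(3))
    return [[cofactor(m, j, i) / det for j in range(3)] for i in range(3)]

# local2world: rotate 30 degrees about Z, then stretch x by 2 (not orthogonal).
c, s = math.cos(math.radians(30)), math.sin(math.radians(30))
rot   = [[c, -s, 0], [s, c, 0], [0, 0, 1]]
scale = [[2, 0, 0], [0, 1, 0], [0, 0, 1]]
local2world = matmul(scale, rot)
world2local = inverse(local2world)

# The transpose of local2world IS the inverse transpose of world2local.
lhs = transpose(local2world)
rhs = transpose(inverse(world2local))
assert all(abs(lhs[i][j] - rhs[i][j]) < 1e-9
           for i in range(3) for j in range(3))
```

So both posters are describing the same matrix, which is why the argument dissolves once the terminology is pinned down.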
Re: [Algorithms] Correct Way to transform direction vector
From: Jason Zisk - 2000-10-31 19:39:43

```Well I know for sure that transforming 2 points doesn't work if the matrix contains scaling. I'm not entirely sure why; I thought I had it figured out, but you are right, the change in direction is what you want. All I know is that when I do this, if the matrix has scaling it's pointing in the wrong direction.

I found out that to go from world space to object space you don't want to multiply the dir. vector by transpose(inverse), you just want to use transpose(matrix).

This makes sense, since I believe transforming normals was in reference to going from object space to world space.

So the final outcome is: if you want to transform a direction vector from one space to another, just multiply it by the transpose of the transformation matrix. Now if I could remember enough math to know why. :)

- Jason Zisk
- nFusion Interactive LLC

----- Original Message -----
From: "Peter Warden"
To:
Sent: Tuesday, October 31, 2000 9:21 AM
Subject: RE: [Algorithms] Correct Way to transform direction vector

> Jason Zisk wrote:
> > Hey everyone. I'm having a problem transforming a direction vector from one coordinate space to another. I need to transform the direction of the ray from world space to object space so I can do an intersection on the triangles in a mesh.
> >
> > I've tried two things. The first was taking two points on the line that the direction vector forms, transforming those by the inverse matrix of the object I'm trying to intersect with, then recreating the vector from those two transformed points. That has problems if you have scaling in the matrix, though; the direction of the vector could change. So this solution is no good in my situation.
>
> Surely in this case you _want_ the direction to change if there's scaling.
> As a thought experiment in 2D, imagine you had a square centred on the origin with corners at (-1,-1), (1,-1), (1,1) and (-1,1). Now apply a local-to-world transform to take this shape into world space: a scaling of x = x*2. This leaves the corners at (-2,-1), (2,-1), (2,1) and (-2,1). Now, put a ray into world space that starts at (0,2) and has a direction of (-2,-1). This ray will touch the (-2,1) corner of the square in world space. The inverse of the local-to-world transform for the square is x = x/2, and if we apply this to both the ray's origin and to its direction vector, we end up with a ray at (0,2) with a direction of (-1,-1) in the square's local space. This ray still kisses the same corner of the square, at local-space coordinates (-1,1). If the direction _hadn't_ been affected by the scaling, the ray in local space would miss the square, which isn't what you want!
>
> I'd say transforming the two points by the inverse matrix was the right way to tackle this; the alteration of the direction by scaling is needed in this case.
>
> > I looked back at the archives of this list and I noticed a discussion of transforming normals. It seems that using the transpose of the inverted transformation matrix is the right way to transform a normal. By doing this I actually solved some problems (with scaling) but caused others with rotation.
>
> Surface normals are a different case, but I can't come up with an explanation as to why that I'm happy with! The closest I've come is that the normal is part of a plane definition, and so needs the rules of plane transformation applied to it, whereas a ray's direction vector isn't and doesn't. Implicit versus explicit representations? An authoritative answer from a maths bod would be nice...
>
> I take it you've seen the 'Abnormal Normals' article by Eric Haines at http://www.acm.org/tog/resources/RTNews/html/rtnews1a.html, and the response from David Rogers a few issues later?
>
> Peter Warden
> _______________________________________________
> GDAlgorithms-list mailing list
> GDAlgorithms-list@...
> http://lists.sourceforge.net/mailman/listinfo/gdalgorithms-list ```
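Peter's 2D thought experiment above can be verified directly in code. The sketch below (Python; the ray helper is my own throwaway, not from the thread) applies the inverse transform x -> x/2 to both the ray's origin and its direction and checks that the ray still passes through the matching corner in local space, while the unscaled direction would miss it:

```python
def point_on_ray(origin, direction, target, eps=1e-9):
    """True if the ray origin + t*direction (t >= 0) passes through target."""
    ts = []
    for o, d, p in zip(origin, direction, target):
        if abs(d) < eps:
            if abs(p - o) > eps:   # no motion on this axis, must already match
                return False
        else:
            ts.append((p - o) / d)
    # Every axis must agree on the same non-negative parameter t.
    return bool(ts) and ts[0] >= 0 and all(abs(t - ts[0]) < eps for t in ts)

# World space: the scaled square's corner is (-2, 1); ray from (0, 2) along (-2, -1).
assert point_on_ray((0, 2), (-2, -1), (-2, 1))

# Local space: apply the inverse transform x -> x/2 to origin AND direction.
origin_local    = (0 / 2, 2)     # stays (0, 2)
direction_local = (-2 / 2, -1)   # becomes (-1, -1)
assert point_on_ray(origin_local, direction_local, (-1, 1))

# If the direction had NOT been scaled, the local-space ray misses the corner.
assert not point_on_ray(origin_local, (-2, -1), (-1, 1))
```

The last assertion is the crux: without scaling the direction, the t values on the two axes disagree (0.5 vs 1.0), so the ray no longer touches the corner.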
Re: [Algorithms] Correct Way to transform direction vector
From: Jim Offerman - 2000-10-31 23:49:53

```Shouldn't this also work:

Vector3 dirWorld3 = /* three component direction vector of the ray in world space */
Vector4 dirWorld4 = { dirWorld3.x, dirWorld3.y, dirWorld3.z, 0 }

Vector4 dirLocal4 = dirWorld4 * worldToLocal;

(Where worldToLocal is the inverse of the localToWorld matrix, or, if localToWorld is orthogonal, the transpose of localToWorld, since the transpose will be equal to the inverse in that case.)

The idea being that since you are transforming a direction vector and not a point, it should not be affected by any translations that might be present in the localToWorld matrix.

Performance-wise, you are probably better off grabbing the 3x3 orientation (and scale) submatrix and using that with the 3-component version of the direction vector.

Jim Offerman
Innovade
The Atlantis Project
http://www.theatlantisproject.com ```
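The w = 0 trick works because the translation part of a 4x4 matrix is multiplied by w, so a zero w drops translation out entirely. Jim's snippet multiplies a row vector on the left, D3D style; the sketch below (Python, names mine) uses the transposed column-vector convention, where translation sits in the last column, but the w = 0 effect is the same either way:

```python
def mat4_mul_vec4(m, v):
    """Multiply a 4x4 row-major matrix by a 4-component column vector."""
    return [sum(m[i][j] * v[j] for j in range(4)) for i in range(4)]

# A transform that only translates by (5, 0, 0).
translate = [[1, 0, 0, 5],
             [0, 1, 0, 0],
             [0, 0, 1, 0],
             [0, 0, 0, 1]]

point     = [1, 2, 3, 1]   # w = 1: positions pick up the translation
direction = [1, 2, 3, 0]   # w = 0: directions ignore it

assert mat4_mul_vec4(translate, point)     == [6, 2, 3, 1]
assert mat4_mul_vec4(translate, direction) == [1, 2, 3, 0]
```

This is why pulling out the 3x3 submatrix and using it on a 3-component vector gives the same answer as the padded 4-component multiply, just cheaper.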
Re: [Algorithms] Correct Way to transform direction vector
From: Jason Zisk - 2000-11-01 03:14:45

```> Shouldn't this also work:
>
> Vector3 dirWorld3 = /* three component direction vector of the ray in world space */
> Vector4 dirWorld4 = { dirWorld3.x, dirWorld3.y, dirWorld3.z, 0 }
>
> Vector4 dirLocal4 = dirWorld4 * worldToLocal;
>
> (Where worldToLocal is the inverse of the localToWorld matrix, or, if localToWorld is orthogonal, the transpose of localToWorld, since the transpose will be equal to the inverse in that case.)

Yes, if you use the transpose of localToWorld. Just using the inverse doesn't seem to work, most likely because of the translation (like you said, just using the inverse 3x3 will solve this).

Transpose is a cheaper operation than inverting, and it allows the use of the full 4x4 transformation matrix without having to grab out the 3x3 part, so I'd think the transpose of localToWorld would be the best choice here.

- Jason Zisk
- nFusion Interactive LLC ```
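The "math to know why" behind the thread: with column vectors, if vertices go local to world by v' = M v, then a surface normal must be transformed by the inverse transpose of M to stay perpendicular under non-uniform scale; and going the other way, the inverse transpose of inverse(M) is just transpose(M), which is the result reported above. A sketch (Python; the example matrix and vectors are mine):

```python
def mat3_mul_vec3(m, v):
    """3x3 row-major matrix times a 3-component column vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def transpose(m):
    return [[m[j][i] for j in range(3)] for i in range(3)]

# Non-uniform scale: x stretched by 2. For this M, transpose(M) != inverse(M).
M     = [[2, 0, 0], [0, 1, 0], [0, 0, 1]]      # local -> world
M_inv = [[0.5, 0, 0], [0, 1, 0], [0, 0, 1]]    # world -> local

edge   = [1, 1, 0]    # a tangent direction lying in the surface
normal = [1, -1, 0]   # perpendicular to the edge in local space
assert dot(edge, normal) == 0

# Tangents transform with M; normals need the inverse transpose of M.
edge_world       = mat3_mul_vec3(M, edge)                   # [2, 1, 0]
normal_world_bad = mat3_mul_vec3(M, normal)                 # [2, -1, 0]
normal_world     = mat3_mul_vec3(transpose(M_inv), normal)  # [0.5, -1, 0]

assert dot(edge_world, normal_world_bad) != 0   # naive transform breaks it
assert dot(edge_world, normal_world) == 0       # inverse transpose preserves it
```

Which also explains Peter's distinction: a ray's direction transforms like the edge (with M or its inverse), while a normal, being part of a plane equation, transforms with the inverse transpose.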