Re: [Algorithms] decompose onto non-orthogonal vectors
From: <ro...@do...> - 2000-07-15 03:24:24
Discoe, Ben wrote:
>
> I thought it would be a simple problem, but all usual sources failed to
> answer, so perhaps it will be obvious to someone on this list.
>
> Given a point p and two unit vectors a and b like this:
>
>        b
>       /
>     v/----p
>     /    /
>    /    /
>   /    /
>  ---------a
>       u
>
> how do you get scalars (u,v) such that u*a + v*b = p?
> Ie. decompose p onto a and b.
>
> I promise I consulted an academic, a linear algebra textbook and the WWW
> (all failed) before resorting to the list :)

Sheesh, this is a problem that you ought to be able to solve after doing Chapter 1 in any linear algebra book. In fact, under the most natural interpretation of what you have stated, this is a problem in intermediate (high school) algebra.

First, this is essentially a 2D problem, because it has no solution unless p lies in the plane spanned by a and b. So I will suppose that it is posed as a 2D problem, and then consider what to do if it is posed as a 3- or higher-dimensional problem.

If you are "given" a, b and p, then you must be given their components with respect to SOME coordinate system (and it doesn't even have to be an orthonormal coordinate system). So we suppose you are given a = (a1, a2), b = (b1, b2) and p = (p1, p2), where a1, a2, b1, b2, p1, p2 are known scalars. (If you are not given this info, but just a diagram, then we are reduced to a high school geometry compass-and-straightedge solution, which is also fun, but I'll forbear.)

Now when you write out your vector equation ua + vb = p in components, it is actually a system of two linear equations:

  u a1 + v b1 = p1
  u a2 + v b2 = p2

(Note that I refuse to use * for multiplication in algebra, even in ASCII algebra.)

That is A SYSTEM OF TWO LINEAR EQUATIONS IN TWO UNKNOWNS u, v, which you ought to have learned to solve in intermediate algebra; it is probably the simplest of all problems in the subject that we call "linear algebra", no? Gaussian elimination, the general means of attacking all linear systems, works fine for this, but most people would remember the Cramer's rule solution in terms of determinants:

  u = (p1 b2 - p2 b1)/(a1 b2 - a2 b1)
  v = (a1 p2 - a2 p1)/(a1 b2 - a2 b1)

Voilà la solution! (Where I use the French in honor of July 14.)

If the denominator (a1 b2 - a2 b1) is zero, then a and b are linearly dependent (i.e. collinear), and the system has either no solution (if p is not also collinear with a and b) or infinitely many solutions (if p is collinear with a and b).

Sheesh. What kind of "academic" did you consult, a cooking teacher?

Now suppose that you are given this as a 3D problem, say a = (a1, a2, a3), b = (b1, b2, b3) and p = (p1, p2, p3). Then you have

  u a1 + v b1 = p1
  u a2 + v b2 = p2
  u a3 + v b3 = p3

a system of THREE linear equations in TWO unknowns. It may or may not have solutions; in fact, as stated above, it will have a solution only if p lies in the plane spanned by a and b, which requires that the three equations be linearly dependent. Cramer's rule does not apply here, but Gaussian elimination still does. As you apply your favorite Gaussian elimination algorithm, one of two things will happen: either (1) you will reduce one of the equations to the form 0 = 0, in which case it is irrelevant and you are left with a system of two equations in two unknowns, or (2) you will reduce one of the equations to the form 0 = 1, in which case there is no solution, i.e. p does NOT lie in the plane spanned by a and b.

No, I will not belabor the list with a review of Gaussian elimination algorithms.
You will find one in Chapter 1 of your favorite linear algebra book. Sheesh. You weren't trolling me, were you?
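P.S. For the archive, here is a minimal C sketch of the 2D formulas above. The function name decompose2d and the 1e-12 epsilon are my own arbitrary choices, not from any library; the early return covers the collinear case discussed above.

#include <math.h>

/* 2D case: find u, v with u*a + v*b = p, via the Cramer's rule
 * formulas above.  Returns 1 on success, 0 if a and b are
 * (nearly) collinear, i.e. the denominator a1 b2 - a2 b1 vanishes. */
static int decompose2d(double a1, double a2,
                       double b1, double b2,
                       double p1, double p2,
                       double *u, double *v)
{
    double det = a1 * b2 - a2 * b1;   /* Cramer's rule denominator */
    if (fabs(det) < 1e-12)            /* a, b linearly dependent */
        return 0;
    *u = (p1 * b2 - p2 * b1) / det;
    *v = (a1 * p2 - a2 * p1) / det;
    return 1;
}

Call it as decompose2d(a.x, a.y, b.x, b.y, p.x, p.y, &u, &v) with whatever vector struct you happen to use, and scale the epsilon to the precision of your data.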
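And a sketch of the 3D case. Rather than writing out a full Gaussian elimination, it does the equivalent thing by hand: solve the best-conditioned pair of the three equations with the same Cramer's rule formulas, then check whether the leftover equation also holds, which plays the role of the 0 = 0 versus 0 = 1 outcome described above. Again, the name and the tolerances are arbitrary.

#include <math.h>

/* 3D case: try to find u, v with u*a + v*b = p, where a, b, p are
 * 3-vectors.  Solve the pair of equations with the largest 2x2
 * determinant (for numerical safety), then check the remaining one.
 * Returns 1 if p lies in the plane spanned by a and b, 0 otherwise
 * (including the degenerate case where a and b are collinear). */
static int decompose3d(const double a[3], const double b[3],
                       const double p[3], double *u, double *v)
{
    int pairs[3][2] = { {0, 1}, {0, 2}, {1, 2} };
    int best = 0;
    double bestdet = 0.0;
    for (int k = 0; k < 3; ++k) {
        int i = pairs[k][0], j = pairs[k][1];
        double det = a[i] * b[j] - a[j] * b[i];
        if (fabs(det) > fabs(bestdet)) { bestdet = det; best = k; }
    }
    if (fabs(bestdet) < 1e-12)
        return 0;                     /* a and b collinear: no unique answer */

    int i = pairs[best][0], j = pairs[best][1];
    int k = 3 - i - j;                /* index of the equation we did not use */
    *u = (p[i] * b[j] - p[j] * b[i]) / bestdet;
    *v = (a[i] * p[j] - a[j] * p[i]) / bestdet;

    /* Consistency check: the unused equation must also hold, otherwise
     * p does NOT lie in the plane spanned by a and b. */
    double residual = (*u) * a[k] + (*v) * b[k] - p[k];
    return fabs(residual) < 1e-9;
}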