Subject: Linear Algebra: Linear Transformations and Matrices

"Two things are infinite, the universe and human stupidity; and I'm not sure about the universe." -- Albert Einstein

Hey, so I am going to introduce you all to some interesting concepts in Linear Algebra. In particular, I plan on discussing linear transformations, their matrix representations, and how you can use matrix multiplication to actually compute images of linear transformations!

There are going to be some preliminary definitions that we will have to get out of the way first.

Def. A vector space V over a field (for simplicity, just assume the field is the real numbers) is a set on which two operations (addition and scalar multiplication) are defined such that for all x,y in V, there is a unique element x + y in V, and for each real number c, cx is a unique element in V (so V is closed under addition and scalar multiplication). Also, these conditions must hold for all x,y,z in V and all real numbers a,b:

1. x + y = y + x
2. (x + y) + z = x + (y + z)
3. There is a zero vector 0 in V such that x + 0 = x
4. For each x in V there is an element -x in V such that x + (-x) = 0
5. 1x = x
6. (ab)x = a(bx)
7. a(x + y) = ax + ay
8. (a + b)x = ax + bx

Note: vectors are elements of vector spaces, and scalars are elements of the field (real numbers).

Example:
Take the two-dimensional coordinate system, R^2 (each vector is represented as a two-tuple).

(1,2) is a vector in R^2 and 1,2 are elements of R.

Addition is simple:
(x,y) + (u,v) = (x + u, y + v)

Scalar Multiplication is simple:
c(x,y) = (cx, cy)

You can verify the vector space conditions for yourself if you want.
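If you want to play with this, here's a quick sketch of R^2 in Python, with vectors as tuples (the helper names vec_add and scalar_mul are just mine, not from any library):

```python
# A minimal sketch of R^2 as a vector space, with vectors as two-tuples.

def vec_add(v, w):
    """Componentwise addition: (x, y) + (u, v) = (x + u, y + v)."""
    return (v[0] + w[0], v[1] + w[1])

def scalar_mul(c, v):
    """Scalar multiplication: c(x, y) = (cx, cy)."""
    return (c * v[0], c * v[1])

print(vec_add((1, 2), (3, 4)))   # (4, 6)
print(scalar_mul(3, (1, 2)))     # (3, 6)
```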

Def. A subset, S, of a vector space is linearly dependent if there exist a finite number of distinct elements in S (v_1, v_2, ..., v_n) and scalars (a_1, a_2, ..., a_n), not all zero, such that

a_1 v_1 + a_2 v_2 + ... + a_n v_n = 0

Note: If all a_i must be zero for the equality to hold, then S is linearly independent.

Example:
S = {(1,0),(0,1), (2,3)}

-2(1,0) + -3(0,1) + 1(2,3) = (0,0)

So S is linearly dependent.

If S = {(1,0), (0,1)}, then it would clearly be linearly independent.
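You can check the dependence relation above with a few lines of Python (linear_combination is a name I made up for this sketch):

```python
# Verifying the relation -2(1,0) + -3(0,1) + 1(2,3) = (0,0) in R^2.

def linear_combination(scalars, vectors):
    """Sum of a_i * v_i, componentwise, for vectors given as two-tuples."""
    total = (0, 0)
    for a, v in zip(scalars, vectors):
        total = (total[0] + a * v[0], total[1] + a * v[1])
    return total

S = [(1, 0), (0, 1), (2, 3)]
print(linear_combination([-2, -3, 1], S))  # (0, 0), so S is linearly dependent
```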

Def. A basis, B, (typically denoted by the Greek letter beta), for a vector space V, is a linearly independent subset of V such that each element of V can be written as a linear combination of elements in B.

Example:
B = {(1,0),(0,1)}

We know it is linearly independent and it should be clear that each element in R^2 can be represented by a linear combination of its elements.

Suppose we have (a,b)
But (a,b) = a(1,0) + b(0,1)
So B is a basis for R^2

Def. Let V and W be two vector spaces over R. A linear transformation T from V to W (denoted T: V -> W) is a function that has the following properties for all x,y in V and c in R:

1. T(x + y) = T(x) + T(y)
2. cT(x) = T(cx)

Examples:
Let's make our vector space, V, the set of all polynomials with degree at most 2 (addition and scalar multiplication should be clear).

Define T: V -> V
T(f(x)) = f''(x) + f'(x) + f(x)

So if f(x) = ax^2 + bx + c
T(f(x)) = 2a + (2ax + b) + (ax^2 + bx + c) = ax^2 + (2a + b)x + (2a + b + c)1

You can verify the linearity conditions if you want.
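Here's a little sketch of that transformation in Python, representing ax^2 + bx + c by its coefficient triple (a, b, c) (that encoding is my own choice for this example):

```python
# A sketch of T(f) = f'' + f' + f on polynomials ax^2 + bx + c,
# represented by the coefficient triple (a, b, c).

def T(coeffs):
    a, b, c = coeffs
    # f'' = 2a and f' = 2ax + b, so T(f) = ax^2 + (2a + b)x + (2a + b + c)
    return (a, 2 * a + b, 2 * a + b + c)

print(T((1, 0, 0)))  # T(x^2) = x^2 + 2x + 2, i.e. (1, 2, 2)
print(T((0, 1, 0)))  # T(x)   = x + 1,        i.e. (0, 1, 1)
```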

Now I am going to define something called the coordinate vector of an element with respect to a particular ordered basis. It is basically a column matrix whose ith entry is the ith coefficient of the linear combination of basis vectors that equals the element.

Example:
Let our V be the set of polynomials with degree at most 2 and B = {1,x,x^2}
u = 4x^2-3x+1

Then the coordinate vector of u with respect to B is

[ 1]
[-3]
[ 4]

Now I am going to talk about matrices. I am going to assume that everyone here has a basic understanding of what a matrix is and how to perform addition and scalar multiplication on them. I will also define multiplication between two matrices. Note that the set of all mxn matrices with entries from the reals forms a vector space.

Def. Let A be an mxn matrix, and B be an nxp matrix (Note: the number of columns of A and the number of rows in B must be equal for matrix multiplication). Then AB is the mxp matrix whose element in the ith row and jth column is

(AB)_ij = A_i1*B_1j + A_i2*B_2j + ... + A_in*B_nj

Note AB does not necessarily equal BA.

Example:

[1 2] [0 1]   [2 1]
[3 4] [1 0] = [4 3]

but multiplying in the other order,

[0 1] [1 2]   [3 4]
[1 0] [3 4] = [1 2]
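That definition translates directly into a short from-scratch Python function (mat_mul is just my name for it; matrices are lists of rows):

```python
# From-scratch matrix product: (AB)_ij = sum over k of A_ik * B_kj.

def mat_mul(A, B):
    m, n, p = len(A), len(B), len(B[0])
    assert len(A[0]) == n, "columns of A must equal rows of B"
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(p)]
            for i in range(m)]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
print(mat_mul(A, B))  # [[2, 1], [4, 3]]
print(mat_mul(B, A))  # [[3, 4], [1, 2]] -- note AB != BA
```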

Now I am going to define the matrix representation of a function with respect to a particular basis.

Suppose we have two vector spaces, V and W, with ordered bases B and Y respectively. Also suppose we have a linear transformation T: V -> W
The matrix representation of T with respect to B and Y is the matrix where the ith column corresponds to the ith vector in B, and the jth entry in that column is the coefficient of the jth vector of Y in the linear combination of basis vectors that equals the image of that vector in B!

Another way to think about it is that the ith column of it is the coordinate vector of the image of ith vector in B.

Example:
Consider the transformation T that we used in our previous example. Let B = {1, x, x^2} (this is clearly a basis for V). Also note that since T maps V into itself (i.e. T: V -> V), we don't need to find another basis for W.

Let's compute the image of each of vectors in B:

T(1) = 1
T(x) = 1 + x
T(x^2) = 2 + 2x + x^2

Now I am going to write each image as a linear combination of elements in B:

T(1) = 1(1) + 0(x) + 0(x^2)
T(x) = 1(1) + 1(x) + 0(x^2)
T(x^2) = 2(1) + 2(x) + 1(x^2)

So, for instance, the first element of the third column of the matrix representation of T would be 2, the second element would be 2, and the last element would be 1 (corresponding to the coefficients of T(x^2)).

Matrix representation of T with respect to B:

[1 1 2]
[0 1 2]
[0 0 1]
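If you like, you can build that matrix in Python by applying T to each basis vector and reading off the coordinate vectors as columns (the coefficient-list encoding here is my own choice: [c0, c1, c2] means c0 + c1*x + c2*x^2):

```python
# Building the matrix representation of T(f) = f'' + f' + f with respect to
# B = {1, x, x^2}. Polynomials are coefficient lists [c0, c1, c2].

def T(p):
    c0, c1, c2 = p
    # f' = c1 + 2*c2*x and f'' = 2*c2, so collect coefficients of 1, x, x^2:
    return [c0 + c1 + 2 * c2, c1 + 2 * c2, c2]

# Applying T to a basis polynomial gives its coordinate vector w.r.t. B.
basis = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
columns = [T(e) for e in basis]

# The coordinate vectors are the COLUMNS, so transpose to get the matrix rows.
matrix = [list(row) for row in zip(*columns)]
for row in matrix:
    print(row)
# [1, 1, 2]
# [0, 1, 2]
# [0, 0, 1]
```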

Yay, now I am going to do our first theorem:

Theorem 1:
Let V and W be vector spaces having ordered bases beta and gamma, respectively, and let T: V -> W be linear. Then for each u in V:

[T(u)]_gamma = [T]_beta^gamma [u]_beta

where [u]_beta is the coordinate vector of u with respect to beta, and [T]_beta^gamma is the matrix representation of T with respect to beta and gamma.

That is, the coordinate vector of the image of u with respect to gamma is equal to the matrix representation of T with respect to beta and gamma multiplied by the coordinate vector of u with respect to beta!

Isn't that cool? You can basically compute the image of any element of V just using matrix multiplication!

Examples:
Let V be the vector space of polynomials with degree at most 3 and W be the vector space of polynomials with degree at most 2. Let B = {1,x,x^2,x^3} be an ordered basis for V, and Y = {1,x,x^2} be an ordered basis for W.

Define T as follows:

T: V -> W
T(f(x)) = f'(x)

It is easily verified that T is linear. Let's compute the matrix representation like we did in the previous example.

T(1) = 0 = 0(1) + 0(x) + 0(x^2)
T(x) = 1 = 1(1) + 0(x) + 0(x^2)
T(x^2) = 2x = 0(1) + 2(x) + 0(x^2)
T(x^3) = 3x^2 = 0(1) + 0(x) + 3(x^2)

Matrix representation of T with respect to B and Y:

[0 1 0 0]
[0 0 2 0]
[0 0 0 3]

Now let's verify the theorem for a particular polynomial.

Let's say f(x) = 3x^3 + 2x^2 - x + 2
Clearly f'(x) = 9x^2 + 4x - 1

So the coordinate vector of f(x) with respect to B is

[ 2]
[-1]
[ 2]
[ 3]

And the coordinate vector of T(f(x)) = f'(x) with respect to Y is

[-1]
[ 4]
[ 9]

Okay so let's compute the matrix product of the matrix representation of T with the coordinate vector of f(x):

[0 1 0 0]   [ 2]   [-1]
[0 0 2 0] x [-1] = [ 4]
[0 0 0 3]   [ 2]   [ 9]
            [ 3]

From that you can work backwards and get f'(x) = 9x^2 + 4x - 1

It works! Isn't that so cool?
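You can check this whole calculation in a few lines of Python (mat_vec is my own helper name; the matrix and coordinate vector are the ones worked out above):

```python
# Verifying Theorem 1 for T(f) = f' with B = {1, x, x^2, x^3}, Y = {1, x, x^2}:
# multiplying the matrix representation by [f]_B should give [f']_Y.

def mat_vec(M, v):
    """Matrix-vector product, computed row by row."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

# Columns are the coordinate vectors of T(1), T(x), T(x^2), T(x^3) w.r.t. Y.
D = [[0, 1, 0, 0],
     [0, 0, 2, 0],
     [0, 0, 0, 3]]

f_B = [2, -1, 2, 3]      # f(x) = 3x^3 + 2x^2 - x + 2, coordinates w.r.t. B
print(mat_vec(D, f_B))   # [-1, 4, 9], i.e. f'(x) = 9x^2 + 4x - 1
```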

  • 12.07.2012 7:00 PM PDT

WALL OF SHAME-Posting stupidity since 2010
__________________
Posted by: Maximus Decimus
Agreed. All the changes are good and add variety. Do we really want another Halo 3? Just running around with one gun and no armor abilities? It's good that 343 wants to try something new. Do we really want the same thing for three more games?

We were just doing this for the past few days in my school.

  • 12.07.2012 7:15 PM PDT

Rain, and Jazz.
Halo: Tactical

I'm either a fool or an intelligent man, depending on how sleepy or angry I am.

I originally made an account on 07.27.2007 but I wanted to link my GT and made this account. Don't forget your passwords!

Perhaps we could get a group here on how to do various kinds of math, so we could always refer people back to a specific source...?

  • 12.07.2012 7:18 PM PDT
  • gamertag: [none]
  • user homepage:

***Aberrant Designs***

Finished the fight on September 26,2007, 10:49pm EST
Remembered Reach on September 15th, 2010 9:30pm EST

I used to be really good at this.

Now, I haven't done any for nearly 4 years and I remember nothing. Thanks for making me take such a valuable class, College!

  • 12.07.2012 7:19 PM PDT

The human element always mucks things up.

Carry the 2.

  • 12.07.2012 7:19 PM PDT
  • gamertag: Co M4N
  • user homepage:

I saved this thread. This could really be helpful in the future.

  • 12.07.2012 7:20 PM PDT
  • gamertag: [none]
  • user homepage:

13.72 billion years in the making.

On December 1st, 2012, I met Neil deGrasse Tyson. I shook the man's hand, and even made him laugh. Not much else to do with my life now.

Posted by: Arbiter 739
Perhaps we could get a group here on how to do various kinds of math, so we could always refer people back to a specific source...?

That's pretty much the Sec7s. A few mathematicians in there.

  • 12.07.2012 7:30 PM PDT