
# Linear Algebra Problem

Registered User regular
edited December 2008
Hey y'all, I'm having a little difficulty with this problem for my Linear Algebra class; maybe someone can shed some light on it.

Given: W is a subspace of R^n with an orthogonal basis {u(1), ..., u(p)}.

Also, W' is its orthogonal complement (perp W).

Show that dim(W) + dim(perp W) = n.

Now, I'm thinking this has to do with the Rank Theorem, where rank(A) + nul(A) = n (for a matrix A with n columns).

If you were to put the basis vectors of W into a matrix A as its rows, then rank(A) = dim(W).

However, I'm having a hard time connecting nul(A) with dim(perp W). Can anybody point me in the right direction? If clarification is needed, let me know; this post feels a little disjointed.
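As a quick numerical sanity check of the identity being asked about (a check, not a proof; the setup here is mine, using NumPy): put a generic basis of W into the rows of A, read off dim(W) as rank(A) and dim(Nul A) from the SVD, and see that they sum to n.

```python
# Numerical sanity check (not a proof) of dim(W) + dim(perp(W)) = n.
# Setup is mine: the rows of A are a generic (hence independent) basis of W.
import numpy as np

rng = np.random.default_rng(0)
n, p = 7, 3                          # ambient dimension and dim(W)
A = rng.standard_normal((p, n))      # rows span W

dim_W = int(np.linalg.matrix_rank(A))     # rank(A) = dim(W)
_, s, _ = np.linalg.svd(A)
nullity = n - int((s > 1e-10).sum())      # dim(Nul(A)), read off the singular values

print(dim_W + nullity)               # prints 7, i.e. n
```

Any seed and any sizes with p ≤ n give the same conclusion, which is exactly what the rank theorem predicts.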

Demerdar on

## Posts

• Registered User
edited November 2008
It seems to me as though you've got everything you need right there.

As I understand it, the Rank-Nullity theorem is the big piece you need.

Without spoiling the whole problem for you: I would review what perp(W) in fact means, and then what you can say about the vectors in W with respect to perp(W) as a result.

Does that help?

Fuzzywhale on
• Registered User regular
edited November 2008
Doesn't perp(W) contain all the vectors orthogonal to W?

Also, if I were to put the basis of W into A as columns, then perp(col A) = Nul(A^t).

Is it true that for orthogonal sets, dim(Nul A^t) = dim(Nul A)?

I think I'm just missing a piece of the puzzle. Maybe you can tell me where I'm going wrong?

Demerdar on
• Registered User
edited November 2008
Demerdar wrote: »

Is it true that for orthogonal sets, dim(Nul A^t) = dim(Nul A)?

Pretty sure this is correct. Here's why: you're right in saying that perp(W) is the set of all vectors orthogonal to W. So, if you like, you could calculate it via the following matrix equation:

Ax = 0, where the rows of A are the basis vectors of W.

So if you were to find all x, you'd find a basis for the vectors orthogonal to the rows of A. The set of all their linear combinations makes a nice space called perp(W)!

Also, solving Ax = 0 is just solving for the kernel of A.

Is that more useful?

Anyone else, please chime in if I've messed up.
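A small sketch of the Ax = 0 recipe described above (the setup and names are mine, not from the thread): the rows of Vt from the SVD beyond rank(A) give a basis of Nul(A), and each such vector is orthogonal to every row of A, i.e. to the basis of W.

```python
# Sketch of the Ax = 0 idea: compute a basis of Nul(A) and check its vectors
# are orthogonal to the rows of A (the basis vectors of W). Setup is mine.
import numpy as np

rng = np.random.default_rng(1)
p, n = 3, 6
A = rng.standard_normal((p, n))   # rows: basis vectors of W

_, s, Vt = np.linalg.svd(A)       # full SVD: Vt is n x n
rank = int((s > 1e-10).sum())
N = Vt[rank:]                     # rows: an (orthonormal) basis of Nul(A)

print(np.allclose(A @ N.T, 0))    # True: every null-space vector solves Ax = 0
print(N.shape[0])                 # 3, i.e. n - rank
```

The rows of N are exactly the "all x with Ax = 0" from the post, packaged as a basis.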

Fuzzywhale on
• Registered User regular
edited December 2008
Fuzzywhale wrote: »
Demerdar wrote: »

Is it true that for orthogonal sets, dim(Nul A^t) = dim(Nul A)?

Pretty sure this is correct. Here's why: you're right in saying that perp(W) is the set of all vectors orthogonal to W. So, if you like, you could calculate it via the following matrix equation:

Ax = 0, where the rows of A are the basis vectors of W.

So if you were to find all x, you'd find a basis for the vectors orthogonal to the rows of A. The set of all their linear combinations makes a nice space called perp(W)!

Also, solving Ax = 0 is just solving for the kernel of A.

Is that more useful?

Anyone else, please chime in if I've messed up.

So if you solve Ax = 0 with the basis of W as the rows of your matrix A, you're saying that Nul A forms a basis for perp W? In a theorem I'm reading, it says that the orthogonal complement of Row A is Nul A, and the orthogonal complement of Col A is Nul A^t.

Am I not seeing something here?

Demerdar on
• __BANNED USERS
edited December 2008
Out of curiosity, which textbook are you using?

Anyway: I'd do this by contradiction. Assume that dim(W) + dim(perp(W)) ≠ n.

This produces two cases: either dim(W) + dim(perp(W)) > n, or dim(W) + dim(perp(W)) < n.

Based on the fact that the ambient space R^n has a basis of n vectors, you should get a contradiction in either case pretty quickly.

Do you have access to the dimension theorem? If you do, there's a pretty slick proof there as well.
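One way the two cases can be fleshed out (my own sketch of the hint above, not a write-up from the thread):

```latex
\textbf{Case } \dim W + \dim W^{\perp} > n\text{:}
a vector orthogonal to itself must be zero, so $W \cap W^{\perp} = \{0\}$;
hence the union of a basis of $W$ and a basis of $W^{\perp}$ is linearly
independent, giving more than $n$ independent vectors in $\mathbb{R}^n$,
a contradiction.

\textbf{Case } \dim W + \dim W^{\perp} < n\text{:}
for any $v \in \mathbb{R}^n$, the remainder $v - \operatorname{proj}_W v$
is orthogonal to each basis vector $u_i$, so it lies in $W^{\perp}$ and
$W + W^{\perp} = \mathbb{R}^n$; then
$n = \dim(W + W^{\perp}) \le \dim W + \dim W^{\perp} < n$,
a contradiction.
```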

Little Jim on
• Registered User
edited December 2008
"Nul A forms a basis for perp W"

Yup — more precisely, Nul A *is* perp W, so a basis of Nul A is a basis of perp W. I ran this by another student in my office and he agreed, for what it's worth. As I said, solving Ax = 0 finds all the vectors perpendicular to the vectors that make up the matrix A. Since the rows of A are basis vectors of W, we are finding a basis for the set of vectors perpendicular to those in A.

So that's a basis for the orthogonal complement, since we are doing our operations with a basis.

Also, Ax = 0 is solving for the kernel, so dim(Ker(A)) (the nullity) should equal dim(perp(W)).

Then by the Rank-Nullity Theorem, dim(Im(A)), which equals dim(W), plus dim(Ker(A)), which equals dim(perp(W)), have to add up to n.

Remember, the matrix A is formed from the basis vectors of W, the subspace you started with.

Sound good?
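The assembled argument can also be checked numerically (my own NumPy setup, a sanity check rather than a proof): take an orthonormal basis of W as the rows of A, read off dim(W) as rank(A) and a basis of perp(W) from the SVD, and confirm the dimensions sum to n.

```python
# Numerical check of the argument above (setup mine, not a proof):
# rows of A = an orthonormal basis of W, Nul(A) = perp(W),
# and rank(A) + nullity(A) = n.
import numpy as np

rng = np.random.default_rng(2)
n, p = 5, 2
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = Q[:, :p].T                            # rows: an orthonormal basis of W (p x n)

rank = int(np.linalg.matrix_rank(A))      # = dim(W)
_, s, Vt = np.linalg.svd(A)               # full SVD: Vt is n x n
N = Vt[int((s > 1e-10).sum()):]           # rows: basis of Nul(A) = perp(W)

print(rank, N.shape[0], rank + N.shape[0] == n)   # prints: 2 3 True
```

Since the problem statement gives an orthogonal basis {u(1), ..., u(p)}, the orthonormal rows here match that setup after normalizing.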

Fuzzywhale on
• Registered User regular
edited December 2008
I understand now: because the rows of A are a basis of W, solving the null-space equation Ax = 0 gives a basis for perp W. Thanks, y'all!

Demerdar on