
linalg - product of "map matrix" and "vector matrix" #7

Open

sh3rlock14 opened this issue Mar 18, 2021 · 1 comment

@sh3rlock14
[image: screenshot of the slide]

Slides "03-linalg", 71/72

I'm not getting why the column matrix storing the linear-combination coefficients is denoted Tv_j, since we are not expressing the basis vectors of W in the formula.

Also, in the lecture it is said that "what we get is the matrix representation of Tu" (Tv was probably meant), but again: if that were so, then shouldn't it be T_{1,j} w_1 + ... + T_{m,j} w_m?

Thanks in advance for the help!

@erodola
Owner

erodola commented Mar 24, 2021

I went through the slides and the video recording to double-check; let me now dispel your doubts:

  • The column matrix with the entries T_{1,j}, ..., T_{m,j} is by definition a representation with respect to the basis w_1, ..., w_m; in other words, the j-th column encodes Tv_j = T_{1,j} w_1 + ... + T_{m,j} w_m, exactly as you wrote. Recall that every time we write numbers in a matrix, we do so after fixing a basis in which those numbers make sense. Since the right-hand side of the equation expresses a vector in the target vector space W, the basis underlying any matrix representation on the right-hand side is w_1, ..., w_m.

  • Why do we call it Tv_j? Because that is precisely what it encodes. Try the following thought experiment with the matrix equation of slide 71/72 (see the NumPy sketch after this list). First, replace the vector c with an indicator vector (0 0 .. 1 .. 0), with the 1 in position j: the matrix-vector product then yields precisely the j-th column of T, which is Tv_j expressed in the basis w_1, ..., w_m. If this works for you, replace the indicator vector with a weighted indicator vector (0 0 .. c_j .. 0), where c_j is a scalar in position j: the product now yields the j-th column of T, weighted by c_j. If this also works for you, replace the weighted indicator vector with an entire vector of weights (c_1 c_2 .. c_j .. c_n): the matrix-vector product is then the weighted sum of the columns of T, which is exactly what is written in the slide.

  • "what we get is the matrix representation of Tu": indeed, I meant Tv instead of Tu.

I hope this clarifies things. If not, feel free to post your doubts here!

Addendum: this should also clarify our earlier statement that a matrix representation only needs to know how to map the basis vectors, i.e. the Tv_j: for an arbitrary coefficient vector c, we only need to take a linear combination of the Tv_j with the coefficients contained in c.
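
Spelled out with the same notation as above, the linearity step behind this is:

```latex
T v = T\Big(\sum_{j=1}^{n} c_j\, v_j\Big) = \sum_{j=1}^{n} c_j\, T v_j
```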
