I don't get why the column matrix storing the linear-combination coefficients is denoted `T v_j`, since we are not expressing the basis vectors of W in the formula.
Also, in the lecture it is said that "what we get is the matrix representation of Tu" (probably `Tv` was meant), but again: if that were so, shouldn't it be `T_{1,j} w_1 + ... + T_{m,j} w_m`?
Thanks in advance for the help!
I went through the slides and the video recording to double-check; let me now dispel your doubts:
The column matrix with the entries `T_{1,j}, ..., T_{m,j}` is by definition a representation with respect to the basis `w_1, ..., w_m`. Recall that every time we write numbers in a matrix, we do so after fixing a basis in which those numbers make sense. Since the right-hand side of the equation expresses a vector in the target vector space W, the basis underlying any matrix representation on the right-hand side is `w_1, ..., w_m`.
Why do we call it `T v_j`? Because that's precisely what it encodes. Try the following thought experiment on the matrix equation of slide 71/72:

1. Replace the `c` vector with an indicator vector `(0 0 .. 1 .. 0)`, where the 1 is in position j. The matrix-vector product then yields precisely the j-th column of T, which is `T v_j` expressed in the basis `w_1, ..., w_m`.
2. If this works for you, replace the indicator vector with a weighted indicator vector `(0 0 .. c_j .. 0)`, where `c_j` is a scalar value in position j. The matrix-vector product then yields the j-th column of T, weighted by the scalar `c_j`.
3. If this also works for you, replace the weighted indicator vector with an entire vector of weights `(c_1 c_2 .. c_j .. c_n)`; the matrix-vector product in this case is the weighted sum of the columns of T, which is what is written in the slide.
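The three steps of the thought experiment can be checked numerically. Here is a minimal NumPy sketch; the matrix `T` and the coefficients are arbitrary illustrative values, not the ones from the slides:

```python
import numpy as np

# An arbitrary matrix representation T of a linear map from a 3-dimensional
# space V to a 2-dimensional space W; its columns are written in the basis
# w_1, w_2 of the target space.
T = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

# 1) Indicator vector: picks out the j-th column of T (here j = 2).
e2 = np.array([0.0, 1.0, 0.0])
print(T @ e2)          # -> [2. 5.], the 2nd column of T, i.e. T v_2

# 2) Weighted indicator: the j-th column scaled by c_j.
print(T @ (7.0 * e2))  # -> [14. 35.], i.e. 7 * (T v_2)

# 3) Full coefficient vector: a weighted sum of the columns of T.
c = np.array([1.0, 7.0, 2.0])
print(T @ c)                                          # -> [21. 51.]
print(1.0 * T[:, 0] + 7.0 * T[:, 1] + 2.0 * T[:, 2])  # same result
```

The last two lines make the point explicit: the matrix-vector product and the weighted sum of columns are the same computation.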
"what we get is the matrix representation of Tu": indeed, I meant Tv instead of Tu.
I hope this clarifies things. If not, feel free to raise your doubts here!
Addendum: this should also clarify our earlier statement that a matrix representation only needs to know how to map the basis vectors (i.e., the `T v_j`), since for an arbitrary coefficient vector `c` we only need to take a linear combination of the `T v_j` with the coefficients contained in `c`.
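A short sketch of the addendum, again with made-up numbers: given only the images `T v_j` of the basis vectors, the image of any vector follows by linearity.

```python
import numpy as np

# The images T v_1, T v_2, T v_3 of the basis vectors of V, written in the
# target basis w_1, w_2 (illustrative values).
Tv = [np.array([1.0, 4.0]),
      np.array([2.0, 5.0]),
      np.array([3.0, 6.0])]

# Coefficients of an arbitrary vector v = c_1 v_1 + c_2 v_2 + c_3 v_3.
c = np.array([2.0, -1.0, 0.5])

# By linearity, T v = c_1 (T v_1) + c_2 (T v_2) + c_3 (T v_3).
Tv_of_v = sum(cj * Tvj for cj, Tvj in zip(c, Tv))
print(Tv_of_v)  # -> [1.5 6. ]

# Stacking the T v_j as columns gives the matrix representation of T, and
# the matrix-vector product reproduces the same result.
T = np.column_stack(Tv)
print(T @ c)    # -> [1.5 6. ]
```

So knowing the `T v_j` is exactly as much information as knowing the full matrix: the columns of the matrix *are* the `T v_j`.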
Slides "03-linalg", 71/72