
Say I have a vector v that's a member of Rn, so it's got n components: v1, v2, all the way down to vn. I've touched on this idea before, but now that we've seen what a transpose is, and we've taken transposes of matrices, there's no reason we can't take the transpose of a vector, or a column vector in this case. So what would v transpose look like? Well, if you think of this as an n by 1 matrix, which it is, it has n rows and one column. Then what are we going to get? We're going to get a 1 by n matrix when we take the transpose, and this one column is going to turn into the one row. So v transpose is going to be equal to v1, v2, all the way to vn, written as a row.
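As a quick sanity check, here is a small NumPy sketch of that shape change; the particular vector is just an arbitrary example:

```python
import numpy as np

# A column vector v in R^3, written as a 3-by-1 matrix.
v = np.array([[1], [2], [3]])
print(v.shape)    # (3, 1): n rows, one column

# Its transpose is a 1-by-n matrix: the column becomes the row.
print(v.T.shape)  # (1, 3)
print(v.T)        # the same components, laid out as a row
```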
And you might remember, we've already touched on this with a lot of matrices before. Say we have some matrix A. We called the row vectors of that matrix the transposes of some column vectors: a1 transpose, a2 transpose, all the way down to an transpose. In fact, not so many videos ago I had those row vectors, and I could have just called them the transposes of column vectors, just like that. In some ways that would have been a better way to do it, because we've defined all of these operations around column vectors, so you could always refer to the rows as transposes of column vectors and then do operations on them. But anyway, I don't want to get too diverted.
But let's think a little bit about what happens when you take some operation of this vector with some other vectors. So let's say I have another vector w that's also a member of Rn, with components w1, w2, all the way down to wn. There are a couple of things we're already reasonably familiar with. You could take the dot product of v and w: v dot w is equal to v1 times w1, plus v2 times w2, and you just keep going all the way to vn times wn. This is the definition of the dot product of two column vectors.
Now, how can we relate that to the transpose of v? Well, let's do a matrix multiplication: take v transpose, which is v1, v2, all the way to vn as a row, and multiply it by w, which is w1, w2, all the way down to wn as a column. If I view these as just matrices, is this matrix-matrix product well-defined? The first one is a 1 by n matrix: one row and n columns. The second is an n by 1 matrix: n rows and only one column. So this is well-defined; I have the same number of columns in the first as I have rows in the second, and the result is going to be a 1 by 1 matrix. And what's it going to look like? Its single entry is going to be v1 times w1, plus v2 times w2, all the way to vn times wn. It'll just be a 1 by 1 matrix that looks like that.
that looks like that. But you might notice that
these two things are equivalent. So we can make the statement
that v dot w, which is the same thing as w dot v, these
things are equivalent to-- v dot w is the equivalent of-- let
me just write it once over here-- v dot w is the same thing
as the transpose of v, v transpose times w as just
a matrix-matrix product. So if you view v as a matrix,
take its transpose and then just take that matrix and take
the product of that with w, it's the same thing
as v dot w. So that's an interesting
take-away. I guess you could argue somewhat
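That equivalence is easy to check numerically. A short NumPy sketch, with arbitrary example vectors:

```python
import numpy as np

v = np.array([[1], [2], [3]])   # v as an n-by-1 column vector
w = np.array([[4], [5], [6]])   # w as an n-by-1 column vector

# v^T w is a 1-by-n times n-by-1 product: a 1-by-1 matrix.
vt_w = v.T @ w
print(vt_w)                       # [[32]]

# The ordinary dot product of the vectors gives the same number.
print(v.flatten() @ w.flatten())  # 32
```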
I guess you could argue it's somewhat obvious, and we've already been referring to it: when I defined matrix-matrix products, I said you're taking the dot product of each row with each column, and you can see that it's really the dot product of the transpose of that row with each column. But you get the general idea. Now let's see if we can build on this a little bit. Let me save our little outcome that we have there, and let's say I have some matrix A, and it's an m by n matrix.
Now, if I were to multiply it by some vector x, where x is a member of Rn, then x has n elements; another way to view it is as an n by 1 matrix. When I take this product, what am I going to get? Or another way to say it is, what is the vector Ax? I'm just going to get another vector, and it's going to be an m by 1 vector, so we can say that Ax is a member of Rm. It's going to have m elements. If you said that Ax is equal to some vector z, then z would have m elements: z1, z2, all the way down to zm. And I know that because A is m by n and x is n by 1, so the resulting product is m by 1. It'll be a vector that is a member of Rm, with exactly m elements.
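The shape bookkeeping can be sketched in NumPy; the particular matrix and vector here are arbitrary, chosen with m = 4 and n = 3:

```python
import numpy as np

m, n = 4, 3
A = np.arange(m * n).reshape(m, n)  # an arbitrary 4-by-3 matrix
x = np.array([[1], [0], [2]])       # x as an n-by-1 column vector

# (m by n) times (n by 1) gives an m-by-1 result: a member of R^m.
z = A @ x
print(z.shape)  # (4, 1)
```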
Now, if that's a vector in Rm, then the idea of dotting it with another member of Rm is well-defined. So let's say I have another vector y that's also a member of Rm. The vector Ax, the vector you get when you take this product, has m elements, and y has m elements, so the idea of taking their dot product is well-defined. You could take Ax, which is a vector, dot it with y, and get a number: you just multiply the corresponding terms and add them all up. But what is this equal to?
We can just use this little result, I guess you could call it, that we got earlier in this video: the dot product of two vectors is equal to the transpose of the first vector, viewed as a matrix, times the second. So you can view Ax dot y as (Ax) transpose times y. Ax is m by 1 and y is m by 1, so (Ax) transpose is a 1 by m matrix, and we can multiply a 1 by m matrix times y, just like that. Now, what is this thing equal to? We saw a few videos ago that if we take the product of two matrices and then take its transpose, that's equal to the reverse product of the transposes: you switch the order and then take the transposes. So this is going to be equal to x transpose times A transpose times y.
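That transpose-reversal rule, (AB) transpose equals B transpose times A transpose, can be sanity-checked with a small NumPy example; the matrix and vector are arbitrary:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])      # a 2-by-3 matrix
x = np.array([[1], [0], [2]])  # a 3-by-1 column vector

# Transpose of a product reverses the order: (Ax)^T = x^T A^T.
left = (A @ x).T   # 1-by-2
right = x.T @ A.T  # also 1-by-2
print(np.array_equal(left, right))  # True
```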
And these are just matrix products; they aren't necessarily vector operations. We're treating all of these vectors as matrices, and of course we're treating the matrix as a matrix. So what is this equal to? Well, we know that matrix products are associative. We have one set of parentheses right now, but we could just take another association: we could say this is equal to x transpose times the product of these two matrices with each other, that is, x transpose times (A transpose y). A transpose y is a vector, but you can represent it as a matrix, just like that.
Now let's think about what A transpose y is. We have here that A is m by n, so what is A transpose? A transpose is going to be n by m. And the vector y is m by 1. So when you take this product, you're going to get an n by 1 matrix, or you could imagine it as a vector that is a member of Rn. So the product A transpose y results in a vector that's a member of Rn. And of course the whole expression is well-defined, because x transpose is a 1 by n matrix right there.
Now we can go back to our identity. We have the transpose of some vector times some other vector, and the dimensions match: the first has as many horizontal entries as the second has vertical entries. So what is this equal to? We just use that identity: instead of x transpose, we'll just have x, so we un-transpose it, I guess you could view it that way, and this is equal to x dot (A transpose y). Which is a pretty neat outcome: we got (Ax) dot y being equal to x dot (A transpose y). We can kind of change the associativity, although we have to change the order a bit and take the transpose of our matrix.
So let me rewrite that just so you can remember the outcomes. The two big outcomes of this video are these. First, v dot w is equal to the matrix product of v transpose times w. Second, assuming all of the matrix-vector products and dot products are well-defined, (Ax) dot y, for some other vector y, is equivalent to x dot (A transpose y); you're essentially moving the A over to the other vector as its transpose. And this just might be a useful outcome, or a useful result, that we can build upon later in the linear algebra playlist.
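Both outcomes can be verified together in a short NumPy sketch; A, x, and y below are arbitrary examples with m = 2 and n = 3:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])   # m-by-n with m = 2, n = 3
x = np.array([1, 0, 2])     # a member of R^n
y = np.array([3, 1])        # a member of R^m

# Second identity: (Ax) . y  ==  x . (A^T y)
lhs = (A @ x) @ y
rhs = x @ (A.T @ y)
print(lhs, rhs)  # 37 37
```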