Video transcript (in English)
- [Voiceover] So let's say you have yourselves some kind of multivariable function, and this time it's got some very high dimensional inputs. So x1, x2, on and on and on, up to x sub n, for some large number n. In the last couple videos I told you about the Laplacian operator, which is a way of taking in your scalar-valued function f, and it gives you a new scalar-valued function that's kind of like a second derivative thing, because it takes the divergence of the gradient of your function f. So the gradient of f gives you a vector field, and the divergence of that gives you a scalar field. And what I want to show you here is another formula that you might commonly see for this Laplacian. So first let's kind of abstractly write out what the gradient of f will look like. We start by taking this del operator, which is going to be a vector full of partial differential operators: partial with respect to x1, partial with respect to x2, and on and on, up to partial with respect to that last input variable. And then you kind of just imagine multiplying it by your function, so what you end up getting is all the different partial derivatives of f: the partial of f with respect to the first variable, and then on and on, up until you get the partial derivative of f with respect to that last variable, x sub n. For the divergence of that, and just to save myself some writing, I'm gonna say you take that nabla operator and imagine taking the dot product between that whole operator and this gradient vector that you have here. What you end up getting is, well, you start by multiplying the first components, which involves taking the partial derivative, with respect to x1, of the partial derivative of f with respect to that same variable. So it looks like the second partial derivative of f with respect to x1, that first variable.
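What the voiceover is writing out on screen can be summarized in standard notation (this is the usual way the gradient and its divergence are written, not a verbatim copy of the board):

```latex
\nabla f = \left( \frac{\partial f}{\partial x_1},\; \frac{\partial f}{\partial x_2},\; \dots,\; \frac{\partial f}{\partial x_n} \right),
\qquad
\nabla \cdot (\nabla f) = \frac{\partial^2 f}{\partial x_1^2} + \frac{\partial^2 f}{\partial x_2^2} + \dots + \frac{\partial^2 f}{\partial x_n^2}.
```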
Then you imagine kind of adding what the product of these next two items will be, and for very similar reasons that's gonna look like the second partial derivative of f with respect to that second variable, partial x2 squared. And you do this to all of them, adding them all up, until you find yourself doing it to the last one. So you've got plus a whole bunch of things, and you'll be taking the second partial derivative of f with respect to that last variable, partial x sub n. This is another format in which you might see the Laplacian, and oftentimes it's written kind of compactly, so people will say the Laplacian of your function f is equal to, using sigma notation, the sum from i equals 1, you know, 1, 2, 3, up to n, of your second partial derivatives: partial squared of f with respect to that i-th variable. So if you were thinking in terms of three variables, often instead of x1, x2, x3 we write x, y, z, but it's common to more generally just say x sub i. So this here is kind of the alternate formula that you might see for the Laplacian. Personally, I always like to think about it as taking the divergence of the gradient of f, because you're thinking about the gradient field, and the divergence of that kind of corresponds to maxima and minima of your original function, which is what I talked about in the initial intuition-of-the-Laplacian video. But this formula is probably a little more straightforward when it comes to actual computations and, oh wait, sorry, I forgot a square there, didn't I? Partial x squared, so this is the second derivative. So we're summing all these second partial derivatives. And you can probably see this is kind of a more straightforward way to compute a given example that you might come across, and it also makes it clearer how the Laplacian is kind of an extension of the idea of a second derivative. See you next video.
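The sum-of-second-partials formula from the video is easy to check numerically: approximate each second partial with a central finite difference and add them up. This is a minimal sketch; the test function f(x, y) = x² + y² and the step size h are illustrative choices, not from the video. For this f, each second partial is 2, so the Laplacian should come out near 4 everywhere.

```python
import numpy as np

def f(p):
    # Illustrative test function: f(x, y) = x^2 + y^2.
    return p[0] ** 2 + p[1] ** 2

def laplacian(func, point, h=1e-4):
    """Approximate the Laplacian as the sum of second partial derivatives,
    one per input variable, using central finite differences."""
    point = np.asarray(point, dtype=float)
    total = 0.0
    for i in range(len(point)):
        step = np.zeros_like(point)
        step[i] = h
        # Central difference for the second partial in direction i:
        # (f(x + h e_i) - 2 f(x) + f(x - h e_i)) / h^2
        total += (func(point + step) - 2 * func(point) + func(point - step)) / h ** 2
    return total

print(laplacian(f, [1.0, 2.0]))  # analytically: 2 + 2 = 4
```

Because the loop runs over every input variable, the same function works unchanged for high-dimensional inputs x1 through x sub n, which is exactly the point of the sigma-notation form.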