This article is rated C-class on Wikipedia's content assessment scale. It is of interest to the following WikiProjects:
It says in the article that in higher dimensions you need to look at the eigenvalues. However, as is clearly explained in the article on positive definite matrix, it is possible to simply evaluate the leading principal minors of the matrix (i.e., the determinants of the upper-left square submatrices of sizes 1, 2, 3, ..., n). If they are all positive, the Hessian is positive definite and we have a minimum. If the first one is negative and they alternate in sign, then the Hessian is negative definite and we have a maximum. This method is much simpler than calculating the eigenvalues, as it does not require us to solve any non-linear polynomial equation. 140.180.55.46 ( talk) 04:46, 16 November 2011 (UTC)
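A minimal sketch of the criterion described above (not from the article itself): classify a symmetric Hessian by its leading principal minors rather than its eigenvalues. The example matrix is hypothetical, chosen only to illustrate the test.

```python
import numpy as np

def classify_by_minors(H):
    """Classify a symmetric Hessian via its leading principal minors."""
    n = H.shape[0]
    # Determinants of the upper-left k x k submatrices, k = 1..n
    minors = [np.linalg.det(H[:k, :k]) for k in range(1, n + 1)]
    if all(m > 0 for m in minors):
        return "positive definite (local minimum)"
    # Negative definite: minors alternate in sign starting negative
    if all(m * (-1) ** k > 0 for k, m in enumerate(minors, start=1)):
        return "negative definite (local maximum)"
    return "indefinite or degenerate (test inconclusive)"

H = np.array([[2.0, 1.0], [1.0, 3.0]])  # hypothetical Hessian at a critical point
print(classify_by_minors(H))  # minors are 2 and 5, so: positive definite
```

Note this only classifies strictly definite cases; when some minor is zero the test is inconclusive, just as with the eigenvalue version.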
It's never mentioned in the article that this can be used as a critical step in Newton's method (among other numerical methods of unconstrained optimization). Further, it is important that Newton's method not converge to a saddle point when this is not desired. To ensure that the method converges to a local minimum or maximum, the Hessian matrix is often shifted sufficiently to pass the second partial derivative test as desired; unfortunately, there is nothing in this article about how to do that. It's not as simple as H = H + k * (diagonal or identity matrix), is it? Regardless, there should be some discussion of this. — Preceding unsigned comment added by 90.215.230.136 ( talk) 00:01, 11 August 2011 (UTC)
"don't overdo it"? lame... —Preceding unsigned comment added by 82.81.248.49 ( talk) 19:15, 3 November 2007 (UTC)
I wonder what geometric interpretation the determinant has in this case. Does it even have one? Suppose we have a function for which M > 0 and fxx < 0 for all (x,y); can there then exist only one maximum? Or can one conclude this on the basis of M > 0 alone?
Perhaps this should be further explained in the article!?
Sorry for the bad English! -- 83.72.7.63 ( talk) 14:19, 14 January 2008 (UTC)
Added a quick interpretation of it. Please help improve if it needs more clarification! 65.9.171.12 ( talk) 03:51, 11 October 2008 (UTC)
Yes, it has one. The determinant of the Hessian at a particular point is the same as the Gaussian curvature of the graph at that point. If it is positively curved (locally spherical), there are two cases to consider: concave up or concave down (fxx>0 or fxx<0). If it is negative, there is only one such surface, a saddle. — Preceding unsigned comment added by 66.215.204.74 ( talk) 06:31, 6 March 2012 (UTC)
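A quick symbolic check of this claim at a critical point, where the full curvature formula K = (fxx*fyy - fxy^2)/(1 + fx^2 + fy^2)^2 reduces to det(Hessian) because the gradient vanishes. The saddle f = x^2 - y^2 is just an illustrative choice.

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 - y**2  # saddle with a critical point at the origin

fx, fy = sp.diff(f, x), sp.diff(f, y)
fxx, fyy, fxy = sp.diff(f, x, 2), sp.diff(f, y, 2), sp.diff(fx, y)

# Gaussian curvature of the graph z = f(x, y)
K = (fxx * fyy - fxy**2) / (1 + fx**2 + fy**2)**2
detH = fxx * fyy - fxy**2

print(K.subs({x: 0, y: 0}), detH.subs({x: 0, y: 0}))  # both equal -4
```

Away from critical points the two differ by the (1 + fx^2 + fy^2)^2 factor, so the identification only holds where the second derivative test is actually applied.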
In the geometric intuition example, what does it mean when it says "between the x and y axis"? How can you be between axes? 71.33.4.235 ( talk) 01:20, 28 April 2013 (UTC)
The example of finding the critical points of has an error.
Thus the point is not a critical point, while is.
Including a plot of would really aid understanding of the example.
Adamace123 ( talk) 14:14, 14 October 2008 (UTC)
Hessian_matrix#Second_derivative_test discusses the same thing as this article and even does it better. This page and that section should be somehow merged, I think. - X7q ( talk) 08:21, 15 May 2009 (UTC)
Please explain what M stands for in the geometric interpretation explanation of a 2-D Hessian. — Preceding unsigned comment added by Shreya.bits ( talk • contribs) 04:36, 21 May 2013 (UTC)
In the section "Geometric interpretation in the two-variable case" a figure is desperately needed for the case where D<0 and both fxx and fyy have the same sign. This is the most interesting case and the most difficult to understand. A figure would really help here. - MATThematical ( talk) 09:36, 18 October 2015 (UTC)