Rational Function Optimization
While standard Newton-Raphson is based on the optimization of a quadratic model, replacing this quadratic model by a rational function approximation yields the RFO method [130,131].^{2.6}

\[
\varepsilon(\Delta\mathbf{q})=
\frac{\mathbf{g}^{T}\Delta\mathbf{q}+\frac{1}{2}\,\Delta\mathbf{q}^{T}\mathbf{H}\,\Delta\mathbf{q}}
     {1+\Delta\mathbf{q}^{T}\mathbf{S}\,\Delta\mathbf{q}}
=\frac{\frac{1}{2}
\begin{pmatrix}\Delta\mathbf{q}^{T}&1\end{pmatrix}
\begin{pmatrix}\mathbf{H}&\mathbf{g}\\ \mathbf{g}^{T}&0\end{pmatrix}
\begin{pmatrix}\Delta\mathbf{q}\\ 1\end{pmatrix}}
{\begin{pmatrix}\Delta\mathbf{q}^{T}&1\end{pmatrix}
\begin{pmatrix}\mathbf{S}&\mathbf{0}\\ \mathbf{0}^{T}&1\end{pmatrix}
\begin{pmatrix}\Delta\mathbf{q}\\ 1\end{pmatrix}}
\qquad (2.76)
\]
The numerator in equation 2.76 is the quadratic model of equation 2.74.
The matrix in this numerator is the so-called Augmented Hessian (AH); $\mathbf{H}$ is the
Hessian (analytic or approximated). The matrix $\mathbf{S}$ is a symmetric matrix that has to
be specified, but it is normally taken as the unit matrix $\mathbf{I}$.
The solution of the RFO equation, that is, the displacement vector $\Delta\mathbf{q}$
that extremalizes $\varepsilon(\Delta\mathbf{q})$
(i.e. $\mathrm{d}\varepsilon/\mathrm{d}\Delta\mathbf{q}=\mathbf{0}$),
is obtained by diagonalization of the Augmented Hessian matrix,
solving, with $\mathbf{S}=\mathbf{I}$, the $(n+1)$-dimensional eigenvalue equation 2.77

\[
\begin{pmatrix}\mathbf{H}&\mathbf{g}\\ \mathbf{g}^{T}&0\end{pmatrix}
\begin{pmatrix}\mathbf{v}_{\nu}\\ t_{\nu}\end{pmatrix}
=\lambda_{\nu}
\begin{pmatrix}\mathbf{v}_{\nu}\\ t_{\nu}\end{pmatrix},
\qquad \nu=1,\dots,n+1
\qquad (2.77)
\]
and then the displacement vector $\Delta\mathbf{q}_{k}$
for the $k$-th step is evaluated as

\[
\Delta\mathbf{q}_{k}=\frac{\mathbf{v}_{\nu}}{t_{\nu}}
\qquad (2.78)
\]
where

\[
\begin{pmatrix}\mathbf{v}_{\nu}\\ t_{\nu}\end{pmatrix}
\;\text{is the eigenvector of equation 2.77 associated with}\;\lambda_{\nu},
\qquad
\mathbf{v}_{\nu}^{T}\mathbf{v}_{\nu}+t_{\nu}^{2}=1
\qquad (2.79)
\]
In equation 2.79, if one is interested in locating a minimum then $\nu=1$,
and for a transition structure $\nu=2$. As the optimization process converges,
$t_{\nu}$ tends to 1 and $\lambda_{\nu}$ tends to 0.
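As a numerical illustration (not part of the original text), a single RFO step can be sketched in Python with NumPy: the gradient and Hessian are assembled into the Augmented Hessian, which is diagonalized, and the displacement is formed from the selected eigenvector. The function name `rfo_step` and the model quadratic surface are invented for this sketch; the index `nu` is 0-based, so `nu = 0` targets a minimum and `nu = 1` a transition structure (the $\nu = 1, 2$ of the text).

```python
import numpy as np

def rfo_step(grad, hess, nu=0):
    """One RFO step: diagonalize the (n+1)-dimensional Augmented
    Hessian [[H, g], [g^T, 0]] and take the displacement v/t from
    the nu-th eigenvector (v, t). nu=0 seeks a minimum, nu=1 a
    first-order saddle point."""
    n = grad.size
    # Build the Augmented Hessian (S taken as the unit matrix)
    ah = np.zeros((n + 1, n + 1))
    ah[:n, :n] = hess
    ah[:n, n] = grad
    ah[n, :n] = grad
    # eigh returns eigenvalues in ascending order, eigenvectors as columns
    lam, vec = np.linalg.eigh(ah)
    v, t = vec[:n, nu], vec[n, nu]
    # Scale the eigenvector so its last component equals 1 (eq. 2.78);
    # the ratio v/t is invariant to the overall sign of the eigenvector
    return v / t, lam[nu]

# Model surface E = x^2 + 2*y^2; gradient and Hessian at the point (1, 1)
hess = np.diag([2.0, 4.0])
grad = np.array([2.0, 4.0])
step, lam = rfo_step(grad, hess)
new_point = np.array([1.0, 1.0]) + step
```

For this convex model the lowest eigenvalue is negative and the step points downhill toward the minimum at the origin; near convergence the RFO step approaches the Newton step, consistent with $t_{\nu}\to 1$ and $\lambda_{\nu}\to 0$.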
Xavier Prat Resina
2004-09-09