In this paper, we develop a modified gradient-based algorithm for solving the matrix equation AXB + CXᵀD = F. In contrast to the gradient-based method introduced by Xie et al. (2010), the information generated in the first half of each iteration step is fully exploited and used to construct the approximate solution. Theoretical analysis shows that the new method converges under certain assumptions. Numerical results are given to verify the efficiency of the new method. [ABSTRACT FROM AUTHOR]
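As context for the class of methods the abstract refers to, a plain gradient-based iteration for AXB + CXᵀD = F minimizes f(X) = ‖F − AXB − CXᵀD‖²_F and steps along the negative gradient Aᵀ R Bᵀ + D Rᵀ C, where R is the current residual. The sketch below is an illustrative baseline of that generic scheme, not the paper's modified algorithm; the matrices, step-size rule, and iteration count are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
A, B, C, D = (rng.standard_normal((n, n)) for _ in range(4))

# Build a consistent right-hand side from a known solution (illustrative setup).
X_true = rng.standard_normal((n, n))
F = A @ X_true @ B + C @ X_true.T @ D

# Conservative step size from spectral norms, so each step decreases the residual.
mu = 0.5 / (np.linalg.norm(A, 2) ** 2 * np.linalg.norm(B, 2) ** 2
            + np.linalg.norm(C, 2) ** 2 * np.linalg.norm(D, 2) ** 2)

X = np.zeros((n, n))
for _ in range(2000):
    R = F - A @ X @ B - C @ X.T @ D               # current residual
    X = X + mu * (A.T @ R @ B.T + D @ R.T @ C)    # negative-gradient step

print(np.linalg.norm(F - A @ X @ B - C @ X.T @ D))  # residual norm after iterating
```

With the step size bounded this way the residual norm is nonincreasing, which is the monotonicity property convergence analyses of such gradient-based schemes typically start from.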
This paper proposes some diagonal matrices that approximate the (inverse) Hessian by parts, using a variational principle analogous to the one employed in constructing quasi-Newton updates. The way we derive our approximations is inspired by the least-change secant updating approach: we let the diagonal approximation be the sum of two diagonal matrices, where the first carries information about the local Hessian, while the second is chosen so as to induce positive definiteness of the diagonal approximation as a whole. Some numerical results are also presented to illustrate the effectiveness of our approximating matrices when incorporated within the L-BFGS algorithm. [ABSTRACT FROM AUTHOR]
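The standard building block behind such variational diagonal approximations is the least-change weak-secant update: choose the diagonal matrix closest to the current one (in the Frobenius norm) subject to the weak secant equation sᵀD₊s = sᵀy. The sketch below shows that classical update only, with a simple positivity floor standing in for the paper's second diagonal matrix; the function name and safeguard are illustrative assumptions, not the authors' two-part construction.

```python
import numpy as np

def weak_secant_diagonal(d, s, y, eps=1e-8):
    """Least-change update of the diagonal entries d so that s^T D_new s = s^T y.

    Minimizing sum((d_new - d)**2) under that single constraint gives
    d_new = d + lam * s**2 with the multiplier lam computed below.
    """
    s2 = s * s
    lam = (s @ y - s2 @ d) / max(s2 @ s2, eps)
    d_new = d + lam * s2
    # Crude safeguard for positive definiteness (may perturb the secant
    # condition); the paper adds a second diagonal matrix for this purpose.
    return np.maximum(d_new, eps)

d = np.ones(3)                     # current diagonal Hessian approximation
s = np.array([1.0, 2.0, 0.5])      # step s_k = x_{k+1} - x_k
y = 2.0 * s                        # gradient change, here from a 2*I Hessian
d_new = weak_secant_diagonal(d, s, y)
```

When the floor is inactive, `s @ (d_new * s)` equals `s @ y` exactly, which is the scalar curvature condition the diagonal approximation is asked to reproduce.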
This paper focuses on developing diagonal gradient-type methods that employ an accumulative approach in multistep diagonal updating to determine a better Hessian approximation at each step. An interpolating curve is used to derive a generalization of the weak secant equation, which carries information about the local Hessian. The new parameterization of the interpolating curve in variable space is obtained by applying the accumulative approach with a norm weighting defined by two positive definite weighting matrices. We also note that the storage needed for all computations of the proposed method is only O(n). Numerical results show that the proposed algorithm is efficient and superior in comparison with some other gradient-type methods. [ABSTRACT FROM AUTHOR]
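To make the O(n)-storage claim concrete, a diagonal gradient-type method keeps only vectors: the iterate, the gradient, and the diagonal curvature estimate. The sketch below is a minimal single-step variant (weak secant update plus Armijo backtracking on an assumed quadratic test problem); the paper's multistep accumulative weighting via an interpolating curve is not reproduced here.

```python
import numpy as np

h = np.array([1.0, 10.0, 100.0])   # diagonal Hessian of the assumed test problem

def f(x):                          # f(x) = 0.5 * x^T diag(h) x
    return 0.5 * x @ (h * x)

def grad(x):
    return h * x

x = np.ones(3)
d = np.ones(3)                     # diagonal curvature estimate: O(n) storage
g = grad(x)
for _ in range(100):
    p = -g / d                     # diagonally scaled gradient direction
    t = 1.0
    while f(x + t * p) > f(x) + 1e-4 * t * (g @ p):   # Armijo backtracking
        t *= 0.5
    x_new = x + t * p
    g_new = grad(x_new)
    s, y = x_new - x, g_new - g
    s2 = s * s
    if s2 @ s2 > 1e-16:
        # Least-change diagonal update enforcing the weak secant equation
        # s^T D s = s^T y, floored to keep the scaling positive.
        d = np.maximum(d + ((s @ y - s2 @ d) / (s2 @ s2)) * s2, 1e-8)
    x, g = x_new, g_new
```

Everything stored is a length-n vector, so memory scales as O(n), in contrast to full quasi-Newton updates that require O(n²) storage.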