Adapted AZNN methods for time-varying and static matrix problems


Frank Uhlig
https://orcid.org/0000-0002-7495-5753

Abstract

We present adapted Zhang neural networks (AZNN) in which the parameter settings for the exponential decay constant $\eta$ and the length of the start-up phase of basic ZNN are adapted to the problem at hand. Specifically, we study experiments with AZNN for factoring time-varying square matrices into products of time-varying symmetric matrices and for the time-varying matrix square root problem. Differing from the generally used small $\eta$ values and minimal-length start-up phases of ZNN, we adapt the basic ZNN method to work with large or even gigantic $\eta$ settings and arbitrary-length start-ups that use Euler's low-accuracy finite difference formula. These adaptations improve the speed of AZNN's convergence and lower its solution error bounds for our chosen problems significantly, to near the machine constant or even below. Parameter-varying AZNN also allows us to find full-rank symmetrizers of static matrices reliably, for example for the Kahan and Frank matrices and for matrices with highly ill-conditioned eigenvalues and complicated Jordan structures, for dimensions from $n = 2$ on up. This helps in cases where full-rank symmetrizers of static matrices have never been computed successfully before.
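To indicate the underlying ZNN design idea for readers unfamiliar with it, the following is a minimal sketch (not the paper's AZNN code) of a basic ZNN iteration for the time-varying matrix square root problem $X(t)^2 = A(t)$: the error function $E = X^2 - A$ is forced to decay via $\dot{E} = -\eta E$, the resulting Sylvester equation is solved for $\dot{X}$, and $X$ is advanced with a forward-Euler step. The test matrix A(t), step size tau, and decay constant eta below are illustrative assumptions; the paper's adapted method instead uses problem-dependent, possibly very large $\eta$ settings and tuned start-up phases.

```python
import numpy as np
from scipy.linalg import solve_sylvester, sqrtm

# Hypothetical symmetric positive definite test matrix A(t); not from the paper.
def A(t):
    return np.array([[5.0 + np.sin(t), 1.0],
                     [1.0,             4.0 + np.cos(t)]])

def Adot(t, h=1e-6):
    # Central-difference approximation of A'(t).
    return (A(t + h) - A(t - h)) / (2.0 * h)

def znn_sqrt(t_end=2.0, tau=1e-4, eta=1.0e3):
    """Forward-Euler ZNN iteration for X(t)^2 = A(t) (illustrative sketch only)."""
    t = 0.0
    X = np.real(sqrtm(A(0.0)))              # start at the principal square root of A(0)
    while t < t_end:
        # ZNN design: with E = X^2 - A, enforce dE/dt = -eta * E, which gives the
        # Sylvester equation  X*Xdot + Xdot*X = Adot - eta*(X^2 - A)  for Xdot.
        rhs = Adot(t) - eta * (X @ X - A(t))
        Xdot = solve_sylvester(X, X, rhs)    # solves X @ Y + Y @ X = rhs for Y
        X = X + tau * Xdot                   # explicit Euler step of size tau
        t += tau
    return t, X

t, X = znn_sqrt()
print("residual ||X^2 - A(t)|| =", np.linalg.norm(X @ X - A(t)))
```

In this plain Euler setting, a larger eta makes the continuous error decay faster but also makes the differential equation stiffer, so the step size tau must shrink to keep the explicit step stable; balancing such parameter choices against accuracy is the kind of adaptation the abstract refers to.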
