Self-interlacing polynomials II: Matrices with self-interlacing spectrum


Mikhail Tyaglov

Abstract

An $n\times n$ matrix is said to have a self-interlacing spectrum if its eigenvalues $\lambda_k$, $k=1,\ldots,n$, are distributed as follows: $$ \lambda_1>-\lambda_2>\lambda_3>\cdots>(-1)^{n-1}\lambda_n>0. $$ A method for constructing sign definite matrices with self-interlacing spectra from totally nonnegative ones is presented. This method is applied to bidiagonal and tridiagonal matrices. In particular, a result by O. Holtz on the spectrum of real symmetric anti-bidiagonal matrices with positive entries is generalized.
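The following is a minimal sketch, not taken from the paper, of how the self-interlacing condition in the abstract can be tested numerically for a given ordered list of eigenvalues $\lambda_1,\ldots,\lambda_n$; the function name and examples are illustrative only.

```python
# Minimal sketch (not from the paper): check whether eigenvalues, given in the
# order lambda_1, ..., lambda_n, satisfy the self-interlacing condition
#   lambda_1 > -lambda_2 > lambda_3 > ... > (-1)^(n-1) * lambda_n > 0.
def is_self_interlacing(eigenvalues):
    # Apply alternating signs: lambda_1, -lambda_2, lambda_3, -lambda_4, ...
    signed = [(-1) ** k * lam for k, lam in enumerate(eigenvalues)]
    # The signed sequence must be strictly decreasing and end above zero.
    return all(a > b for a, b in zip(signed, signed[1:])) and signed[-1] > 0

# Example: the spectrum 5, -3, 2, -1 is self-interlacing, since 5 > 3 > 2 > 1 > 0.
print(is_self_interlacing([5, -3, 2, -1]))   # True
print(is_self_interlacing([5, -3, -2, -1]))  # False: signs do not alternate
```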
