## 1. Linear Algebra

- Prove that the eigenvalues of a unitary matrix are complex numbers of norm one.
- If $M$ is positive semidefinite, show that $v^*Mv=0$ if and only if $Mv=0$.
- Assume square matrices $A$ and $B$ are diagonalizable. Show that $(A\otimes I)+(I\otimes B)$ is diagonalizable and determine its eigenvalues in terms of those of $A$ and $B$. [$A$ and $B$ need not have the same order, so the two matrices $I$ here might not be equal.]
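The eigenvalue claim for the Kronecker sum can be checked numerically; a sketch assuming numpy, with small random symmetric (hence diagonalizable) matrices of different orders:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random real symmetric matrices are diagonalizable; different orders
# emphasize that the two identity matrices are not the same size.
A = rng.standard_normal((3, 3)); A = A + A.T
B = rng.standard_normal((4, 4)); B = B + B.T

# Kronecker sum (A ⊗ I) + (I ⊗ B): I has order 4 on the left, 3 on the right.
K = np.kron(A, np.eye(4)) + np.kron(np.eye(3), B)

# Its eigenvalues should be all sums la_r(A) + mu_s(B).
expected = np.sort([la + mu for la in np.linalg.eigvalsh(A)
                            for mu in np.linalg.eigvalsh(B)])
actual = np.sort(np.linalg.eigvalsh(K))
assert np.allclose(expected, actual)
```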
- Prove a matrix $M$ is normal if and only if $M^*$ is a polynomial in $M$.
- Assume $M$ is normal with spectral decomposition \[ M = \sum_r \la_rF_r.\] Explicitly describe a polynomial $p_r(t)$ such that $p_r(M)=F_r$.
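One candidate is the Lagrange interpolation polynomial through the points $(\la_s,\delta_{rs})$, i.e. $p_r(t)=\prod_{s\ne r}(t-\la_s)/(\la_r-\la_s)$; a numerical sanity check, assuming numpy and a diagonal example where the idempotents are known:

```python
import numpy as np

# Example normal (here: real diagonal) matrix with distinct eigenvalues 1, 2, 5;
# its spectral idempotents are coordinate projections.
M = np.diag([1.0, 1.0, 2.0, 5.0])
eigs = [1.0, 2.0, 5.0]

def p_r(M, r):
    """Evaluate the Lagrange polynomial through (la_s, delta_{rs}) at M."""
    n = M.shape[0]
    P = np.eye(n)
    for s, la in enumerate(eigs):
        if s != r:
            P = P @ (M - la * np.eye(n)) / (eigs[r] - la)
    return P

F0 = p_r(M, 0)   # should project onto the eigenspace for eigenvalue 1
assert np.allclose(F0, np.diag([1.0, 1.0, 0.0, 0.0]))
assert np.allclose(F0 @ F0, F0)   # idempotent, as a spectral projection must be
```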
- If $M$ is normal and $N$ commutes with each matrix that commutes with $M$, prove that $N$ is a polynomial in $M$. [Remark: this holds even when $M$ is not normal, but this is much harder to prove.]
- If $\seq P1m$ are orthogonal projections, show that if $\sum_r P_r=I$ then $P_rP_s=0$.
- Suppose $\seq P1m$ are positive semidefinite $n\times n$ matrices and $\sum_r \tr(P_r)=n$. Show that if $\sum_r P_r=I$, then $P_r^2=P_r$ (for all $r$) and $P_rP_s=0$ if $r\ne s$.
- If $A$ and $B$ are commuting square matrices, prove that $\exp(A+B)=\exp(A)\exp(B)$. Give an example to show that this fails if $A$ and $B$ do not commute.
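A quick numerical check of both halves, assuming numpy and scipy; the commuting pair is built as a polynomial in a single matrix, and the counterexample uses two nilpotent matrices:

```python
import numpy as np
from scipy.linalg import expm

# Commuting pair: polynomials in the same matrix always commute.
A = np.array([[1.0, 2.0], [0.0, 3.0]])
B = 2 * A + np.eye(2)
assert np.allclose(A @ B, B @ A)
assert np.allclose(expm(A + B), expm(A) @ expm(B))

# Non-commuting counterexample: nilpotent raising/lowering matrices.
X = np.array([[0.0, 1.0], [0.0, 0.0]])
Y = np.array([[0.0, 0.0], [1.0, 0.0]])
assert not np.allclose(X @ Y, Y @ X)
assert not np.allclose(expm(X + Y), expm(X) @ expm(Y))
```

Here $\exp(X)=I+X$ and $\exp(Y)=I+Y$ since $X^2=Y^2=0$, while $X+Y$ has eigenvalues $\pm1$, so the two sides differ.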
- Prove that for any square matrix $M$, we have $\det(\exp(M))=\exp(\tr(M))$. [Remark: you may assume $M$ is a matrix over $\cx$, but the result holds in much greater generality. For example, over the ring of Laurent series in $t$ with $n\times n$ matrices as coefficients.]
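A one-line numerical check of the identity over $\cx$, assuming numpy and scipy, with a random complex matrix (no special structure is needed):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(1)
# A random complex matrix; det(exp(M)) = exp(tr(M)) holds for any square M.
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
assert np.allclose(np.linalg.det(expm(M)), np.exp(np.trace(M)))
```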
- Show that a graph is regular if and only if its adjacency matrix commutes with the all-ones matrix $J$.
- [withdrawn]
- A complex matrix $M$ is *normal* if $MM^*=M^*M$. Prove that the following assertions are equivalent:
- $M$ is normal.
- $M$ is unitarily diagonalizable. ($M=L^*DL$ with $L$ unitary and $D$ diagonal.)
- $M$ has a spectral decomposition $M=\sum_r \la_r F_r$ (where the matrices $F_r$ are Hermitian and idempotent).
- For all vectors $x$ we have $\ip{Mx}{Mx}=\ip{M^*x}{M^*x}$.

- A matrix $M$ is *positive semidefinite* if it is Hermitian and $x^*Mx\ge0$ for all $x$. Prove that the following assertions are equivalent:
- $M$ is positive semidefinite.
- There is a matrix $N$ such that $M=NN^*$.
- The eigenvalues of $M$ are non-negative.

- If $M$ is positive semidefinite and $x^*Mx=0$, prove that $Mx=0$. Also prove that if $\ip{M}{xx^*}=0$, then $Mx=0$.
- If $M$ and $N$ are complex matrices of the same shape, we define their inner product $\ip{M}{N}$ to be $\tr(M^*N)$. Prove that a matrix $M$ is positive semidefinite if and only if there are vectors $\seq z1k$ such that $M=\sum_r z_rz_r^*$. Using this, prove that $M$ is positive semidefinite if and only if $\ip{M}{N}\ge0$ for all positive semidefinite matrices $N$. [This is a key fact in semidefinite optimization.]
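Both characterizations can be checked numerically; a sketch assuming numpy, building the rank-one terms from scaled eigenvectors:

```python
import numpy as np

rng = np.random.default_rng(3)

def random_psd(n):
    # M = N N^* is positive semidefinite for any N.
    N = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return N @ N.conj().T

M = random_psd(4)

# M as a sum of rank-one terms z_r z_r^*: scale the eigenvectors by sqrt(eigenvalue).
w, V = np.linalg.eigh(M)
zs = [np.sqrt(max(wr, 0.0)) * V[:, r] for r, wr in enumerate(w)]
assert np.allclose(sum(np.outer(z, z.conj()) for z in zs), M)

# <M, N> = tr(M^* N) is real and non-negative for positive semidefinite M, N.
for _ in range(5):
    N = random_psd(4)
    ip = np.trace(M.conj().T @ N)
    assert ip.real >= -1e-10 and abs(ip.imag) < 1e-10
```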
- If $X$ and $Y$ are graphs with respective adjacency matrices $A$ and $B$, their *direct product* $X\times Y$ is the graph with adjacency matrix $A\otimes B$. Show that if we have spectral decompositions \[ A =\sum_r \la_rE_r,\qquad B = \sum_s \mu_s F_s\] then the spectral decomposition of $A\otimes B$ is \[ A\otimes B = \sum_{r,s} \la_r\mu_s E_r\otimes F_s.\]

## 2. Walks

- Prove that perfect state transfer does not occur on the Cartesian product of $P_2$ with $P_3$.
- Prove that perfect state transfer does not occur on the cycle $C_6$.
- Let $X$ be a connected graph on at least two vertices. If $a\in V(X)$, show that its eigenvalue support has size at least two. Show that if equality holds, the valency of $a$ is $|V(X)|-1$ and $X\setminus a$ is regular.
- Look up spectral radius and Perron vector. Show that the eigenvalue support of a vertex in a connected graph must contain the spectral radius.
- If $X$ is bipartite, prove that the eigenvalue support of a vertex is closed under multiplication by $-1$.
- The *eccentricity* of a vertex $a$ is the maximum distance from $a$ to a vertex of $X$. Prove that the eccentricity of $a$, plus 1, is a lower bound on the size of the eigenvalue support of $a$. [It might help to look up a proof that the diameter, plus 1, is a lower bound on the number of distinct eigenvalues of $A$.]
- Suppose $a$ and $b$ are distinct cospectral vertices in the graph $X$. Show that if all eigenvalues of $X$ are simple, then $a$ and $b$ are strongly cospectral.
- Show that the continuous quantum walk on $X$ starting at $a$ is periodic at time $t$ if and only if $U(t)$ commutes with the matrix $e_ae_a^T$.
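For a concrete illustration (a sketch assuming numpy and scipy, with the usual transition matrix $U(t)=\exp(\mathrm{i}tA)$): the cycle $C_4$ has eigenvalues $2,0,0,-2$, so $U(\pi)=I$ and the walk is periodic at $\pi$, where $U$ trivially commutes with $e_ae_a^T$; at a generic time the commutator is nonzero.

```python
import numpy as np
from scipy.linalg import expm

# Adjacency matrix of the 4-cycle C_4; its eigenvalues are 2, 0, 0, -2.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)

def U(t):
    return expm(1j * t * A)

e0 = np.zeros(4); e0[0] = 1.0
D = np.outer(e0, e0)          # the matrix e_a e_a^T for a = 0

# At t = pi every eigenvalue phase e^{i*pi*theta} equals 1, so U(pi) = I:
assert np.allclose(U(np.pi), np.eye(4))
# hence the walk is periodic at pi, and U(pi) commutes with e_a e_a^T.
assert np.allclose(U(np.pi) @ D, D @ U(np.pi))
# At a generic time U(t) does not commute with e_a e_a^T:
t = 0.3
assert not np.allclose(U(t) @ D, D @ U(t))
```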
- Suppose $X$ is connected and the eigenvalue support of the vertex $a$ is closed under multiplication by $-1$. Prove that if $k$ is odd, the number of closed walks of length $k$ that start at $a$ is zero. Deduce from this that $X$ is bipartite.
- Let $W$ be the subspace of $\re^n$ spanned by the vectors $A^ke_a$ (the *walk module* generated by $e_a$). The *minimal polynomial of $A$ relative to $e_a$* is the monic polynomial $\psi(t)$ of least degree such that $\psi(A)e_a=0$. Prove that $\psi$ is unique, show that its zeros are simple, and that the set of zeros of $\psi$ is the eigenvalue support of $a$. [Remark: we can compute $\psi$ in polynomial time.]
- Prove that if $a$ and $b$ are cospectral vertices, then $(A^k)_{a,a}=(A^k)_{b,b}$ for all $k$. [One approach is to show first that $(E_r)_{a,a}=(E_r)_{b,b}$ for all $r$.]
- Prove that if $a$ and $b$ are strongly cospectral vertices, there is an orthogonal matrix $Q$ such that:
- $Qe_a=e_b$.
- $Q^2=I$.
- $Q$ is a polynomial in $A$.

- Assume $X$ is regular and that its complement is connected. Prove that if $a$ and $b$ are strongly cospectral in $X$, they are strongly cospectral in $\bar X$.
- Let $U(t)$ be the transition matrix for the path $P_5$, with vertex set $\{0,\ldots,4\}$. Show that if $t>0$, then $U(t)_{0,4}$ is not zero. (The characteristic polynomial of $P_5$ is $t(t^2-1)(t^2-3)$.)
- Show that the continuous quantum walk on $P_3$ relative to the Laplacian does not admit pretty good state transfer (pgst).
- If the (real) matrix $B$ is tridiagonal and $B_{i,i+1}B_{i+1,i}>0$ for each $i$, prove that there is a real diagonal matrix $D$ such that $D^{-1}BD$ is symmetric.
- Let $A$ be a real symmetric matrix of order $n$, and assume $u$ is a vector in $\re^n$ such that the space $U$ spanned by the vectors $A^ru$ for $r\ge0$ has dimension $m$. Deduce that the vectors \[ u, Au,\ldots,A^{m-1}u \] are linearly independent. Show that the matrix that represents the action of $A$ on $U$ relative to this basis is a companion matrix (google). Now let \[ u_0,\ldots,u_{m-1} \] be the orthogonal basis for $U$ we get by applying the Gram-Schmidt process to our first basis. Prove that the matrix that represents the action of $A$ on $U$ relative to this basis is tridiagonal.
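This construction can be seen in a small example; a sketch assuming numpy, using the 4-cycle $C_4$ with $u=e_0$, where $m=3<n=4$ (the eigenvalue support of a vertex of $C_4$ is $\{2,0,-2\}$), and QR as a stand-in for Gram-Schmidt:

```python
import numpy as np

# Adjacency matrix of the 4-cycle C_4 and u = e_0.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
u = np.array([1.0, 0.0, 0.0, 0.0])

# Krylov basis u, Au, A^2 u; adding A^3 u does not increase the rank, so m = 3.
K = np.column_stack([np.linalg.matrix_power(A, r) @ u for r in range(3)])
assert np.linalg.matrix_rank(K) == 3
assert np.linalg.matrix_rank(np.column_stack([K, A @ A @ A @ u])) == 3

# Gram-Schmidt (here via QR) gives an orthonormal basis u_0, u_1, u_2 of U,
# and the matrix representing the action of A on U in this basis is tridiagonal.
Q, _ = np.linalg.qr(K)
T = Q.T @ A @ Q
expected = np.array([[0, np.sqrt(2), 0],
                     [np.sqrt(2), 0, np.sqrt(2)],
                     [0, np.sqrt(2), 0]])
# QR may flip signs of basis vectors, so compare entrywise absolute values.
assert np.allclose(np.abs(T), expected)
```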