# Continuous Quantum Walks: Problems

## 1. Linear Algebra

1. Prove that the eigenvalues of a unitary matrix are complex numbers of norm one.
2. If $M\pge0$, show that $v^*Mv=0$ if and only if $Mv=0$.
3. Assume square matrices $A$ and $B$ are diagonalizable. Show that $(A\otimes I)+(I\otimes B)$ is diagonalizable and determine its eigenvalues in terms of those of $A$ and $B$. [$A$ and $B$ need not have the same order, so the two matrices $I$ here might not be equal.]
4. Prove a matrix $M$ is normal if and only if $M^*$ is a polynomial in $M$.
5. Assume $M$ is normal with spectral decomposition $M = \sum_r \la_rF_r.$ Explicitly describe a polynomial $p_r(t)$ such that $p_r(M)=F_r$.
6. If $M$ is normal and $N$ commutes with each matrix that commutes with $M$, prove that $N$ is a polynomial in $M$. [Remark: this holds even when $M$ is not normal, but this is much harder to prove.]
7. If $\seq P1m$ are orthogonal projections, show that if $\sum_r P_r=I$, then $P_rP_s=0$ whenever $r\ne s$.
8. Suppose $\seq P1m$ are positive semidefinite $n\times n$ matrices and $\sum_r \tr(P_r)=n$. Show that if $\sum_r P_r=I$, then $P_r^2=P_r$ (for all $r$) and $P_rP_s=0$ if $r\ne s$.
9. If $A$ and $B$ are commuting square matrices, prove that $\exp(A+B)=\exp(A)\exp(B)$. Give an example to show that this fails if $A$ and $B$ do not commute.
10. Prove that for any square matrix $M$, we have $\det(\exp(M))=\exp(\tr(M))$. [Remark: you may assume $M$ is a matrix over $\cx$, but the result holds in much greater generality. For example, over the ring of Laurent series in $t$ with $n\times n$ matrices as coefficients.]
11. Show that a graph is regular if and only if its adjacency matrix commutes with the all-ones matrix $J$.
12. [withdrawn]
13. A complex matrix $M$ is normal if $MM^*=M^*M$. Prove that the following assertions are equivalent:
    1. $M$ is normal.
    2. $M$ is unitarily diagonalizable. ($M=L^*DL$ with $L$ unitary and $D$ diagonal.)
    3. $M$ has a spectral decomposition $M=\sum_r \la_r F_r$ (where the matrices $F_r$ are Hermitian and idempotent).
    4. For all vectors $x$ we have $\ip{Mx}{Mx}=\ip{M^*x}{M^*x}$.
14. A matrix $M$ is positive semidefinite if it is Hermitian and $x^*Mx\ge0$ for all $x$. Prove that the following assertions are equivalent:
    1. $M$ is positive semidefinite.
    2. There is a matrix $N$ such that $M=NN^*$.
    3. The eigenvalues of $M$ are non-negative.
15. If $M$ is positive semidefinite and $x^*Mx=0$, prove that $Mx=0$. Also prove that if $\ip{M}{xx^*}=0$, then $Mx=0$.
16. If $M$ and $N$ are complex matrices of the same shape, we define their inner product $\ip{M}{N}$ to be $\tr(M^*N)$. Prove that a matrix $M$ is positive semidefinite if and only if there are vectors $\seq z1k$ such that $M=\sum_r z_rz_r^*$. Using this, prove that $M$ is positive semidefinite if and only if $\ip{M}{N}\ge0$ for all positive semidefinite matrices $N$. [This is a key fact in semidefinite optimization.]
17. If $X$ and $Y$ are graphs with respective adjacency matrices $A$ and $B$, their direct product $X\times Y$ is the graph with adjacency matrix $A\otimes B$. Show that if we have spectral decompositions $A =\sum_r \la_rE_r,\qquad B = \sum_s \mu_s F_s$ then the spectral decomposition of $A\otimes B$ is $A\otimes B = \sum_{r,s} \la_r\mu_s E_r\otimes F_s.$
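As a numerical sanity check (not a proof), the identities in problems 10 and 17 are easy to test on random symmetric matrices with numpy; the random seed and matrix sizes below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Problem 10 check: det(exp(M)) = exp(tr(M)).
# A symmetric M makes exp(M) easy via the spectral decomposition M = V diag(w) V^T.
M = rng.standard_normal((4, 4))
M = M + M.T
w, V = np.linalg.eigh(M)
expM = V @ np.diag(np.exp(w)) @ V.T
assert np.isclose(np.linalg.det(expM), np.exp(np.trace(M)))

# Problem 17 check: the eigenvalues of A (x) B are the products la_r * mu_s.
A = rng.standard_normal((3, 3)); A = A + A.T
B = rng.standard_normal((2, 2)); B = B + B.T
lam = np.linalg.eigvalsh(A)
mu = np.linalg.eigvalsh(B)
products = np.sort(np.outer(lam, mu).ravel())
kron_eigs = np.sort(np.linalg.eigvalsh(np.kron(A, B)))
assert np.allclose(products, kron_eigs)
print("checks passed")
```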

## 2. Walks

1. Prove that perfect state transfer does not occur on the Cartesian product of $P_2$ with $P_3$.
2. Prove that perfect state transfer does not occur on the cycle $C_6$.
3. Let $X$ be a connected graph on at least two vertices. If $a\in V(X)$, show that its eigenvalue support has size at least two. Show that if equality holds, the valency of $a$ is $|V(X)|-1$ and $X\setminus a$ is regular.
4. Look up spectral radius and Perron vector. Show that the eigenvalue support of a vertex in a connected graph must contain the spectral radius.
5. If $X$ is bipartite, prove that the eigenvalue support of a vertex is closed under multiplication by $-1$.
6. The eccentricity of a vertex $a$ is the maximum distance from $a$ of a vertex in $X$. Prove that the eccentricity of $a$, plus 1, is a lower bound on the size of the eigenvalue support of $a$. [It might help to look up a proof that the diameter, plus 1, is a lower bound on the number of distinct eigenvalues of $A$.]
7. Suppose $a$ and $b$ are distinct cospectral vertices in the graph $X$. Show that if all eigenvalues of $X$ are simple, $a$ and $b$ are strongly cospectral.
8. Show that the continuous quantum walk on $X$ starting at $a$ is periodic at time $t$ if and only if $U(t)$ commutes with the matrix $e_ae_a^T$.
9. Suppose $X$ is connected and the eigenvalue support of the vertex $a$ is closed under multiplication by $-1$. Prove that if $k$ is odd, the number of closed walks of length $k$ that start at $a$ is zero. Deduce from this that $X$ is bipartite.
10. Let $W$ be the subspace of $\re^n$ spanned by the vectors $A^ke_a$ (the walk module generated by $e_a$). The minimal polynomial of $A$ relative to $e_a$ is the monic polynomial $\psi(t)$ of least degree such that $\psi(A)e_a=0$. Prove that $\psi$ is unique, show that its zeros are simple, and that the set of zeros of $\psi$ is the eigenvalue support of $a$. [Remark: we can compute $\psi$ in polynomial time.]
11. Prove that if $a$ and $b$ are cospectral vertices, then $(A^k)_{a,a}=(A^k)_{b,b}$ for all $k$. [One approach is to show first that $(E_r)_{a,a}=(E_r)_{b,b}$ for all $r$.]
12. Prove that if $a$ and $b$ are strongly cospectral vertices, there is an orthogonal matrix $Q$ such that
    1. $Qe_a=e_b$.
    2. $Q^2=I$.
    3. $Q$ is a polynomial in $A$.
13. Assume $X$ is regular and that its complement is connected. Prove that if $a$ and $b$ are strongly cospectral in $X$, they are strongly cospectral in $\bar X$.
14. Let $U(t)$ be the transition matrix for the path $P_5$, with vertex set $\{0,\ldots,4\}$. Show that $|U(t)_{0,4}|<1$ for all $t$; that is, perfect state transfer does not occur between the end-vertices of $P_5$. (The characteristic polynomial of $P_5$ is $t(t^2-1)(t^2-3)$.)
15. Show that the continuous quantum walk on $P_3$ relative to the Laplacian does not admit pretty good state transfer (pgst).
16. If the (real) matrix $B$ is tridiagonal and $B_{i,i+1}B_{i+1,i}>0$ for each $i$, prove that there is a real diagonal matrix $D$ such that $D^{-1}BD$ is symmetric.
17. Let $A$ be a real symmetric matrix of order $n$, and assume $u$ is a vector in $\re^n$ such that the space $U$ spanned by the vectors $A^ru$ for $r\ge0$ has dimension $m$. Deduce that the vectors $u, Au,\ldots,A^{m-1}u$ are linearly independent. Show that the matrix that represents the action of $A$ on $U$ relative to this basis is a companion matrix (look this up). Now let $u_0,\ldots,u_{m-1}$ be the orthogonal basis for $U$ we get by applying the Gram-Schmidt process to our first basis. Prove that the matrix that represents the action of $A$ on $U$ relative to this basis is tridiagonal.
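The last two problems can also be checked numerically (again, a sanity check under arbitrary random choices, not a proof). For problem 16 one candidate scaling is $d_0=1$ and $d_{i+1}=d_i\sqrt{B_{i+1,i}/B_{i,i+1}}$; for problem 17, applying Gram-Schmidt to $u, Au,\ldots,A^{m-1}u$ is the Lanczos process, and the resulting matrix for $A$ should come out tridiagonal.

```python
import numpy as np

rng = np.random.default_rng(1)

# Problem 16 check: a tridiagonal B with B[i,i+1]*B[i+1,i] > 0 is diagonally
# similar to a symmetric matrix via d_0 = 1, d_{i+1} = d_i*sqrt(sub_i/sup_i).
n = 5
sup = rng.uniform(0.5, 2.0, n - 1)        # superdiagonal entries
sub = rng.uniform(0.5, 2.0, n - 1)        # subdiagonal entries, same sign
B = np.diag(rng.standard_normal(n)) + np.diag(sup, 1) + np.diag(sub, -1)
d = np.ones(n)
for i in range(n - 1):
    d[i + 1] = d[i] * np.sqrt(sub[i] / sup[i])
D = np.diag(d)
S = np.linalg.inv(D) @ B @ D
assert np.allclose(S, S.T)

# Problem 17 check: Gram-Schmidt on u, Au, ..., A^{m-1}u (the Lanczos process)
# represents A on the walk module by a tridiagonal matrix.  A random symmetric
# A and random u generically give a walk module of full dimension.
A = rng.standard_normal((6, 6)); A = A + A.T
u = rng.standard_normal(6)
Q = [u / np.linalg.norm(u)]
for _ in range(5):
    v = A @ Q[-1]
    for q in Q:                 # orthogonalize against the earlier basis vectors
        v = v - (q @ v) * q
    Q.append(v / np.linalg.norm(v))
Q = np.array(Q).T               # orthonormal columns u_0, ..., u_5
T = Q.T @ A @ Q                 # matrix of A on the walk module
assert np.allclose(T, np.triu(np.tril(T, 1), -1), atol=1e-8)
print("checks passed")
```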