KU Math 725: Lecture 2/11/2026
[theorem=%counter%] Let $n \ge 2$ and let $G$ be a loopless graph on vertex set $[n]$. Let $i, j \in V(G)$ and let $L^{i,j}$ be the reduced Laplacian matrix obtained by deleting the $i$th row and $j$th column of $L = L(G)$. Then
- $\tau(G) = (-1)^{i+j}\det(L^{i,j})$.
- Suppose $G$ is connected and $\lambda_1, \ldots, \lambda_{n-1}$ are the nonzero eigenvalues of $L$. Then $\tau(G) = (\lambda_1 \cdots \lambda_{n-1})/n$. [/theorem]
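Before the proof, a quick sanity check of both formulas on a small example. This is a minimal sketch, assuming numpy and using $K_4$ as the test graph (my choices, not from the lecture); Cayley's formula gives $\tau(K_4) = 4^{4-2} = 16$.

```python
import numpy as np

# Laplacian of K_4: degree 3 on the diagonal, -1 in every off-diagonal entry.
L = 3 * np.eye(4) - (np.ones((4, 4)) - np.eye(4))

# Claim 1 with i = j = 1: delete the first row and column and take the determinant.
reduced = np.delete(np.delete(L, 0, axis=0), 0, axis=1)
print(round(np.linalg.det(reduced)))   # 16

# Claim 2: product of the nonzero eigenvalues of L, divided by n.
eigs = np.linalg.eigvalsh(L)           # sorted ascending; eigs[0] is (numerically) 0
print(round(np.prod(eigs[1:]) / 4))    # 16
```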
[proof] Proceed by double induction on $n = n(G)$ and $r = e(G)$. The base case is $n = 2$: then $G$ consists of $r$ parallel edges, any one of which is a spanning tree, so $\tau(G) = r$. We also have
\[L(G) = \mqty[r & -r \\ -r & r].\]Both claims now hold for every relevant $r$.
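Spelled out, for the first claim with, say, $i = 1$ and $j = 2$,
\[(-1)^{1+2}\det L^{1,2} = (-1)^3 \cdot (-r) = r = \tau(G),\]and for the second, the eigenvalues of $L(G)$ are $0$ and $2r$, so $\lambda_1/n = 2r/2 = r$ whenever $G$ is connected.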
Now consider the inductive step: $n > 2$ and $r > 0$. Assume the result holds for all graphs with fewer than $n$ vertices, or with $n$ vertices and fewer than $r$ edges. Let $e \in E(G)$ and assume WLOG that the endpoints of $e$ are $1$ and $n$. Consider $L(G)$ and $L(G - e)$. We have
\[l_{i,j}^{G-e} = \begin{cases} l_{i,j}^G - 1 & \text{ if } i = j = 1 \text{ or } i = j = n, \\ l_{i,j}^G + 1 & \text{ if } i = 1, j = n \text{ or } i = n, j = 1, \\ l_{i,j}^G & \text{ otherwise}. \end{cases}\]When we delete the $n$th row and column, we obtain a reduced Laplacian that differs in only one entry from the original. In particular,
\[l_{1,1}^G = l_{1,1}^{G-e} + 1.\]If we evaluate $\det L^{n,n}(G)$ and $\det L^{n,n}(G-e)$ by expanding along the top row, the difference is
\[\det L^{n,n}(G) - \det L^{n,n}(G - e) = \det\mqty[ l_{2,2}^G & \cdots & l_{2,n-1}^G \\ \vdots & & \vdots \\ l_{n-1,2}^G & \cdots & l_{n-1,n-1}^G ].\]We claim that the RHS is $\det L^{n-1,n-1}(G/e)$. Indeed, the entries of $L(G/e)$ indexed by the non-merged vertices $2, \ldots, n-1$ agree with those of $L(G)$: the degrees of non-merged vertices and the edges between non-merged vertices are not affected by contraction. Deleting the row and column of the merged vertex therefore leaves exactly the matrix on the right-hand side. So then we have
\[\begin{align*} \det L^{n,n}(G) &= \det L^{n,n}(G - e) + \det L^{n-1,n-1}(G/e) \\ &= \tau(G - e) + \tau(G/e) \\ &= \tau(G), \end{align*}\]where the second equality uses the inductive hypothesis ($G - e$ has $n$ vertices and $r - 1$ edges, and $G/e$ has $n - 1$ vertices) and the last equality is deletion-contraction. [/proof]
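The deletion-contraction identity $\tau(G) = \tau(G - e) + \tau(G/e)$ that drives the inductive step can also be run directly as a recursion. Here is a minimal sketch, assuming a multigraph given as an edge list; the function name `tau` and this representation are my own choices, not the lecture's.

```python
def tau(edges, vertices):
    """Count spanning trees via tau(G) = tau(G - e) + tau(G / e)."""
    edges = [e for e in edges if e[0] != e[1]]   # loops never lie in a spanning tree
    if len(vertices) == 1:
        return 1                                 # a single vertex is its own spanning tree
    if not edges:
        return 0                                 # more than one vertex, no edges: disconnected
    (u, v), rest = edges[0], edges[1:]
    deleted = rest                               # G - e
    contracted = [(u if a == v else a, u if b == v else b) for a, b in rest]  # G / e: merge v into u
    return tau(deleted, vertices) + tau(contracted, vertices - {v})

# Agrees with the theorem on K_4: 16 spanning trees.
V = {0, 1, 2, 3}
print(tau([(a, b) for a in V for b in V if a < b], V))   # 16
```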
An alternative proof can be given using the following theorem.
[theorem=%counter% (Cauchy-Binet)] Let $m \ge p$, $A \in \RR^{p\times m}$, and $B \in \RR^{m\times p}$. For $S \subseteq [m]$ and $|S| = p$ let
\[\begin{align*} A_S &= p\times p \text{ submatrix of } A \text{ with columns indexed by } S, \\ B_S &= p\times p \text{ submatrix of } B \text{ with rows indexed by } S. \end{align*}\]Then
\[\det AB = \sum_{\substack{S \subseteq [m] \\ |S| = p}} (\det A_S)(\det B_S).\][/theorem]
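A quick numerical spot-check of the identity, as a minimal sketch assuming numpy; the small random integer matrices are my own choice, and $B_S$ is taken as the rows of $B$ indexed by $S$.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)
p, m = 3, 5
A = rng.integers(-3, 4, size=(p, m)).astype(float)   # p x m
B = rng.integers(-3, 4, size=(m, p)).astype(float)   # m x p

lhs = np.linalg.det(A @ B)
rhs = sum(np.linalg.det(A[:, list(S)]) * np.linalg.det(B[list(S), :])
          for S in combinations(range(m), p))
print(np.isclose(lhs, rhs))                           # True
```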
[proof=Proof of Theorem via Cauchy-Binet] Let $N$ be the reduced incidence matrix formed by deleting the top row of the incidence matrix. One can show that $NN^T = L^{1,1}$. Let $S$ be a set of $n - 1$ edges of $G$ and consider the corresponding columns of $N$. Then $S$ either contains a cycle, or it is acyclic with $n - 1$ edges; in the acyclic case, $S$ is a spanning tree.
If $S$ contains a cycle, then Theorem 1.40 implies that $(B(G))_S$ is $n\times(n-1)$ with rank less than $n - 1$. Hence every minor of size $n-1$ is zero. Thus, $\det N_S = 0$.
If $S$ is acyclic, hence a spanning tree, then we claim that $\det N_S = \pm 1$. The tree corresponding to $S$ has at least two leaves, so let $\ell$ be the index of a leaf with $\ell \neq 1$ (recall that row $1$ was deleted from $N$). Then row $\ell$ of $N_S$ contains only a single nonzero entry, which is $\pm 1$.
If we expand the determinant of $N_S$ along this row, we get
\[\det N_S = \pm \det N_S'\]where $N_S'$ is the submatrix of $N_S$ obtained by deleting row $\ell$ and the column of the unique edge of $S$ incident to the leaf $\ell$. Continue this process inductively, each time expressing the determinant as $\pm 1$ times the determinant of a smaller matrix. The induction stops at a tree with two vertices ($K_2$), where the determinant is $\pm 1$.
Now apply Cauchy-Binet, with $p = n - 1$, $m = r$, $A = N$, $B = N^T$. Then
\[\det L^{1,1}(G) = \det NN^T = \sum_{\substack{S \subseteq E(G) \\ |S| = n-1}} (\det N_S)(\det N_S^T) = \sum_{S} (\det N_S)^2 = \sum_{S} n_S,\]where $n_S = 1$ if $S$ is a spanning tree and $n_S = 0$ otherwise. Hence $\det L^{1,1}(G) = \tau(G)$. [/proof]
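To see all the ingredients of this argument at once, here is a minimal computational sketch for $K_4$, assuming numpy, a signed incidence matrix with each edge oriented from its smaller to its larger endpoint, and vertices labeled $0, \ldots, 3$ with the row of vertex $0$ deleted; these conventions are my own choices.

```python
import numpy as np
from itertools import combinations

# Signed incidence matrix of K_4 (edge columns oriented low -> high),
# then delete the row of vertex 0 to get the reduced incidence matrix N.
V = range(4)
E = [(a, b) for a in V for b in V if a < b]
B = np.zeros((4, len(E)))
for k, (a, b) in enumerate(E):
    B[a, k], B[b, k] = 1.0, -1.0
N = B[1:, :]

L = 3 * np.eye(4) - (np.ones((4, 4)) - np.eye(4))    # Laplacian of K_4
print(np.allclose(N @ N.T, L[1:, 1:]))               # True: N N^T is the reduced Laplacian

# Each (n-1)-subset S of edges contributes (det N_S)^2: 1 for spanning trees, 0 otherwise.
total = sum(round(np.linalg.det(N[:, list(S)]) ** 2)
            for S in combinations(range(len(E)), 3))
print(total)                                         # 16 = tau(K_4)
```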