
Eigenvalues of 2×2 matrix: another identity seen on twitter.

December 11, 2024 math and physics play


Here’s another interesting-looking twitter math post, this time about 2×2 matrix eigenvalues:

Theorem 1.1: Eigenvalues of a 2×2 matrix.

Let \( m \) be the mean of the diagonal elements, and \( p \) be the determinant. The eigenvalues of the matrix are given by
\begin{equation*}
m \pm \sqrt{ m^2 - p }.
\end{equation*}
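For example, with
\begin{equation*}
A =
\begin{bmatrix}
2 & 1 \\
1 & 2
\end{bmatrix},
\end{equation*}
we have \( m = 2 \) and \( p = 4 - 1 = 3 \), so the eigenvalues are \( 2 \pm \sqrt{ 4 - 3 } \), that is, \( 3 \) and \( 1 \).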

This is also not hard to verify.

Start proof:

Let
\begin{equation}\label{eqn:2x2eigen:20}
A =
\begin{bmatrix}
a & b \\
c & d
\end{bmatrix},
\end{equation}
where we are looking for the \( \lambda \) values that satisfy the usual zero determinant condition
\begin{equation}\label{eqn:2x2eigen:40}
\begin{aligned}
0
&= \Abs{ A - \lambda I } \\
&=
\begin{vmatrix}
a - \lambda & b \\
c & d - \lambda
\end{vmatrix} \\
&=
\lr{ a - \lambda } \lr{ d - \lambda } - b c \\
&=
a d - b c - \lambda \lr{ a + d } + \lambda^2 \\
&=
\mathrm{Det}{A} - \lambda \mathrm{Tr}{A} + \lambda^2 \\
&=
\lr{ \lambda - \frac{\mathrm{Tr}{A}}{2} }^2 + \mathrm{Det}{A} - \lr{ \frac{\mathrm{Tr}{A}}{2}}^2,
\end{aligned}
\end{equation}
so
\begin{equation}\label{eqn:2x2eigen:n}
\lambda = \frac{\mathrm{Tr}{A}}{2} \pm \sqrt{ \lr{ \frac{\mathrm{Tr}{A}}{2}}^2 - \mathrm{Det}{A} }.
\end{equation}
Substitution of the variables from the problem statement finishes the proof.

End proof.

Clearly the higher dimensional characteristic equations will also have trace and determinant dependencies, but the cross terms will be messier (and nobody wants to solve cubic or higher order equations by hand anyway.)
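For anybody who wants a sanity check without grinding through the algebra, here is a little numpy sketch (mine, not from the original tweet) that compares the \( m \pm \sqrt{m^2 - p} \) formula against a general eigenvalue solver:

import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(2, 2))              # a random real 2x2 matrix

m = np.trace(A) / 2                      # mean of the diagonal elements
p = np.linalg.det(A)                     # determinant

# closed form: m +- sqrt(m^2 - p), allowing a complex discriminant
disc = np.sqrt(complex(m * m - p))
closed_form = np.array([m + disc, m - disc])

# general eigenvalue solver for comparison
direct = np.linalg.eigvals(A)

print(np.allclose(np.sort_complex(closed_form), np.sort_complex(direct)))  # True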

bra-ket manipulation problems

July 22, 2015 phy1520


Some bra-ket manipulation problems ([1] pr. 1.4).

Using bra-ket logic, expand

(a)

\begin{equation}\label{eqn:braketManip:20}
\textrm{tr}{X Y}
\end{equation}

(b)

\begin{equation}\label{eqn:braketManip:40}
(X Y)^\dagger
\end{equation}

(c)

\begin{equation}\label{eqn:braketManip:60}
e^{i f(A)},
\end{equation}

where \( A \) is Hermitian with a complete set of eigenvalues.

(d)

\begin{equation}\label{eqn:braketManip:80}
\sum_{a'} \Psi_{a'}(\Bx')^\conj \Psi_{a'}(\Bx''),
\end{equation}

where \( \Psi_{a'}(\Bx') = \braket{\Bx'}{a'} \).

Answers

(a)

\begin{equation}\label{eqn:braketManip:100}
\begin{aligned}
\textrm{tr}{X Y}
&= \sum_a \bra{a} X Y \ket{a} \\
&= \sum_{a,b} \bra{a} X \ket{b}\bra{b} Y \ket{a} \\
&= \sum_{a,b}
\bra{b} Y \ket{a}
\bra{a} X \ket{b} \\
&= \sum_{b}
\bra{b} Y
X \ket{b} \\
&= \textrm{tr}{ Y X }.
\end{aligned}
\end{equation}
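As a sanity check (my own sketch, not part of the problem), the cyclic property is easy to verify numerically with a pair of random complex matrices:

import numpy as np

rng = np.random.default_rng(1)
n = 4
X = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
Y = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))

# tr(X Y) should equal tr(Y X)
print(np.allclose(np.trace(X @ Y), np.trace(Y @ X)))  # True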

(b)

\begin{equation}\label{eqn:braketManip:120}
\begin{aligned}
\bra{a} \lr{ X Y}^\dagger \ket{b}
&=
\lr{ \bra{b} X Y \ket{a} }^\conj \\
&=
\sum_c \lr{ \bra{b} X \ket{c}\bra{c} Y \ket{a} }^\conj \\
&=
\sum_c \lr{ \bra{b} X \ket{c} }^\conj \lr{ \bra{c} Y \ket{a} }^\conj \\
&=
\sum_c
\lr{ \bra{c} Y \ket{a} }^\conj
\lr{ \bra{b} X \ket{c} }^\conj \\
&=
\sum_c
\bra{a} Y^\dagger \ket{c}
\bra{c} X^\dagger \ket{b} \\
&=
\bra{a} Y^\dagger
X^\dagger \ket{b},
\end{aligned}
\end{equation}

so \( \lr{ X Y }^\dagger = Y^\dagger X^\dagger \).
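Again, a quick numerical spot check of this identity (a sketch of my own, using random complex matrices):

import numpy as np

rng = np.random.default_rng(2)
n = 4
X = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
Y = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))

# (X Y)^dagger should equal Y^dagger X^dagger
print(np.allclose((X @ Y).conj().T, Y.conj().T @ X.conj().T))  # True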

(c)

Let’s presume that the function \( f \) has a Taylor series representation

\begin{equation}\label{eqn:braketManip:140}
f(A) = \sum_r b_r A^r.
\end{equation}

If the eigenvalues of \( A \) are given by

\begin{equation}\label{eqn:braketManip:160}
A \ket{a_s} = a_s \ket{a_s},
\end{equation}

this operator can be expanded as

\begin{equation}\label{eqn:braketManip:180}
\begin{aligned}
A
&= \sum_{a_s} A \ket{a_s} \bra{a_s} \\
&= \sum_{a_s} a_s \ket{a_s} \bra{a_s}.
\end{aligned}
\end{equation}

To compute powers of this operator, consider first the square

\begin{equation}\label{eqn:braketManip:200}
\begin{aligned}
A^2
&=
\sum_{a_s} a_s \ket{a_s} \bra{a_s}
\sum_{a_r} a_r \ket{a_r} \bra{a_r} \\
&=
\sum_{a_s, a_r} a_s a_r \ket{a_s} \bra{a_s} \ket{a_r} \bra{a_r} \\
&=
\sum_{a_s, a_r} a_s a_r \ket{a_s} \delta_{s r} \bra{a_r} \\
&=
\sum_{a_s} a_s^2 \ket{a_s} \bra{a_s}.
\end{aligned}
\end{equation}

The pattern for higher powers will clearly just be

\begin{equation}\label{eqn:braketManip:220}
A^k =
\sum_{a_s} a_s^k \ket{a_s} \bra{a_s},
\end{equation}

so the expansion of \( f(A) \) will be

\begin{equation}\label{eqn:braketManip:240}
\begin{aligned}
f(A)
&= \sum_r b_r A^r \\
&= \sum_r b_r
\sum_{a_s} a_s^r \ket{a_s} \bra{a_s} \\
&=
\sum_{a_s} \lr{ \sum_r b_r a_s^r } \ket{a_s} \bra{a_s} \\
&=
\sum_{a_s} f(a_s) \ket{a_s} \bra{a_s}.
\end{aligned}
\end{equation}

The exponential expansion is

\begin{equation}\label{eqn:braketManip:260}
\begin{aligned}
e^{i f(A)}
&=
\sum_t \frac{i^t}{t!} f^t(A) \\
&=
\sum_t \frac{i^t}{t!}
\lr{ \sum_{a_s} f(a_s) \ket{a_s} \bra{a_s} }^t \\
&=
\sum_t \frac{i^t}{t!}
\sum_{a_s} f^t(a_s) \ket{a_s} \bra{a_s} \\
&=
\sum_{a_s}
e^{i f(a_s) }
\ket{a_s} \bra{a_s}.
\end{aligned}
\end{equation}
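Here is a numerical sketch of this last result, assuming a simple polynomial \( f \) so that \( f(A) \) can also be computed directly as a matrix polynomial. It builds a random Hermitian \( A \), evaluates \( e^{i f(A)} \) with scipy's matrix exponential, and compares that against the spectral sum \( \sum_{a_s} e^{i f(a_s)} \ket{a_s} \bra{a_s} \):

import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(3)
n = 4
M = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
A = (M + M.conj().T) / 2                  # a random Hermitian matrix

def f_matrix(X):
    return X @ X + 2 * X                  # f(A) as a matrix polynomial

def f_scalar(x):
    return x * x + 2 * x                  # the same f applied to an eigenvalue

# direct evaluation of e^{i f(A)}
direct = expm(1j * f_matrix(A))

# spectral evaluation: sum_s e^{i f(a_s)} |a_s><a_s|
evals, evecs = np.linalg.eigh(A)
spectral = sum(
    np.exp(1j * f_scalar(a)) * np.outer(v, v.conj())
    for a, v in zip(evals, evecs.T)
)

print(np.allclose(direct, spectral))  # True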

(d)

\begin{equation}\label{eqn:braketManip:n}
\begin{aligned}
\sum_{a'} \Psi_{a'}(\Bx')^\conj \Psi_{a'}(\Bx'')
&=
\sum_{a'}
\braket{\Bx'}{a'}^\conj
\braket{\Bx''}{a'} \\
&=
\sum_{a'}
\braket{a'}{\Bx'}
\braket{\Bx''}{a'} \\
&=
\sum_{a'}
\braket{\Bx''}{a'}
\braket{a'}{\Bx'} \\
&=
\braket{\Bx''}{\Bx'} \\
&= \delta(\Bx'' - \Bx').
\end{aligned}
\end{equation}
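The continuum version above isn't something to verify numerically as is, but the discrete analogue is: for a finite orthonormal basis \( \{ \ket{a'} \} \) of \( \mathbb{C}^n \), the same manipulation gives \( \sum_{a'} \braket{\Bx'}{a'}^\conj \braket{\Bx''}{a'} = \delta_{\Bx' \Bx''} \), with \( \ket{\Bx'} \) the standard basis vectors. A small sketch of that check (my own, not part of the problem):

import numpy as np

rng = np.random.default_rng(4)
n = 5
M = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))

# columns of U form a complete orthonormal basis {|a'>} (eigenvectors of a Hermitian matrix)
_, U = np.linalg.eigh(M + M.conj().T)

# Psi_{a'}(x') = <x'|a'> = U[x', a'], so the sum over a' is just U U^dagger
overlap = U @ U.conj().T
print(np.allclose(overlap, np.eye(n)))  # True: the discrete delta (identity matrix)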

References

[1] Jun John Sakurai and Jim J. Napolitano. Modern Quantum Mechanics. Pearson Higher Ed, 2014.