
I'm having trouble understanding how to apply the LU procedure when pivoting is required. Proceeding with such a factorization doesn't produce a factorization that is the product of a lower triangular matrix and an upper triangular matrix. What we get instead looks like a permutation of a lower triangular matrix multiplied by an upper triangular matrix. As an example, consider the LU reduction of

\begin{equation}\label{eqn:luExample2:20}

M \Bx =

\begin{bmatrix}

0 & 0 & 1 \\

2 & 0 & 4 \\

1 & 1 & 1

\end{bmatrix}

\begin{bmatrix}

x_1 \\

x_2 \\

x_3 \\

\end{bmatrix}.

\end{equation}

We want \( r_2 \) as the pivot row, since it has the largest first column entry. Swapping the first two rows (this reorders the equations, leaving the unknowns untouched) gives

\begin{equation}\label{eqn:luExample2:40}

M \Bx \rightarrow M' \Bx

=

\begin{bmatrix}

2 & 0 & 4 \\

0 & 0 & 1 \\

1 & 1 & 1

\end{bmatrix}

\begin{bmatrix}

x_1 \\

x_2 \\

x_3 \\

\end{bmatrix}.

\end{equation}

We can express this permutation algebraically as a row permutation matrix operation

\begin{equation}\label{eqn:luExample2:60}

M' =

\begin{bmatrix}

0 & 1 & 0 \\

1 & 0 & 0 \\

0 & 0 & 1

\end{bmatrix}

M.

\end{equation}
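As a quick numerical sanity check (a numpy sketch; the variable names are mine, not part of the original), left-multiplying by this permutation matrix does perform the row swap:

```python
import numpy as np

# The original matrix from the example.
M = np.array([[0., 0., 1.],
              [2., 0., 4.],
              [1., 1., 1.]])

# Permutation matrix that swaps rows 1 and 2.
P12 = np.array([[0., 1., 0.],
                [1., 0., 0.],
                [0., 0., 1.]])

M_prime = P12 @ M  # rows of M in the order r2, r1, r3
print(M_prime)
```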

We can now proceed with the Gaussian reduction operations

\begin{equation}\label{eqn:luExample2:80}

\begin{aligned}

r_2 & \rightarrow r_2 - \frac{0}{2} r_1 \\

r_3 & \rightarrow r_3 - \frac{1}{2} r_1 \\

\end{aligned},

\end{equation}

which gives

\begin{equation}\label{eqn:luExample2:100}

M_1 =

\begin{bmatrix}

1 & 0 & 0 \\

0 & 1 & 0 \\

-1/2 & 0 & 1

\end{bmatrix}

M'

=

\begin{bmatrix}

1 & 0 & 0 \\

0 & 1 & 0 \\

-1/2 & 0 & 1

\end{bmatrix}

\begin{bmatrix}

2 & 0 & 4 \\

0 & 0 & 1 \\

1 & 1 & 1

\end{bmatrix}

=

\begin{bmatrix}

2 & 0 & 4 \\

0 & 0 & 1 \\

0 & 1 & -1

\end{bmatrix}.

\end{equation}

Application of one more permutation operation gives us the desired upper triangular matrix

\begin{equation}\label{eqn:luExample2:120}

U = M_1' =

\begin{bmatrix}

1 & 0 & 0 \\

0 & 0 & 1 \\

0 & 1 & 0 \\

\end{bmatrix}

\begin{bmatrix}

2 & 0 & 4 \\

0 & 0 & 1 \\

0 & 1 & -1

\end{bmatrix}

=

\begin{bmatrix}

2 & 0 & 4 \\

0 & 1 & -1 \\

0 & 0 & 1

\end{bmatrix}.

\end{equation}
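The whole chain so far (swap, eliminate, swap again) can be verified numerically; a numpy sketch, with variable names of my own choosing:

```python
import numpy as np

M = np.array([[0., 0., 1.], [2., 0., 4.], [1., 1., 1.]])
P12 = np.array([[0., 1., 0.], [1., 0., 0.], [0., 0., 1.]])   # first row swap
L1 = np.array([[1., 0., 0.], [0., 1., 0.], [-0.5, 0., 1.]])  # r3 -> r3 - (1/2) r1
P23 = np.array([[1., 0., 0.], [0., 0., 1.], [0., 1., 0.]])   # second row swap

U = P23 @ L1 @ P12 @ M
print(U)  # upper triangular
```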

Note that all of these are row operations: they reorder and combine the equations, but leave the unknowns alone, so \( U \) still acts on the original vector \( \Bx \).

We've constructed \( U \) by the following row operations

\begin{equation}\label{eqn:luExample2:160}

U =

\begin{bmatrix}

1 & 0 & 0 \\

0 & 0 & 1 \\

0 & 1 & 0 \\

\end{bmatrix}

\begin{bmatrix}

1 & 0 & 0 \\

0 & 1 & 0 \\

-1/2 & 0 & 1

\end{bmatrix}

\begin{bmatrix}

0 & 1 & 0 \\

1 & 0 & 0 \\

0 & 0 & 1 \\

\end{bmatrix}

M.

\end{equation}

Seeking \( L U = M \), we want

\begin{equation}\label{eqn:luExample2:180}

L

\begin{bmatrix}

1 & 0 & 0 \\

0 & 0 & 1 \\

0 & 1 & 0 \\

\end{bmatrix}

\begin{bmatrix}

1 & 0 & 0 \\

0 & 1 & 0 \\

-1/2 & 0 & 1

\end{bmatrix}

\begin{bmatrix}

0 & 1 & 0 \\

1 & 0 & 0 \\

0 & 0 & 1 \\

\end{bmatrix}

M

= M,

\end{equation}

or

\begin{equation}\label{eqn:luExample2:200}

L

=

\begin{bmatrix}

0 & 1 & 0 \\

1 & 0 & 0 \\

0 & 0 & 1 \\

\end{bmatrix}

\begin{bmatrix}

1 & 0 & 0 \\

0 & 1 & 0 \\

1/2 & 0 & 1

\end{bmatrix}

\begin{bmatrix}

1 & 0 & 0 \\

0 & 0 & 1 \\

0 & 1 & 0 \\

\end{bmatrix}

=

\begin{bmatrix}

0 & 1 & 0 \\

1 & 0 & 0 \\

1/2 & 0 & 1

\end{bmatrix}

\begin{bmatrix}

1 & 0 & 0 \\

0 & 0 & 1 \\

0 & 1 & 0 \\

\end{bmatrix}

=

\begin{bmatrix}

0 & 0 & 1 \\

1 & 0 & 0 \\

1/2 & 1 & 0

\end{bmatrix}.

\end{equation}
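Even though this \( L \) is not triangular, it does reproduce \( M \); a quick numpy check (names mine):

```python
import numpy as np

M = np.array([[0., 0., 1.], [2., 0., 4.], [1., 1., 1.]])
U = np.array([[2., 0., 4.], [0., 1., -1.], [0., 0., 1.]])

# The permuted "lower triangular" factor computed above.
L = np.array([[0., 0., 1.],
              [1., 0., 0.],
              [0.5, 1., 0.]])

print(np.allclose(L @ U, M))
```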

The \( L U \) factorization that we've attempted does not produce a lower triangular factor, but rather a permutation of a lower triangular factor.

When such pivoting is required, it isn't obvious, at least to me, how to apply the clever \( L U \) algorithm that we outlined in class. How can we pack the elimination operations into the lower triangle when we also have to apply permutation matrices to the results of the previous iteration?

It seems that while we cannot perform an \( L U \) decomposition of \( M \) itself, we can perform an \( L U \) factorization of \( P M \), where \( P \) is the permutation matrix for the \( 2,3,1 \) permutation that we applied to the rows during the Gaussian elimination operations.

Let's do that \( L U \) factorization to check

\begin{equation}\label{eqn:luExample2:220}

P M =

\begin{bmatrix}

0 & 1 & 0 \\

0 & 0 & 1 \\

1 & 0 & 0 \\

\end{bmatrix}

\begin{bmatrix}

0 & 0 & 1 \\

2 & 0 & 4 \\

1 & 1 & 1

\end{bmatrix}

=

\begin{bmatrix}

2 & 0 & 4 \\

1 & 1 & 1 \\

0 & 0 & 1 \\

\end{bmatrix}.

\end{equation}

We want to apply the elementary row operation

\begin{equation}\label{eqn:luExample2:240}

r_2 \rightarrow r_2 - \frac{1}{2} r_1,

\end{equation}

for

\begin{equation}\label{eqn:luExample2:260}

\lr{P M}_1 =

\begin{bmatrix}

2 & 0 & 4 \\

0 & 1 & -1 \\

0 & 0 & 1 \\

\end{bmatrix}.

\end{equation}

The \( L U \) factorization is therefore

\begin{equation}\label{eqn:luExample2:280}

P M =

L U =

\begin{bmatrix}

1 & 0 & 0 \\

1/2 & 1 & 0 \\

0 & 0 & 1 \\

\end{bmatrix}

\begin{bmatrix}

2 & 0 & 4 \\

0 & 1 & -1 \\

0 & 0 & 1 \\

\end{bmatrix}.

\end{equation}
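This agrees with what a library routine produces. As a sketch, scipy's `lu` returns the factorization in the form \( M = p l u \), where \( p \) is a permutation matrix; since scipy is free to make its own pivot choices, I only assert the defining properties rather than the exact entries:

```python
import numpy as np
from scipy.linalg import lu

M = np.array([[0., 0., 1.], [2., 0., 4.], [1., 1., 1.]])

p, l, u = lu(M)  # scipy's convention: M = p @ l @ u

print(l)  # unit lower triangular
print(u)  # upper triangular
```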

Observe that we can also write this as

\begin{equation}\label{eqn:luExample2:300}

M = \lr{ P^{-1} L } U.

\end{equation}

In our case the inverse permutation is a \( 3,1,2 \) permutation matrix

\begin{equation}\label{eqn:luExample2:320}

P^{-1} =

\begin{bmatrix}

0 & 0 & 1 \\

1 & 0 & 0 \\

0 & 1 & 0 \\

\end{bmatrix}.

\end{equation}

We see that \( P^{-1} L \) produces the not-lower-triangular matrix factor found earlier in \ref{eqn:luExample2:200}.
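A final numeric check (numpy, names mine) that \( P^{-1} L \) reproduces that factor:

```python
import numpy as np

# Inverse permutation (the 3,1,2 permutation matrix) and the
# lower triangular factor of P M found above.
P_inv = np.array([[0., 0., 1.], [1., 0., 0.], [0., 1., 0.]])
L = np.array([[1., 0., 0.], [0.5, 1., 0.], [0., 0., 1.]])

print(P_inv @ L)
```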