213 - Factoring Matrices

67 days ago by Professor213

# First, create a matrix A and then use the built-in function LU to factor, allowing for the swapping of rows.
A = matrix(QQ, [[3,-7,-2,2],[-3,5,1,0],[6,-4,0,-5],[-9,5,-5,12]])
show(A)
P, L, U = A.LU(pivot='partial')
show(P)
show(L)
show(U)
print('hi')
show(A)
show(P.transpose()*L*U)
print('hi')
show([P*L*U, A])
print(P*L*U==A)
# And check to see if the matrix is non-singular.
A.determinant()
       
hi
hi
True
-6
# Next, apply the LU factorization, but do not allow rows to be swapped.
P, L, U = A.LU(pivot='nonzero')
print('Permutation Matrix')
show(P)
print('L and U')
show(L)
show(U)
print('Did it work? LU=A is', L*U==A)
       
Permutation Matrix
L and U
('Did it work? LU=A is', True)
# Next, apply the LU factorization but to the transpose of the same matrix, and do not allow rows to be swapped.
At = A.transpose()
P0, L0, U0 = At.LU(pivot='nonzero')
print('Permutation Matrix')
show(P0)
print('L and U')
show(L0)
show(U0)
print('Did it work?', L0*U0==At)
       
Permutation Matrix
L and U
('Did it work?', True)
# Let's compare the two LU factorizations:
# one where you just factor A,
# and a second where you apply the LU factorization to A^t and then transpose back.
print('Two different LU factorizations for the same matrix A')
show([L, U])
U0t = U0.transpose()
L0t = L0.transpose()
show([U0t, L0t])
show(A)
show(L*U)
show(U0t*L0t)
       
Two different LU factorizations for the same matrix A

Doolittle's method returns a unit lower triangular matrix and an upper triangular matrix, while the Crout method returns a lower triangular matrix and a unit upper triangular matrix.

The book uses the Doolittle method for its examples. In class we are using the Crout method since it is simply the Gauss-Jordan process we all know and love by now. 
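
If you already have the Doolittle factors of A, you can get Crout's by pulling the diagonal of U over to L, since $LU = (LD)(D^{-1}U)$ where $D$ is the diagonal part of $U$. Here is a rough sketch of that conversion in Sage, reusing the matrix A from the first cell (the variable names are just for this sketch):

A = matrix(QQ, [[3,-7,-2,2],[-3,5,1,0],[6,-4,0,-5],[-9,5,-5,12]])
Pd, Ld, Ud = A.LU(pivot='nonzero')   # Doolittle: 1's on the diagonal of Ld
D = diagonal_matrix(Ud.diagonal())
Lc = Ld*D                            # Crout lower factor (general diagonal)
Uc = D.inverse()*Ud                  # Crout upper factor: 1's on its diagonal
print(Lc*Uc == A)                    # should print True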

Note that $A = LU$ implies that $A^t = U^tL^t = L_0U_0$, which is a new lower-upper factorization, but now with the ones on the upper-triangular factor's diagonal.

So, you have four possible factorizations of a matrix A.

  1. Doolittle applied to A leaving 1's on L's diagonal
  2. Crout applied to A leaving 1's on U's diagonal
  3. Doolittle applied to $A^t$ and, after transposing back, leaving 1's on the upper factor's diagonal
  4. Crout applied to $A^t$ and, after transposing back, leaving 1's on the lower factor's diagonal

In each case you get something that looks like $A = LU$, but the L's and U's can differ from one version to the next (see the sketch below).
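
To see the last two items in action, factor $A^t$ without row swaps and transpose back; the 1's should land on the upper factor for Doolittle and on the lower factor for Crout. A rough Sage sketch, again with the same A and with throwaway names L3, U3, L4, U4 for the factors:

A = matrix(QQ, [[3,-7,-2,2],[-3,5,1,0],[6,-4,0,-5],[-9,5,-5,12]])
P0, L0, U0 = A.transpose().LU(pivot='nonzero')   # Doolittle factors of A^t
# 3. transpose back: A = U0^t * L0^t, with the 1's on the upper factor
L3, U3 = U0.transpose(), L0.transpose()
# 4. rescale to the Crout factors of A^t first, then transpose back:
#    the 1's end up on the lower factor
D0 = diagonal_matrix(U0.diagonal())
L4, U4 = (D0.inverse()*U0).transpose(), (L0*D0).transpose()
print([L3*U3 == A, L4*U4 == A])                  # both should be True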

# Now, let's start over but apply the LU factorization to the matrix P'*A.
P, L, U = A.LU()   # to go back to the original A=PLU factorization
show(A)
show(P)
show(L)
show(U)
PtA = P.transpose()*A
print('So, here is a shuffling of the rows of A so that A=LU (with no P)')
show(PtA)
P, L, U = PtA.LU()
show(P)
show(L)
show(U)
       
So, here is a shuffling of the rows of A so that A=LU (with no P)

There are other factorizations for matrices.  The QR factorization writes $A = QR$, where R is upper triangular and Q is a special type of matrix called an "orthogonal" matrix.  Orthogonal matrices have the property that $Q^{-1} = Q^t$, which makes solving with them very easy (there is a small sketch of this after the next cell).  We will do orthogonal stuff in Chapter 6.

# And for later use (after doing the Gram-Schmidt orthogonalization) let's factor using the QR algorithm.
A = matrix(QQbar, [[-2, 0, -4, -1, -1], [-2, 1, -6, -3, -1], [1, 1, 7, 4, 5], [3, 0, 8, 3, 3], [-1, 1, -6, -6, 5]])
Q, R = A.QR()
show(Q)
show(R)
print('And check to see if the factorization worked...')
print(Q*R==A)
print('We can show that the inverse of Q and Q^t are the same thing')
show([Q.inverse(), Q.transpose()])
print('and since those numbers are terrible, we can just do a check: ', Q.inverse()==Q.transpose())
       
And check to see if the factorization worked...
We can show that the inverse of Q and Q^t are the same thing
('and since those number are terrible, we can just do a check: ', True)
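
Since $Q^{-1} = Q^t$, solving $Ax = b$ from $A = QR$ only takes a transpose and one triangular solve with R. Here is a quick sketch; the right-hand side b below is just something made up for the check:

A = matrix(QQbar, [[-2, 0, -4, -1, -1], [-2, 1, -6, -3, -1], [1, 1, 7, 4, 5],
                   [3, 0, 8, 3, 3], [-1, 1, -6, -6, 5]])
Q, R = A.QR()
b = vector(QQbar, [1, 2, 3, 4, 5])   # any right-hand side will do
x = R.solve_right(Q.transpose()*b)   # R*x = Q^t*b, a backward solve (may be slow over QQbar)
print(A*x == b)                      # should print True
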
# This cell plays around with A=PLU but using octave/matlab/scilab format.
# To evaluate, you must change the sage cell to an octave cell.
A = [2 -3 6 -1;2 5 0 3;4 1 -1 0;-2 0 -4 8]
[L U P] = lu(A);   # note: octave's convention is P*A = L*U, so A = P'*L*U
format short
# P'*L*U - A
# Solve Ax=b
x0 = [2;7;0;-4];
b = A*x0;
# Solve P'*z = b, i.e. z = P*b
z = P*b
# Solve Ly = z, using forward solving
y = inv(L)*z
# Finish by solving Ux = y, using backward solving
x = inv(U)*y
x - x0
       
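If you want to see the forward and backward solves spelled out instead of hidden behind inv(), here is a rough Sage version of the same computation, using the matrix A and the vector x0 from the octave cell above:

A = matrix(QQ, [[2,-3,6,-1],[2,5,0,3],[4,1,-1,0],[-2,0,-4,8]])
P, L, U = A.LU()                     # Sage's convention: A = P*L*U
x0 = vector(QQ, [2, 7, 0, -4])
b = A*x0
z = P.transpose()*b                  # undo the row swaps: solve P*z = b
n = A.nrows()
y = vector(QQ, n)
for i in range(n):                   # forward solve L*y = z
    y[i] = (z[i] - sum(L[i,j]*y[j] for j in range(i))) / L[i,i]
x = vector(QQ, n)
for i in reversed(range(n)):         # backward solve U*x = y
    x[i] = (y[i] - sum(U[i,j]*x[j] for j in range(i+1, n))) / U[i,i]
print(x == x0)                       # should print True
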
# And here is a copied function in Matlab format that does Doolittle but without any pivoting.
function [L, U] = lu_nopivot(A)
    n = size(A, 1);  % Obtain number of rows (should equal number of columns)
    L = eye(n);      % Start L off as identity and populate the lower triangular half slowly
    for k = 1 : n
        % For each row k, access columns from k+1 to the end and divide by
        % the diagonal coefficient at A(k, k)
        L(k + 1 : n, k) = A(k + 1 : n, k) / A(k, k);
        % For each row k+1 to the end, perform Gaussian elimination
        % In the end, A will contain U
        for l = k + 1 : n
            A(l, :) = A(l, :) - L(l, k) * A(k, :);
        end
    end
    U = A;
end