---
layout: post
title: "Decompositions and Solve"
date: 2000-11-26
topic: "Basic Usage"
section: "Documentation"
---
__tabsInit
# Decompositions and Solve

In matrix calculus, the decomposition of a matrix into a product of several matrices with special properties (e.g. into an orthogonal and a triangular matrix (QR), or into orthogonal matrices and a diagonal matrix of singular values (SVD)) is among the most powerful tools for devising numerical algorithms. For tensors of higher degree it is necessary to indicate along which modes the decomposition is supposed to happen, so `xerus` uses the notation of indexed equations explained in the previous chapter ([Indices and Equations](indices)).

## QR Decompositions

To provide an intuitive approach to decompositions, `xerus` denotes them by assigning to multiple tensors with a single operator. Here `(Q, R) = QR(A)` reads as "Q and R are defined as the QR decomposition of A", even though we naturally have to provide indices to make this line well defined:

__tabsStart
~~~ cpp
// perform QR decomposition of A and store result in Q and R
(Q(i,r), R(r,j)) = xerus.QR(A(i,j));
~~~
__tabsMid
~~~ py
# perform QR decomposition of A and store result in Q and R
(Q(i,r), R(r,j)) << xerus.QR(A(i,j))
~~~
__tabsEnd

In these decompositions we can distribute the modes of the original tensor as we please, but to be well defined, `Q` and `R` must share one and exactly one index.
__tabsStart
~~~ cpp
// well defined QR decomposition of a degree-4 tensor
(Q(i,r,k), R(l,r,j)) = xerus.QR(A(i,j,k,l));

// invalid:
(Q(i,r,s), R(r,s,j)) = xerus.QR(A(i,j));
~~~
__tabsMid
~~~ py
# well defined QR decomposition of a degree-4 tensor
(Q(i,r,k), R(l,r,j)) << xerus.QR(A(i,j,k,l))

# invalid:
(Q(i,r,s), R(r,s,j)) << xerus.QR(A(i,j))
~~~
__tabsEnd

For convenience, `xerus` defines four variants of the QR decomposition. Assuming the input is of size $m\times n$, with $min = \operatorname{min}(m,n)$ and $r$ the rank of the input, the resulting objects are as follows:
| Decomposition | Left-Hand-Side Property | Left-Hand-Side Dimensions | Right-Hand-Side Property | Right-Hand-Side Dimensions |
|---|---|---|---|---|
| `xerus.QR` | orthogonal | $m\times min$ | upper triangular | $min\times n$ |
| `xerus.RQ` | upper triangular | $m\times min$ | orthogonal | $min\times n$ |
| `xerus.QC` | orthogonal | $m\times r$ | upper or lower triangular | $r\times n$ |
| `xerus.CQ` | upper or lower triangular | $m\times r$ | orthogonal | $r\times n$ |
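The dimensions in the table match those of the standard reduced QR factorization. As a sketch of the linear algebra behind the `xerus.QR` row (this uses plain NumPy for illustration, not the xerus API itself):

```python
import numpy as np

m, n = 5, 3  # input dimensions, so min(m, n) = 3
A = np.random.rand(m, n)

# reduced QR: Q is m x min(m,n), R is min(m,n) x n,
# as in the xerus.QR row of the table above
Q, R = np.linalg.qr(A)

assert Q.shape == (m, min(m, n))
assert R.shape == (min(m, n), n)
# Q has orthonormal columns ...
assert np.allclose(Q.T @ Q, np.eye(min(m, n)))
# ... R is upper triangular ...
assert np.allclose(R, np.triu(R))
# ... and the product reconstructs A
assert np.allclose(Q @ R, A)
```

The rank-revealing `xerus.QC` variant differs only in that the shared dimension shrinks to the numerical rank $r$ instead of $\operatorname{min}(m,n)$.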
## Singular Value Decompositions

The singular value decomposition in `xerus` is invoked much like the `QR` decomposition:

__tabsStart
~~~ cpp
// calculate the SVD of A and store the resulting matrices in U, S and Vt
(U(i,r1), S(r1,r2), Vt(r2,j)) = xerus.SVD(A(i,j));
~~~
__tabsMid
~~~ py
# calculate the SVD of A and store the resulting matrices in U, S and Vt
(U(i,r1), S(r1,r2), Vt(r2,j)) << xerus.SVD(A(i,j))
~~~
__tabsEnd

In this form it is rank-revealing (so `S` is of dimensions $r\times r$ instead of $\operatorname{min}(m,n)\times\operatorname{min}(m,n)$) and exact, but it is possible to pass optional arguments to use it as a truncated SVD.

__tabsStart
~~~ cpp
// calculate the SVD, truncated to at most 5 singular values
size_t numSingularVectors = 5;
// or until a singular value is smaller than 0.01 times the maximal singular value
double epsilon = 0.01;
(U(i,r1), S(r1,r2), Vt(r2,j)) = xerus.SVD(A(i,j), numSingularVectors, epsilon);
~~~
__tabsMid
~~~ py
# calculate the SVD, truncated to at most 5 singular values
numSingularVectors = 5
# or until a singular value is smaller than 0.01 times the maximal singular value
epsilon = 0.01
(U(i,r1), S(r1,r2), Vt(r2,j)) << xerus.SVD(A(i,j), maxRank=numSingularVectors, eps=epsilon)
~~~
__tabsEnd

## Solving Linear Equations

A common application of matrix decompositions is solving matrix equations of the form $A\cdot x = b$ for $x$ via QR, LU, LDL$^T$ or Cholesky decompositions. In `xerus` this is again provided in indexed notation, via `operator/`.

__tabsStart
~~~ cpp
// solve A(i,j)*x(j) = b(i) for x
x(j) = b(i) / A(i,j);
~~~
__tabsMid
~~~ py
# solve A(i,j)*x(j) = b(i) for x
x(j) << b(i) / A(i,j)
~~~
__tabsEnd

Depending on the representation and some properties of `A`, `xerus` will automatically choose one of the above mentioned decompositions to solve the system.
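The effect of SVD truncation can be sketched in plain NumPy (an illustration of the mathematics, not the xerus call; the rank cap and relative threshold mirror the `maxRank` and `eps` arguments above):

```python
import numpy as np

A = np.random.rand(8, 6)

# full (thin) SVD: A = U @ diag(s) @ Vt
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# keep at most maxRank singular values, and drop any that fall below
# eps times the largest singular value
maxRank, eps = 5, 0.01
keep = min(maxRank, int(np.sum(s >= eps * s[0])))

U, s, Vt = U[:, :keep], s[:keep], Vt[:keep, :]

# the truncated factors give the best rank-`keep` approximation of A
# (in the Frobenius norm, by the Eckart-Young theorem)
A_approx = U @ np.diag(s) @ Vt
assert A_approx.shape == A.shape
```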
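The `operator/` notation above corresponds to solving $A\cdot x = b$; a minimal NumPy analogue of such a solve (again illustrating the linear algebra rather than the xerus API) is:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

# solve A @ x = b; for a square invertible A this is an exact solve
# (for rectangular A one would use np.linalg.lstsq instead)
x = np.linalg.solve(A, b)

assert np.allclose(A @ x, b)
print(x)  # → [2. 3.]
```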