\documentclass[12pt]{article}
\usepackage{pmmeta}
\pmcanonicalname{SingularValueDecomposition}
\pmcreated{2013-03-22 12:05:17}
\pmmodified{2013-03-22 12:05:17}
\pmowner{akrowne}{2}
\pmmodifier{akrowne}{2}
\pmtitle{singular value decomposition}
\pmrecord{9}{31171}
\pmprivacy{1}
\pmauthor{akrowne}{2}
\pmtype{Definition}
\pmcomment{trigger rebuild}
\pmclassification{msc}{15-00}
\pmclassification{msc}{65-00}
\pmsynonym{SVD}{SingularValueDecomposition}
\pmsynonym{singular value}{SingularValueDecomposition}
\pmsynonym{singular vector}{SingularValueDecomposition}
%\pmkeywords{matrix factorizations}
%\pmkeywords{numerical analysis}
%\pmkeywords{eigenvalues}
\pmrelated{Eigenvector}
\pmrelated{Eigenvalue}
\endmetadata
\usepackage{amssymb}
\usepackage{amsmath}
\usepackage{amsfonts}
\usepackage{graphicx}
%%%\usepackage{xypic}
\begin{document}
\PMlinkescapeword{singular}
Any real $m \times n$ matrix $A$ can be decomposed into
$$ A = USV^T $$
where $U$ is an $m \times m$ orthogonal matrix, $V$ is an $n \times n$ orthogonal matrix, and $S$ is a unique $m \times n$ diagonal matrix with real, non-negative elements $\sigma_i$, $i = 1, \ldots, \min(m,n)$, in descending order:
$$ \sigma_1 \ge \sigma_2 \ge \cdots \ge \sigma_{\min(m,n)} \ge 0. $$
The $\sigma_i$ are the \emph{singular values} of $A$ and the first $\min(m,n)$ columns of $U$ and $V$ are the left and right (respectively) \emph{singular vectors} of $A$. $S$ has the form:
$$ S = \begin{bmatrix}\Sigma \\ 0\end{bmatrix} \;\text{if}\; m \ge n \qquad\text{and}\qquad S = \begin{bmatrix}\Sigma & 0 \end{bmatrix} \;\text{if}\; m < n,$$
where $\Sigma$ is a diagonal matrix with diagonal elements $\sigma_1, \sigma_2, \ldots, \sigma_{\min(m,n)}$. Assume from now on that $m \ge n$. If $r = \operatorname{rank}(A) < n$, then
$$ \sigma_1 \ge \sigma_2 \ge \cdots \ge \sigma_r > \sigma_{r+1} = \cdots = \sigma_n = 0.$$
Conversely, if $\sigma_r \ne 0$ and $\sigma_{r+1} = \cdots = \sigma_n = 0$, then $r$ is the rank of $A$. In that case the trailing $n - r$ diagonal entries of $\Sigma$ vanish, so $S$ can be reduced to an $r \times r$ matrix, with the columns of $U$ and $V$ shrinking accordingly. The SVD can thus be used for rank determination.
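As a concrete check of the definitions above, here is a minimal NumPy sketch (the matrix $A$ is an arbitrary illustrative choice, not from the original text) that reconstructs $A$ from its SVD and counts the nonzero singular values to determine the rank:

```python
import numpy as np

# Rank-deficient 3 x 3 example: the third column is the sum of the first two.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0]])

U, s, Vt = np.linalg.svd(A)   # s holds sigma_1 >= sigma_2 >= ... >= 0

# Reconstruct A = U S V^T; here m = n, so S = diag(sigma_i) is square.
S = np.diag(s)
assert np.allclose(U @ S @ Vt, A)

# Rank determination: count singular values above a small tolerance
# (exact zeros are rarely produced in floating point).
tol = max(A.shape) * np.finfo(float).eps * s[0]
r = int(np.sum(s > tol))
assert r == 2 == np.linalg.matrix_rank(A)
```

Note that in floating-point arithmetic the "zero" singular values come out as tiny nonzero numbers, so rank determination in practice means thresholding them, as sketched above.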
The SVD provides a numerically robust solution to the least-squares problem $\min_x \|Ax - b\|_2$. When $A$ has full column rank, the matrix-algebraic phrasing of the least-squares solution $x$ is
$$ x = (A^T A)^{-1} A^T b. $$
Substituting the SVD $A = USV^T$ and simplifying, we have
$$ x = V \begin{bmatrix} \Sigma^{-1} & 0 \end{bmatrix} U^T b .$$
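A small NumPy sketch of this formula, using an arbitrary random overdetermined system (an assumption for illustration), checking that the SVD route agrees with the normal-equations form above:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 3))   # m = 6 >= n = 3, full column rank
b = rng.standard_normal(6)

# Thin SVD: U is 6 x 3, so V [Sigma^{-1} 0] U^T b reduces to V Sigma^{-1} U^T b.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
x = Vt.T @ ((U.T @ b) / s)

# Agrees with the normal-equations solution x = (A^T A)^{-1} A^T b.
x_ne = np.linalg.solve(A.T @ A, A.T @ b)
assert np.allclose(x, x_ne)
```

The SVD route avoids forming $A^T A$, whose condition number is the square of that of $A$, which is why it is the numerically preferred approach.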
{\bf References}
\begin{itemize}
\item Originally from The Data Analysis Briefbook (\PMlinkexternal{http://rkb.home.cern.ch/rkb/titleA.html}{http://rkb.home.cern.ch/rkb/titleA.html})
\end{itemize}
%%%%%
%%%%%
%%%%%
\end{document}