1 Today I read the avstar paper.. couldn't find any method that uses foot constraints.. so frustrating..
2 Morning class: Matrix Analysis and Applications.... a pretty ridiculous lecture.. I tidied up the lecture notes.. will post them all together when I have time..
3 In the evening I listened to a Stanford course on Convex Optimization.. yeah, got quite a lot out of it...
4 In the afternoon I went to apply for a Peony Card..
5 Took a look at PCA.. didn't really figure it out... will continue tomorrow..
Tomorrow:
1 Paper
2 PCA
3 Tidy up the course notes.. can't let all that reading go to waste..
The notes:
\documentclass[11pt]{article}
\usepackage{ctex}
\usepackage{graphicx}
\usepackage{float}
\DeclareGraphicsRule{.jpg}{eps}{.bb}{}
\DeclareGraphicsRule{.png}{eps}{.bb}{}
\begin{document}
\title{Matrix Analysis and Applications}
\author{Sosi}
\maketitle
\section{Jacobian and Hessian Matrices}
\begin{enumerate}
\item Jacobian Matrix\\
The Jacobian matrix is the matrix of all first-order partial derivatives
of a vector- or scalar-valued function with respect to another vector. Suppose
$F:R^n \rightarrow R^m$ is a function from Euclidean $n$-space to Euclidean $m$-space. Such a function is given by $m$ real-valued component functions, $y_1(x_1,\ldots,x_n),\ldots,y_m(x_1,\ldots,x_n)$. Recall the array notation for a general matrix:
\begin{displaymath}
\mathbf{X} =
\left( \begin{array}{ccc}
x_{11} & x_{12} & \ldots \\
x_{21} & x_{22} & \ldots \\
\vdots & \vdots & \ddots
\end{array} \right)
\end{displaymath}
The partial derivatives of all these functions (if they exist) can be organized in an $m \times n$ matrix, the Jacobian matrix $J$ of $F$, as follows:
\begin{displaymath}
\mathbf{J} =
\left( \begin{array}{ccc}
\frac{ \partial y_1}{ \partial x_1}
& \ldots &
\frac{{\partial}y_1}{{\partial} x_n} \\ % ordinary derivatives use \prime; partial derivatives use \partial
\vdots & \ddots &\vdots\\
\frac{ {\partial} y_m}{ {\partial} x_1}
& \ldots &
\frac{ {\partial} y_m}{ {\partial} x_n}
\end{array}
\right)
\end{displaymath}
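A quick worked example (my own, not from the lecture): take $F:R^2 \rightarrow R^2$ with $y_1=x_1^2x_2$ and $y_2=x_1+\sin x_2$; then
\begin{displaymath}
\mathbf{J} =
\left( \begin{array}{cc}
2x_1x_2 & x_1^2 \\
1 & \cos x_2
\end{array} \right)
\end{displaymath}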
\item Hessian Matrix\\
In mathematics, the Hessian matrix is the square matrix of second-order
partial derivatives of a function; that is, it describes the local curvature of a function of many variables. The Hessian matrix was developed in the 19th century by the German mathematician Ludwig Otto Hesse and later named after him.
\begin{displaymath}
\mathbf{H(f)} =
\left( \begin{array}{cccc}
\frac{ \partial ^2 f}{ \partial x_1^2}
&
\frac{ \partial ^2 f}{ \partial x_1 \partial x_2}
& \ldots &
\frac{{\partial}^2 f}{{\partial}x_1 {\partial}x_n} \\
\frac{ \partial ^2 f}{ \partial x_2{\partial}x_1}
&
\frac{ \partial ^2 f}{ \partial x_2^2 }
& \ldots &
\frac{{\partial}^2 f}{{\partial}x_2 {\partial}x_n} \\
\vdots & \vdots & \ddots & \vdots \\
\frac{ \partial ^2 f}{ \partial x_n{\partial}x_1}
&
\frac{ \partial ^2 f}{ \partial x_n \partial x_2}
& \ldots &
\frac{{\partial}^2 f}{{\partial}x_n^2}
\end{array}
\right)
\end{displaymath}
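A small example I worked out to check the definition: for $f(x_1,x_2)=x_1^3+x_1x_2^2$, the first partials are $3x_1^2+x_2^2$ and $2x_1x_2$, so
\begin{displaymath}
\mathbf{H(f)} =
\left( \begin{array}{cc}
6x_1 & 2x_2 \\
2x_2 & 2x_1
\end{array} \right)
\end{displaymath}
Note that $H(f)$ is symmetric, as expected, because the mixed second partials are equal.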
Hessian matrices are used in large-scale optimization problems within Newton-type methods, because the Hessian is the coefficient matrix of the quadratic term in a local Taylor expansion of the function. That is,
\begin{displaymath}
y=f(x+\Delta x)\approx f(x)+J(x)\Delta x +\frac{1}{2}\Delta x^TH(x)\Delta x
\end{displaymath}
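A step I filled in myself: minimizing this quadratic model over $\Delta x$ (setting its gradient to zero, and assuming $H(x)$ is invertible) gives the Newton step
\begin{displaymath}
H(x)\Delta x = -\nabla f(x), \qquad \Delta x = -H(x)^{-1}\nabla f(x).
\end{displaymath}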
\end{enumerate}
\section{Properties of Symmetric Matrices}
One way a symmetric matrix arises: if $A$ is symmetric, then
\begin{displaymath}
\nabla f(x) = Ax, \qquad \textrm{where } f(x)=\frac{x^TAx}{2}
\end{displaymath}
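A check I added myself (not from the lecture): for a general square $A$, $\nabla\left(\frac{x^TAx}{2}\right)=\frac{(A+A^T)x}{2}$, which equals $Ax$ exactly when $A=A^T$. For example, with
\begin{displaymath}
A=\left( \begin{array}{cc}
2 & 1 \\
1 & 3
\end{array} \right), \qquad
f(x)=\frac{x^TAx}{2}=x_1^2+x_1x_2+\frac{3}{2}x_2^2,
\end{displaymath}
\begin{displaymath}
\nabla f(x)=\left( \begin{array}{c}
2x_1+x_2 \\
x_1+3x_2
\end{array} \right)=Ax.
\end{displaymath}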
\section{Proof}
Claim: any two maximal linearly independent sets in a finite-dimensional linear space contain the same number of vectors.\\
Proof: let $(x_1,...,x_n)$ and $(y_1,...,y_m)$ be two maximal linearly independent sets. Each can be expressed in terms of the other:\\
$(x_1,...,x_n)=(y_1,...,y_m)A$\\
$(y_1,...,y_m)=(x_1,...,x_n)B$\\
Substituting the first into the second gives\\
$(y_1,...,y_m)=(y_1,...,y_m)AB$\\
and since the $y_i$ are linearly independent, the coefficients are unique, so $AB=E_m$; symmetrically, $BA=E_n$. Since\\
$\mathrm{trace}(AB)=\mathrm{trace}(BA)$\\
it follows that $m=n$.\\
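The trace identity used in the last step is a standard fact (a step I filled in, not shown in class): for $A$ of size $m \times n$ and $B$ of size $n \times m$,
\begin{displaymath}
\mathrm{trace}(AB)=\sum_{i=1}^{m}\sum_{j=1}^{n}a_{ij}b_{ji}=\mathrm{trace}(BA),
\end{displaymath}
so $m=\mathrm{trace}(E_m)=\mathrm{trace}(AB)=\mathrm{trace}(BA)=\mathrm{trace}(E_n)=n$.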
\section{Properties}
A transition matrix (change-of-basis matrix) is nonsingular.\\
(Reasoning I filled in: if $(y_1,...,y_n)=(x_1,...,x_n)P$ and $Pc=0$ for some $c\neq 0$, then $(y_1,...,y_n)c=0$, so the $y_i$ would be linearly dependent and not a basis.)\\
Definition: a matrix $A$ is singular if there exists a nonzero $n$-dimensional vector $x$ such that $Ax=0$.
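A small example I added: the matrix
\begin{displaymath}
A=\left( \begin{array}{cc}
1 & 2 \\
2 & 4
\end{array} \right)
\end{displaymath}
is singular, since the nonzero vector $x=(2,-1)^T$ satisfies $Ax=0$.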
\end{document}