\documentclass[10pt]{article}
\usepackage{fullpage}
\usepackage{setspace}
\usepackage{parskip}
\usepackage{titlesec}
\usepackage[section]{placeins}
\usepackage{xcolor}
\usepackage{breakcites}
\usepackage{lineno}
\usepackage{hyphenat}
\PassOptionsToPackage{hyphens}{url}
\usepackage[colorlinks = true,
linkcolor = blue,
urlcolor = blue,
citecolor = blue,
anchorcolor = blue]{hyperref}
\usepackage{etoolbox}
\makeatletter
\patchcmd\@combinedblfloats{\box\@outputbox}{\unvbox\@outputbox}{}{%
\errmessage{\noexpand\@combinedblfloats could not be patched}%
}%
\makeatother
\usepackage[round]{natbib}
\let\cite\citep
\renewenvironment{abstract}
{{\bfseries\noindent{\abstractname}\par\nobreak}\footnotesize}
{\bigskip}
\titlespacing{\section}{0pt}{*3}{*1}
\titlespacing{\subsection}{0pt}{*2}{*0.5}
\titlespacing{\subsubsection}{0pt}{*1.5}{0pt}
\usepackage{authblk}
\usepackage{graphicx}
\usepackage[space]{grffile}
\usepackage{latexsym}
\usepackage{textcomp}
\usepackage{longtable}
\usepackage{tabulary}
\usepackage{booktabs,array,multirow}
\usepackage{amsfonts,amsmath,amssymb}
\providecommand\citet{\cite}
\providecommand\citep{\cite}
\providecommand\citealt{\cite}
% You can conditionalize code for latexml or normal latex using this.
\newif\iflatexml\latexmlfalse
\providecommand{\tightlist}{\setlength{\itemsep}{0pt}\setlength{\parskip}{0pt}}%
\AtBeginDocument{\DeclareGraphicsExtensions{.pdf,.PDF,.eps,.EPS,.png,.PNG,.tif,.TIF,.jpg,.JPG,.jpeg,.JPEG}}
\usepackage[utf8]{inputenc}
\usepackage[ngerman,greek,english]{babel}
\begin{document}
\title{PRINCIPAL MANIFOLDS AND NONLINEAR DIMENSION REDUCTION VIA LOCAL
TANGENT SPACE ALIGNMENT}
\author[1]{FAROUK NEHAL}%
\affil[1]{Université de Versailles Saint-Quentin}%
\vspace{-1em}
\date{\today}
\begingroup
\let\center\flushleft
\let\endcenter\endflushleft
\maketitle
\endgroup
\sloppy
\section*{\texorpdfstring{\textbf{1 - Introduction}}{1 - Introduction}}
{\label{391140}}
This article proposes a new algorithm, local tangent space alignment
(LTSA), for nonlinear manifold learning and nonlinear dimension
reduction. Its title is ``\textbf{PRINCIPAL MANIFOLDS AND NONLINEAR
DIMENSION REDUCTION VIA LOCAL TANGENT SPACE ALIGNMENT}''. It was
published in 2002 by the \emph{SIAM Journal on Scientific Computing},
which contains research articles on numerical methods and techniques for
scientific computation: papers address computational issues relevant to
the solution of scientific or engineering problems and include
computational results demonstrating the effectiveness of the proposed
techniques. It was written by Zhenyue Zhang and Hongyuan Zha.
\par\null
\textbf{Hongyuan Zha} was a faculty member of the Department of Computer
Science and Engineering at Pennsylvania State University from 1992 to
2006, and worked at Inktomi Corporation from 1999 to 2001. Zha's
current research interests include computational mathematics and machine
learning applications.
\par\null
\textbf{Zhenyue Zhang} is a mathematical researcher in the Department of
Mathematics at Zhejiang University, China.
\par\null
In this article the authors present manifold learning and dimension
reduction: they discuss the issue of learning local geometry using
tangent spaces, then show how to align those local tangent spaces in
order to learn the global coordinate system of the underlying manifold,
and finally they build the local tangent space alignment (LTSA)
algorithm on this construction and compare it with locally linear
embedding (LLE). To summarize the article, I start by exposing the
scientific context; secondly, I highlight the most important existing
works and the position of the article; thirdly, its contribution to the
scientific world; and finally I end my summary by showing the results
and experiments of the method.
\section*{2 - Context of the work}
{\label{943824}}
In recent years, a variety of nonlinear dimensionality reduction
techniques have been proposed that aim to overcome the limitations of
traditional techniques such as PCA and other linear dimension reduction
methods. Manifold learning is the approach that attempts to reduce the
dimension of a dataset to ease its representation and interpretation
(the approach is also called representation learning). The main issue
with high-dimensional datasets is that the higher the dimension, the
more difficult it becomes to sample the space. This causes many
problems: algorithms that operate on high-dimensional data tend to have
very high time complexity, and many machine learning algorithms struggle
with high-dimensional data. In this article the authors speak about
principal manifolds and nonlinear dimension reduction via local tangent
space alignment, so what is local tangent space alignment, and why do we
use it?
LTSA is a technique that describes local properties of the
high-dimensional data using the local tangent space of each data point.
LTSA is based on the observation that, if local linearity of the
manifold is assumed, there exists a linear mapping from a
high-dimensional data point to its local tangent space, and a linear
mapping from the corresponding low-dimensional data point to the same
local tangent space. LTSA attempts to align these linear mappings in
such a way that they construct the local tangent space of the manifold
from the low-dimensional representation. In other words, LTSA
simultaneously searches for the coordinates of the low-dimensional data
representation and for the linear mappings of the low-dimensional data
points to the local tangent space of the high-dimensional
data.{\ref{391140}}
All manifold learning algorithms can be applied to data dimensionality
reduction, producing a low-dimensional encoding from a high-dimensional
one; LTSA is among these algorithms. Its principle is to use the tangent
space in the neighborhood of a data point to represent the local
geometry, and then align those local tangent spaces to construct the
global coordinate system for the nonlinear manifold.
\section*{3 - Positioning}\label{positioning}
\par\null
The purpose of local tangent space alignment, like that of other
dimensionality reduction algorithms such as locally linear embedding
(LLE), principal component analysis (PCA), and Isomap, is to find
correlations and connections between different samples of an observed
object and to plot them into a drawing that can be visualized. It is
also able to map the source data into a single global coordinate system,
optimizing it without getting trapped in local minima and learning from
nonlinear manifolds, as shown in the source article.
Several different algorithms have been developed for dimensionality
reduction, each producing a compact low-dimensional encoding from
high-dimensional data; while all of these methods have a similar goal,
their approaches to the problem differ. There are linear and nonlinear
forms of dimensionality reduction, and several methods are in use, for
example:
\textbf{a - Principal component analysis (PCA)}
Is a very popular technique for dimensionality reduction. PCA aims to
find a linear subspace of dimension $d$ lower than $n$ such that the
data points lie mainly on this linear subspace.\selectlanguage{english}
\begin{figure}[h!]
\begin{center}
\includegraphics[width=1.00\columnwidth]{figures/image1/1}
\caption{{Example of PCA.
{\label{div-393928}}%
}}
\end{center}
\end{figure}
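To make the idea concrete, here is a minimal sketch of PCA with scikit-learn; the toy data, variable names, and parameter values are my own illustration, not taken from the article:

```python
import numpy as np
from sklearn.decomposition import PCA

# Toy data: 200 points lying near a d=2 plane embedded in n=5 dimensions.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2)) @ rng.normal(size=(2, 5))
X += 0.01 * rng.normal(size=X.shape)  # small off-plane noise

# Find the 2-D linear subspace that captures the most variance.
pca = PCA(n_components=2)
Y = pca.fit_transform(X)

print(Y.shape)  # (200, 2)
# For near-planar data the two components explain almost all the variance.
print(pca.explained_variance_ratio_.sum())
```

Because the data are generated near a plane, the first two principal components recover essentially all of the variance.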
\textbf{b - Locally linear embedding:} an approach that addresses the
problem of nonlinear dimensionality reduction by computing
low-dimensional, neighborhood-preserving embeddings of high-dimensional
data. Its aim is to find a mapping that preserves the local linear
relationships between neighbors, and it preserves the local regression
weights that reconstruct the original data in the new
data.{\ref{contribution}}
Minimizing the cost function, taking into account that each sample is
reconstructed from its neighbors, we obtain the weights for each case as
a matrix, knowing that the sum of all weights in the same row must equal
1. In the final step we can compute the cost function, based on the
locally linear reconstruction errors, using the low-dimensional vectors
obtained from the corresponding samples, which represent the global
coordinates of the manifold.\selectlanguage{english}
\begin{figure}[h!]
\begin{center}
\includegraphics[width=1.00\columnwidth]{figures/image2/2}
\caption{{Example of LLE.
{\label{div-121230}}%
}}
\end{center}
\end{figure}
1 - Compute the neighbors of each data point.
2 - Compute the weights $W_{ij}$ that best reconstruct each data point
$X_i$ from its neighbors.
3 - Compute the vectors $Y_i$ best reconstructed by the weights
$W_{ij}$. {[}4{]}
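The three steps above can be sketched with scikit-learn's LLE implementation (a hedged illustration: the swiss-roll data set and the parameter values are my own choices, not the article's):

```python
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import LocallyLinearEmbedding

# A classic nonlinear manifold: a 2-D sheet rolled up in 3-D.
X, _ = make_swiss_roll(n_samples=1000, random_state=0)

# Steps 1-3 (neighbor search, reconstruction weights W with rows summing
# to 1, and the embedding Y minimizing the reconstruction cost) all
# happen inside fit_transform.
lle = LocallyLinearEmbedding(n_neighbors=10, n_components=2)
Y = lle.fit_transform(X)
print(Y.shape)  # (1000, 2)
```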
And we have other dimension reduction techniques such as:
\begin{itemize}
\tightlist
\item
\textbf{Isomap}
\item
\textbf{Hessian eigenmapping}
\item
\textbf{Multi-dimensional Scaling}
\end{itemize}
Now I will speak about the use of the local tangent space for nonlinear
dimension reduction. So what is a local tangent space?
\selectlanguage{english}
\begin{figure}[h!]
\begin{center}
\includegraphics[width=0.70\columnwidth]{figures/3/3}
\caption{{A pictorial representation of the tangent space of a single
point $x$ on a sphere.
{\label{143254}}%
}}
\end{center}
\end{figure}
The local tangent space provides a low-dimensional linear approximation
of the local geometric structure of the nonlinear manifold. Those local
tangent coordinates are then aligned in the low-dimensional space by
different local affine transformations to obtain a global coordinate
system.{\ref{143254}}
\par\null
\textbf{c - LTSA: LOCAL TANGENT SPACE ALIGNMENT}
LTSA is a technique that describes local properties of the
high-dimensional data using the local tangent space of each data point.
It is algorithmically similar to LLE and can be put in the same
category. One advantage of LTSA over LLE is that with LTSA we can
potentially detect the intrinsic dimension of the underlying manifold by
analyzing the local tangent space structure.{\ref{391140}}
The LTSA algorithm comprises three stages:
\begin{enumerate}
\tightlist
\item
\textbf{Nearest-neighbor search.} Determine the $k$ nearest neighbors
$x_{ij}$ of $x_i$, $j = 1, \ldots, k$.
\item
\textbf{Weight matrix construction} (constructing the alignment matrix).
Form the matrix $B$ by locally summing (7.18) in the article if a direct
eigensolver will be used. Otherwise, implement a routine that computes
the matrix--vector multiplication $Bu$ for an arbitrary vector $u$.
\item
\textbf{Aligning global coordinates.} Compute the $d + 1$ smallest
eigenvectors of $B$, pick the eigenvector matrix $[u_2, \ldots, u_{d+1}]$
corresponding to the 2nd to $(d+1)$st smallest eigenvalues, and set
$T = [u_2, \ldots, u_{d+1}]^T$.
\end{enumerate}
To use LTSA in Matlab we call the function
\texttt{mappedX = ltsa(X, no\_dims, k, eig\_impl)}.
The function runs the local tangent space alignment algorithm on
dataset X, reducing the data to dimensionality \texttt{no\_dims}. The
number of neighbors is specified by k.
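For readers without Matlab, an equivalent call can be sketched in Python: scikit-learn exposes LTSA as a variant of \texttt{LocallyLinearEmbedding}. The wrapper name \texttt{ltsa} below merely mirrors the Matlab signature and is my own:

```python
from sklearn.datasets import make_s_curve
from sklearn.manifold import LocallyLinearEmbedding


def ltsa(X, no_dims, k):
    """Run LTSA on dataset X, reducing it to no_dims dimensions
    using k nearest neighbors (mirrors the Matlab call above)."""
    model = LocallyLinearEmbedding(n_neighbors=k, n_components=no_dims,
                                   method="ltsa")
    return model.fit_transform(X)


# Small demonstration on the S-curve data set.
X, _ = make_s_curve(n_samples=500, random_state=0)
mappedX = ltsa(X, no_dims=2, k=10)
print(mappedX.shape)  # (500, 2)
```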
\par\null\par\null
\section*{4 - Contribution}\label{contribution}
In this paper the authors propose an algorithm to solve this problem via
local tangent space alignment (LTSA) and use it for nonlinear dimension
reduction. Compared with other dimension reduction methods, the goal is
to reduce the computational complexity of the algorithm and to improve
its performance and precision.
\section*{5 - Experiments}\label{expuxe9riences}
In this part I will present some experiments with LTSA.
1/- In the first experiment, the generating curve is
$f(\tau) = 3\tau^3 + 2\tau^2 - 2\tau$, $\tau \in [-1.1, 1]$, and the
data set is generated by adding noise in a relative fashion:
$x_i = f(\tau_i^*)\,(2 + \eta\, e_i)$, with normally distributed $e_i$.
We increase the noise level from $\eta = 0.01$ to $0.03$ and $0.05$ for
the three columns a, b, and c respectively, using the same number of
neighbors $k = 10$; for column d, $\eta = 0.05$ and $k = 20$.
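The data generation above can be sketched as follows; this is my own reading of the formula as printed in the review, and the exact noise model in the original paper may differ:

```python
import numpy as np

rng = np.random.default_rng(0)
n, eta = 500, 0.01                  # noise level eta as in column (a)
tau = rng.uniform(-1.1, 1.0, n)     # parameters tau_i* in [-1.1, 1]
f = 3 * tau**3 + 2 * tau**2 - 2 * tau  # f(tau)

e = rng.normal(size=n)              # normally distributed e_i
x = f * (2 + eta * e)               # relative noise, as printed above

print(x.shape)  # (500,)
```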
\par\null\selectlanguage{english}
\begin{figure}[h!]
\begin{center}
\includegraphics[width=1.00\columnwidth]{figures/4/7}
\caption{{Manifolds with different noise levels (top) and computed
coordinates $\tau_i$ vs.\ exact $\tau_i^*$ (bottom).
{\label{624404}}%
}}
\end{center}
\end{figure}
We observe that, with increasing noise level $\eta$, the computed
$\tau_i$'s get compressed at points with relatively large noise.
Comparing columns c and d, which have the same noise level $\eta$ but
different numbers of neighbors $k$, we observe that the quality of the
computed $\tau_i$'s improves with the higher number of neighbors. In
general, $k$ should be chosen to match the sampling density, the noise
level, and the curvature at each data point so as to extract an accurate
local tangent space. It is therefore worth considering a variable number
of neighbors chosen adaptively at each data point.
\par\null
\textbf{Second experiment:}
In this test the authors compare LTSA and LLE by applying both
algorithms to the S-curve data set (2000 data points in total, uniformly
sampled without noise) with different numbers of neighbors. For
$d = 2$ and $k$ chosen from $k = 6$ to $k = 30$, LTSA always produces
coordinates $T$ that have a geometric structure similar to the
generating coordinates.
They show the following figures:
\par\null\selectlanguage{english}
\begin{figure}[h!]
\begin{center}
\includegraphics[width=1.00\columnwidth]{figures/5/7}
\caption{{Computed 2D coordinates of the S-curve by LLE with various neighborhood
size k.
{\label{481013}}%
}}
\end{center}
\end{figure}\selectlanguage{english}
\begin{figure}[h!]
\begin{center}
\includegraphics[width=0.70\columnwidth]{figures/8/8}
\caption{{Computed coordinates of the S-curve by LTSA with various neighborhood
size k
{\label{807215}}%
}}
\end{center}
\end{figure}
We observe that deformations in the generated coordinates are quite
prominent when we vary the neighborhood size $k$ in \textbf{fig.~5},
the coordinates computed by LLE, while the coordinates $T$ computed by
LTSA in \textbf{fig.~6} keep almost the same geometric structure as the
generating coordinates as $k$ varies. As we can observe, this is an
advantage of LTSA over LLE: the LTSA algorithm seems to be less
sensitive to the choice of $k$ than LLE.
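This sensitivity comparison can be reproduced in sketch form with scikit-learn; the neighborhood sizes chosen below and the use of scikit-learn (rather than the authors' own code) are my own assumptions:

```python
from sklearn.datasets import make_s_curve
from sklearn.manifold import LocallyLinearEmbedding

# 2000 points on the S-curve, uniformly sampled without noise, as in the test.
X, _ = make_s_curve(n_samples=2000, random_state=0)

results = {}
for k in (6, 30):                        # two neighborhood sizes from the tested range
    for method in ("standard", "ltsa"):  # "standard" = LLE
        emb = LocallyLinearEmbedding(n_neighbors=k, n_components=2,
                                     method=method)
        results[(method, k)] = emb.fit_transform(X)

# Each embedding is a 2000 x 2 coordinate matrix T; plotting them side by
# side shows that LTSA's shape varies less with k than LLE's.
print(len(results))  # 4
```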
\section*{6 - Conclusion}
In this analysis of the article, we tried to analyze the LTSA algorithm
and to compare it with existing solutions such as PCA and LLE, observing
the techniques that the authors used, namely the construction of local
tangent spaces to represent local geometry and the global alignment of
those local tangent spaces. I also tried to explain how nonlinear
dimension reduction via local tangent space alignment (LTSA) is used.
\par\null
\section*{References}
\begin{enumerate}
\tightlist
\item
\url{https://lvdmaaten.github.io/publications/papers/TR_Dimensionality_Reduction_Review_2009.pdf}
\item
\url{https://www.stat.washington.edu/wxs/Learning-papers/principal-curves.pdf}
\item
\url{http://scikit-learn.org/stable/modules/manifold.html#local-tangent-space-alignment}
\item
\url{http://ieeexplore.ieee.org/document/1039206/}
\item
\url{http://www.robots.ox.ac.uk/~az/lectures/ml/lle.pdf}
\end{enumerate}
\selectlanguage{english}
\FloatBarrier
\end{document}