Reproducing Kernel Hilbert Space

Dimension Reduction

Spectral decomposition

The Gaussian kernel matrix can be factorized as \((\Phi \textbf{X})^\textbf{H} \Phi \textbf{X} = \textbf{X}^\textbf{H} \Phi^\textbf{H} \Phi \textbf{X} = \textbf{X}^\textbf{H}\textbf{X}\), where \(\Phi\) is the Gaussian kernel basis matrix and \(\textbf{X}\) is the coefficient matrix of the reproducing kernel Hilbert space \(K(\cdot,x) \in \mathcal{H}_K\); the last equality holds because the basis is orthonormal, so \(\Phi^\textbf{H} \Phi = \textbf{I}\) (see https://www.jkangpathology.com/post/reproducing-kernel-hilbert-space/). A matrix is a system: it takes input and gives output. A matrix is a linear system, and differentiation and integration are linear systems too. The Fourier transform matches the input basis to the operator (differentiation) basis.
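A minimal numpy sketch of this spectral picture, assuming a toy Gaussian (RBF) kernel on random data (the data, `gamma`, and the rank `k` are my illustrative choices, not from the linked post): the symmetric kernel matrix is eigendecomposed, refactored into a \(\Phi^\textbf{H}\Phi\)-style product, and truncated to its top eigenpairs.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))   # 50 points in R^3 (illustrative data)
gamma = 0.5                        # kernel width (illustrative choice)

# Gaussian (RBF) kernel matrix: K[i, j] = exp(-gamma * ||x_i - x_j||^2)
sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
K = np.exp(-gamma * sq_dists)

# Spectral decomposition: K = V diag(w) V^T with orthonormal eigenvectors V
# (eigenvalues returned in ascending order).
w, V = np.linalg.eigh(K)

# K is symmetric positive semi-definite, so eigenvalues are >= 0
# up to floating-point error.
assert np.all(w > -1e-10)

# Refactor K = Phi_X^T Phi_X with Phi_X = diag(sqrt(w)) V^T, mirroring
# the (Phi X)^H (Phi X) factorization in the text.
Phi_X = np.sqrt(np.clip(w, 0, None))[:, None] * V.T
assert np.allclose(Phi_X.T @ Phi_X, K)

# Dimension reduction: keep only the top-k eigenpairs (the kernel-PCA idea),
# which gives the best rank-k approximation of K.
k = 5
K_k = V[:, -k:] @ np.diag(w[-k:]) @ V[:, -k:].T
print("rank-5 approximation error:", np.linalg.norm(K - K_k))
```

Truncating to the top-\(k\) eigenpairs is the dimension-reduction step suggested by the section heading: the kernel matrix is replaced by its best rank-\(k\) approximation.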

Reproducing Kernel Hilbert Space

Finally, we arrive at the reproducing kernel Hilbert space. The post at https://nzer0.github.io/reproducing-kernel-hilbert-space.html introduces RKHS in Korean, and it was helpful. I had struggled to understand some concepts in RKHS. What does the Hilbert space mean in terms of feature expansion (\(f:\mathcal{X} \to \mathbb{R}\), \(f \in \mathcal{H}_K\))? The difference between \(f\) and \(f(x)\) was confusing: \(f\) is the function as an element of the Hilbert space, while \(f(x)\) is its evaluation at a point, recovered through the reproducing property \(f(x) = \langle f, K(\cdot,x)\rangle_{\mathcal{H}_K}\). I came to think of the function as the inner product of the feature-space basis \(K(\cdot,x)\) with the coefficients \(f\), where the coefficients are vectors in the feature space.
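To make the \(f\) versus \(f(x)\) distinction concrete, here is a small numpy sketch under the standard construction of \(\mathcal{H}_K\) from finite combinations of kernel sections; the names (`kern`, `centers`, `alpha`) are my illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)

def kern(x, y, gamma=0.5):
    """Gaussian kernel K(x, y) = exp(-gamma * ||x - y||^2)."""
    return np.exp(-gamma * np.sum((np.asarray(x) - np.asarray(y)) ** 2))

# An element f of H_K is a finite combination of kernel sections:
#     f = sum_i alpha_i * K(., x_i)
# "f" is this whole object (coefficients plus centers), not a number.
centers = rng.standard_normal((10, 2))   # the x_i
alpha = rng.standard_normal(10)          # the coefficients alpha_i

def evaluate_f(x):
    """f(x): the pointwise evaluation of f at x."""
    return sum(a * kern(c, x) for a, c in zip(alpha, centers))

# The RKHS inner product of f with the kernel section K(., x) is
#     <f, K(., x)>_H = sum_i alpha_i * K(x_i, x),
# which equals f(x): the reproducing property.
x = rng.standard_normal(2)
inner = alpha @ np.array([kern(c, x) for c in centers])
print(evaluate_f(x), inner)   # the two numbers agree
assert np.isclose(evaluate_f(x), inner)
```

For such finite combinations the identity is immediate from the definition of the inner product, but separating "the function" (`alpha` and `centers`) from "the evaluation" (`evaluate_f(x)`) is exactly the distinction that confused me.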

Analysis

The reproducing kernel Hilbert space (RKHS) was my motivation to study analysis. A Hilbert space is a complete inner product space: a normed vector space in which orthogonality makes sense. I still do not fully understand the meaning of “reproducing kernel”. The RKHS appeared in the book An Introduction to Statistical Learning, co-authored by Hastie. I began to google the meaning of spaces such as Hilbert and Banach, and decided to read Understanding Analysis by Abbott.