Finally arrived at the reproducing kernel Hilbert space. https://nzer0.github.io/reproducing-kernel-hilbert-space.html
The above post introduces RKHS in Korean. It was helpful, since I had struggled to understand some concepts in RKHS. What does the Hilbert space mean in terms of feature expansion? ($f \in \mathcal{H}$, $f(x)$) The difference between $f$ and $f(x)$ was confusing: $f$ means the function in the Hilbert space, and $f(x)$ is its evaluation at a point $x$.
I thought that the function can be represented by the inner product of the basis of the feature space and coefficients, $f(\cdot) = \sum_i w_i \phi_i(\cdot) = \langle w, \phi(\cdot) \rangle$, and the coefficients $w$ are vectors in the feature space.
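A minimal sketch of this picture (my own illustration, not from the post): a function built as an inner product of a coefficient vector with an explicit feature map, here the hypothetical degree-2 polynomial features of a scalar input.

```python
import numpy as np

def phi(x):
    """Feature map: x -> (1, x, x^2)."""
    return np.array([1.0, x, x**2])

# Coefficients w: a vector living in the same feature space as phi(x).
w = np.array([2.0, -1.0, 0.5])

def f(x):
    # Evaluating f is an inner product in feature space: f(x) = <w, phi(x)>.
    return np.dot(w, phi(x))

print(f(3.0))  # 2 - 3 + 0.5 * 9 = 3.5
```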
The reproducing property of the kernel is $\langle f, K(\cdot, x) \rangle_{\mathcal H} = f(x)$. Thus $\langle K(\cdot, x), K(\cdot, x') \rangle_{\mathcal H} = K(x, x')$. $K(\cdot, x)$ is a specific function in the Hilbert space and an evaluator at the specific point $x$. This means the inner product of $f$ and $K(\cdot, x)$ is the value of $f$ at the point $x$, $f(x)$.
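A numerical sketch of this property (mine, with a Gaussian kernel assumed for illustration): for an expansion $f = \sum_i a_i K(\cdot, x_i)$, the RKHS inner product of $f$ with $K(\cdot, x)$ works out to $\sum_i a_i K(x_i, x)$, which is exactly the pointwise evaluation $f(x)$.

```python
import numpy as np

def K(x, y, gamma=0.5):
    """Gaussian (RBF) kernel, assumed here for illustration."""
    return np.exp(-gamma * (x - y) ** 2)

def rkhs_inner(a, xa, b, xb):
    """<f, g>_H for f = sum_i a_i K(., xa_i), g = sum_j b_j K(., xb_j)."""
    return sum(ai * bj * K(xi, xj)
               for ai, xi in zip(a, xa)
               for bj, xj in zip(b, xb))

centers = [0.0, 1.0, 2.5]
a = [1.0, -0.5, 2.0]

def f(x):
    return sum(ai * K(xi, x) for ai, xi in zip(a, centers))

x = 0.7
lhs = rkhs_inner(a, centers, [1.0], [x])  # <f, K(., x)>_H
print(lhs, f(x))                          # the two values coincide
```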
In a nutshell, the kernel method is a different way of evaluating $f$ at a specific point $x$. Evaluating a function at a point is the inner product of $f$ and $K(\cdot, x)$, where the evaluation functional $L_x : f \mapsto f(x)$ is linear and is represented by the kernel function. The reproducing property of $K$ can be achieved if the evaluation functional is bounded for every $x$ ($|L_x(f)| \le M_x \|f\|_{\mathcal H}$).
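As a reminder for myself, the chain from boundedness to the reproducing property runs through the Riesz representation theorem:

```latex
L_x(f) = f(x), \qquad |L_x(f)| \le M_x \|f\|_{\mathcal H} \quad \forall f \in \mathcal H
\;\Longrightarrow\; \exists\, K_x \in \mathcal H \text{ with } L_x(f) = \langle f, K_x \rangle_{\mathcal H}
\;\Longrightarrow\; f(x) = \langle f, K(\cdot, x) \rangle_{\mathcal H}.
```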
In least squares methods, the parameters ($\beta$) are determined from inner products of the features, $\hat\beta = (X^\top X)^{-1} X^\top y$. In the kernel method, $f$ is determined as $f(\cdot) = \sum_i \alpha_i K(\cdot, x_i)$. Each $x_i$ is both a parameter and an argument (a variable like $x$).
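A sketch contrasting the two (kernel ridge regression is assumed here as the kernel-method example; the data and hyperparameters are made up). The fitted function is $f(\cdot) = \sum_i \alpha_i K(\cdot, x_i)$, so each training point $x_i$ acts both as data and as a "parameter" indexing the kernel function $K(\cdot, x_i)$.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(20, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.normal(size=20)

def K(x, z, gamma=5.0):
    """Gaussian kernel (an assumption for this sketch)."""
    return np.exp(-gamma * (x - z) ** 2)

# Kernel ridge regression: solve (G + lam*I) alpha = y,
# where G is the Gram matrix G_ij = K(x_i, x_j).
lam = 0.1
G = K(X, X.T)  # broadcasting gives the full 20x20 Gram matrix
alpha = np.linalg.solve(G + lam * np.eye(len(X)), y)

def f(x):
    # One coefficient alpha_i per training point x_i.
    return np.sum(alpha * K(X[:, 0], x))

print(f(0.5))
```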
Some subclasses of loss functions and penalty functions can be generated by a positive definite kernel. A kernel $K(x, x')$ accepts two arguments, while a kernel function $K(\cdot, x)$ takes one argument and the other argument becomes a parameter. A reproducing kernel Hilbert space is a function space spanned by kernel functions, with the evaluation functional represented by the kernel. The feature expansion into the RKHS can use the kernel matrix $K_{ij} = K(x_i, x_j)$ instead of the inner products of the feature vectors $\langle \phi(x_i), \phi(x_j) \rangle$.
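A quick check of that last point (my own example, using the polynomial kernel $K(x, z) = (x \cdot z)^2$ on $\mathbb{R}^2$): the kernel matrix computed directly agrees with the Gram matrix of the explicit feature map $\phi(x) = (x_1^2, \sqrt{2}\, x_1 x_2, x_2^2)$, so the features never need to be built.

```python
import numpy as np

def phi(x):
    """Explicit feature map for the degree-2 polynomial kernel on R^2."""
    return np.array([x[0] ** 2, np.sqrt(2) * x[0] * x[1], x[1] ** 2])

X = np.array([[1.0, 2.0], [0.5, -1.0], [3.0, 0.0]])

K = (X @ X.T) ** 2               # kernel matrix, no explicit features needed
Phi = np.array([phi(x) for x in X])
G = Phi @ Phi.T                  # inner products of explicit feature vectors

assert np.allclose(K, G)         # same matrix either way (the kernel trick)
print(K)
```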
The important concepts are: Hilbert space, inner product, kernel function, evaluation functional, feature expansion, Fourier transformation, and the Riesz representation theorem (the dual space of a Hilbert space, $\mathcal H^* \cong \mathcal H$).