A central subject of analysis is finding conditions under which sequential mathematical objects, such as sets, sequences, and series, converge. Induction reduces the claim \(S = \mathbb{N}\) to two conditions: \(s_{1} \in S\), and if \(s_{n} \in S\) then \(s_{n+1} \in S\). The natural numbers have the property that one can always add one more. But induction can only prove statements about the natural numbers \(\mathbb{N}\), not about infinity \(\infty\). \[ \text{Induction:} \\ s_{1} \in S \\ \text{if } s_{n} \in S \text{ then } s_{n+1} \in S \\ \text{then } S = \mathbb{N} \\ \] The limit is the way \(\mathbb{N}\) goes to \(\infty\).
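The induction template above can be sanity-checked in code. This is only a finite illustration, not a proof (the claim `P(n): 1 + 2 + ... + n = n(n+1)/2` and the helper `P` are my own example, not from the text): we verify the base case and spot-check the inductive step for many `n`, whereas real induction establishes the step for all `n` at once.

```python
# Hypothetical example claim P(n): 1 + 2 + ... + n == n*(n+1)/2.
def P(n):
    return sum(range(1, n + 1)) == n * (n + 1) // 2

# Base case: s_1 is in S, i.e. P(1) holds.
assert P(1)

# Inductive step, spot-checked for finitely many n:
# if P(n) holds then P(n+1) holds.
for n in range(1, 1000):
    assert (not P(n)) or P(n + 1)

print("base case and inductive step hold for n = 1..1000")
```

Induction then lets us conclude P(n) for every natural number, but, as the passage notes, it says nothing about a value "at" \(\infty\); that is the limit's job.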

Lagrange dual problem and conjugate function

An optimization problem has two components: an objective function \(f_0 : \mathbb R ^n \rightarrow \mathbb R\) and constraints. The objective function and the constraints keep each other in check and reach a balance at a saddle point, i.e. the optimal point. The dual (Lagrange) problem of the original problem also attacks the optimization problem, by providing a lower bound on its optimal value. The dual problem can be expressed through the conjugate function \(f^*(y) = \sup_x (x^Ty-f(x))\).
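A small numerical sketch of the conjugate function may help. For \(f(x) = x^2\) the supremum \(\sup_x (xy - x^2)\) is attained at \(x = y/2\), giving \(f^*(y) = y^2/4\); the grid, step size, and the `conjugate` helper below are my own illustration, not from the text.

```python
def conjugate(f, y, xs):
    # Approximate f*(y) = sup_x (x*y - f(x)) by a max over a finite grid.
    return max(x * y - f(x) for x in xs)

# Grid on [-10, 10] with step 0.001 (an assumed discretization).
xs = [i / 1000.0 for i in range(-10000, 10001)]
f = lambda x: x ** 2

for y in [-2.0, 0.0, 1.0, 3.0]:
    approx = conjugate(f, y, xs)
    exact = y ** 2 / 4  # closed form for f(x) = x^2
    assert abs(approx - exact) < 1e-6
    print(f"f*({y}) ~ {approx:.4f}  (exact {exact:.4f})")
```

The supremum rather than a maximum is essential in general, since the expression \(x^Ty - f(x)\) need not attain its least upper bound.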

Limit of inequality of sequence and epsilon

Here I summarize some tools used in the proof of the Riesz representation theorem: the limit of an inequality of sequences, and \(\epsilon\). Rudin’s proof of the Riesz representation theorem constructs a measure \(\mu\) and a collection of measurable sets \(\mathfrak{M}\), then proves that \(\mu\) and \(\mathfrak{M}\) have the required properties. Countable additivity (not mere subadditivity) is an important property. The strategy for proving an equality (additivity) is a bidirectional inequality: prove \(\le\) and \(\ge\) separately. The limit of an inequality of sequences gives us a tool that turns inequalities holding for every finite \(n\) into an inequality in the limit.
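The tool described here can be stated precisely; this is the standard order limit theorem (a hedged restatement, not quoted from the text):

\[
\text{If } a_n \le b_n \text{ for all } n, \quad a_n \to a, \quad b_n \to b, \quad \text{then } a \le b.
\]

Note that strict inequality is not preserved in the limit: \(1/(n+1) < 1/n\) for every \(n\), yet both sequences converge to the same limit \(0\). This is why bidirectional non-strict inequalities are the right tool for proving equalities such as countable additivity.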


A sequence can be defined as a function on the domain of the natural numbers, like \(1, 1/2, 1/3, \ldots, 1/n, \ldots\). This sequence approaches 0 but never touches 0. However, people cannot give up their desire to link the sequence to 0. Because \(\infty\) is not a member of the natural numbers, nor even of the real numbers, another concept is necessary to link the sequence to 0. It is the limit.
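The link the limit provides is the \(\epsilon\)–\(N\) definition: \(1/n \to 0\) means that for every \(\epsilon > 0\) there is an \(N\) such that \(|1/n - 0| < \epsilon\) for all \(n \ge N\). A minimal sketch (the helper `N_for` is my own assumed name; any \(N > 1/\epsilon\) works):

```python
import math

def N_for(epsilon):
    # Any N with N > 1/epsilon witnesses the definition for a_n = 1/n.
    return math.floor(1 / epsilon) + 1

for epsilon in [0.5, 0.1, 0.001]:
    N = N_for(epsilon)
    # Spot-check the tail: |1/n - 0| < epsilon for all n >= N.
    assert all(abs(1.0 / n - 0.0) < epsilon for n in range(N, N + 1000))
    print(f"epsilon = {epsilon}: N = {N} works")
```

The code can only spot-check finitely many terms of the tail, but the inequality \(1/n \le 1/N < \epsilon\) for \(n \ge N\) proves it for all of them; the limit is exactly this quantifier game, with no appeal to \(\infty\) as a number.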


The reproducing kernel Hilbert space (RKHS) was my motivation to study analysis. A Hilbert space is a complete inner product space. I still do not know the meaning of “reproducing kernel”. The RKHS appeared in the book titled An Introduction to Statistical Learning, co-authored by Hastie. I began to google the meaning of spaces such as the Hilbert and Banach spaces. I decided to read Understanding Analysis, written by Abbott.