4 Kernel Functions and Reproducing Kernel Hilbert Spaces

Whereas Chapter 3 gave a moderately deep introduction to the signal processing concepts used in this book, in this chapter we bring together fundamental and advanced concepts on Mercer kernels and reproducing kernel Hilbert spaces (RKHSs). The fundamental building block of kernel learning theory is the kernel function, which provides an elegant framework for comparing complex and nontrivial objects. After introducing it, we review the concept of an RKHS and state the representer theorem. We then study the main properties of kernel functions and their construction, as well as the basic ideas needed to work with complex objects and reproducing spaces. The support vector regression (SVR) algorithm is also introduced in detail, as it will be widely used and modified in Part II to build many of the kernel-based DSP algorithms therein. We close the chapter with some synthetic examples illustrating the concepts and tools presented.

4.1 Introduction

Kernel methods build upon the notion of kernel functions and RKHSs. Roughly speaking, a Mercer kernel in a Hilbert space is a function that computes the inner product between two vectors embedded in that space. These vectors are the images of vectors in a Euclidean input space under a mapping that may be nonlinear. This informal definition will be formalized through Mercer's theorem, which gives kernel methods their analytical power.
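As a minimal illustration of this idea (not taken from the chapter itself, just a sketch), consider the homogeneous polynomial kernel of degree 2 on R^2. Evaluating the kernel directly in the input space gives the same value as explicitly mapping both vectors through a nonlinear feature map and taking the inner product in the embedding space:

```python
import math

def phi(x):
    """Explicit nonlinear feature map for the degree-2 polynomial kernel
    on R^2: phi(x) = (x1^2, sqrt(2)*x1*x2, x2^2), so that
    <phi(x), phi(y)> = (<x, y>)^2."""
    x1, x2 = x
    return (x1 * x1, math.sqrt(2.0) * x1 * x2, x2 * x2)

def poly_kernel(x, y):
    """k(x, y) = (<x, y>)^2, computed directly in the input space
    without ever forming the embedded vectors."""
    return (x[0] * y[0] + x[1] * y[1]) ** 2

x, y = (1.0, 2.0), (3.0, -1.0)
lhs = poly_kernel(x, y)                               # inner product via the kernel
rhs = sum(a * b for a, b in zip(phi(x), phi(y)))      # inner product in feature space
# lhs and rhs agree up to floating-point rounding
```

For high-dimensional inputs or kernels whose feature space is infinite dimensional (such as the Gaussian kernel), the explicit map `phi` becomes impractical or impossible to compute, while the kernel evaluation remains cheap; this is the computational appeal formalized by Mercer's theorem.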

We will see that ...
