6 Reproducing Kernel Hilbert Space Models for Signal Processing

6.1 Introduction

In this chapter we introduce a set of signal models properly defined in an RKHS. The processing algorithms presented here are collectively termed RKHS signal models (RSMs) because they share a distinctive feature: all of them intrinsically implement a particular signal model (like those previously described in Chapter 2) whose equations are written in the RKHS generated by kernels. To do so, we exploit the well-known kernel trick (Schölkopf and Smola, 2002), which is the most common approach to kernel-based signal processing problems in the literature. Quite often one is interested in describing the signal characteristics with a specific signal model whose performance is hampered by the (commonly strong) assumption of linearity. As we already know, the theory of reproducing kernels can circumvent this problem and yield nonlinear algorithms by simply replacing dot products in the feature space with an appropriate Mercer kernel function (Figure 6.1). The most famous example of this kind of approach is the support vector classification (SVC) algorithm, which has yielded a vast number of applications in the field of signal processing, from speech recognition to image classification.
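As a minimal sketch of the kernel trick described above, the following example replaces the dot products of ordinary (linear) ridge regression with a Gaussian (RBF) Mercer kernel, turning it into kernel ridge regression that can fit a nonlinear signal. The sinusoidal signal, the kernel width, and the regularization value are illustrative choices, not taken from the text.

```python
import numpy as np

def rbf_kernel(X, Y, sigma=0.1):
    """Gaussian (RBF) Mercer kernel: evaluates dot products in the
    induced RKHS without ever computing the feature map explicitly."""
    d2 = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-d2 / (2.0 * sigma**2))

# Illustrative nonlinear signal: y_n = sin(4*pi*x_n) + noise
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0.0, 1.0, 50))[:, None]
y = np.sin(4.0 * np.pi * X[:, 0]) + 0.05 * rng.standard_normal(50)

# Linear ridge regression on the Gram matrix = kernel ridge regression:
# solve (K + lambda*I) alpha = y, so dot products appear only through K.
lam = 1e-3
K = rbf_kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)

# Prediction is a kernel expansion: f(x) = sum_n alpha_n * k(x, x_n)
Xt = np.linspace(0.0, 1.0, 200)[:, None]
y_hat = rbf_kernel(Xt, X) @ alpha
```

Note that the algorithm itself stays linear; the nonlinearity enters only through the kernel evaluations, which is precisely the structure that the RSMs in this chapter exploit.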

Figure 6.1 In SVM estimation problems, a nonlinear relationship between data points in the input space ... (The panels plot y_{n-2} versus y_{n-1}, top, and x_{n-2} versus x_{n-1}, bottom, each showing an ascending line with scattered dots.)
