Statistical and Machine Learning Approaches for Network Analysis
by Matthias Dehmer, Subhash C. Basak
8.2 Convolution Kernels
Many kernels for structured data, including, but not limited to, graphs, are based on the idea of convolution kernels introduced by Haussler in 1999 [62].
8.2.1 Definition
Assume that a sample $x \in \mathcal{X}$ can be decomposed into parts $x_1 \in \mathcal{X}_1, \ldots, x_d \in \mathcal{X}_d$, for example, a decomposition of a graph into subgraphs. Here, $\mathcal{X}, \mathcal{X}_1, \ldots, \mathcal{X}_d$ are nonempty, separable metric spaces. The relation $R \subseteq \mathcal{X} \times \mathcal{X}_1 \times \cdots \times \mathcal{X}_d$ indicates possible decompositions, where $R(x, x_1, \ldots, x_d)$ is true if and only if $x$ can be decomposed into $x_1, \ldots, x_d$; let $R^{-1}(x) = \{(x_1, \ldots, x_d) : R(x, x_1, \ldots, x_d)\}$ denote the set of all decompositions of $x$. Given positive definite kernels $k_i : \mathcal{X}_i \times \mathcal{X}_i \to \mathbb{R}$, $1 \le i \le d$, the convolution kernel
$$
k(x, x') = \sum_{(x_1, \ldots, x_d) \in R^{-1}(x)} \; \sum_{(x'_1, \ldots, x'_d) \in R^{-1}(x')} \; \prod_{i=1}^{d} k_i(x_i, x'_i)
$$
is positive definite for finite $R$ [62]. The sum runs over all decompositions of $x$ and $x'$ into $d$ parts for which $R$ holds; if a sample cannot be decomposed, the sum is empty and the kernel value is zero. Random walk kernels, path-based kernels, tree kernels, and cyclic pattern kernels are all instances of convolution kernels.
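To make the definition concrete, the following is a minimal sketch of an $R$-convolution kernel in Python, not the book's own construction: the decomposition relation maps a labelled graph to its set of vertices ($d = 1$), and the base kernel $k_1$ is the Dirac kernel on vertex labels, so $k(G, G')$ simply counts pairs of equally labelled vertices. The graph representation and the helper names (`decompose`, `dirac`, `convolution_kernel`) are illustrative assumptions.

```python
from itertools import product


def decompose(graph):
    """R^{-1}(G): all 'parts' of G -- here simply its vertex labels (d = 1)."""
    return graph["labels"]          # assumed toy representation: {"labels": [...]}


def dirac(a, b):
    """Base kernel k_1: 1 if the two parts are identical, else 0."""
    return 1.0 if a == b else 0.0


def convolution_kernel(g, g_prime, base_kernel=dirac):
    """k(x, x'): sum of the base kernel over all pairs of decompositions."""
    return sum(base_kernel(p, q)
               for p, q in product(decompose(g), decompose(g_prime)))


if __name__ == "__main__":
    g1 = {"labels": ["C", "C", "O"]}   # toy labelled graphs
    g2 = {"labels": ["C", "O", "N"]}
    print(convolution_kernel(g1, g2))  # 3.0: two C-C pairs and one O-O pair
```

Positive definiteness of this toy kernel follows directly from the theorem above: the Dirac base kernel is positive definite and $R^{-1}(G)$ is finite for every finite graph. Richer decompositions (walks, paths, subtrees, cycles) lead to the kernel families listed at the end of Section 8.2.1.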
8.2.2 Variants and Extensions
Vishwanathan et al. [64] point out that “there have been a few attempts to extend R-convolution kernels to abstract semirings.” ...