ITAT 2016 Proceedings, CEUR Workshop Proceedings Vol. 1649, pp. 118–122, http://ceur-ws.org/Vol-1649, Series ISSN 1613-0073, © 2016 V. Kůrková

Multivariable Approximation by Convolutional Kernel Networks

Věra Kůrková

Institute of Computer Science, Academy of Sciences of the Czech Republic,
[email protected], WWW home page: http://www.cs.cas.cz/~vera

Abstract: Computational units induced by convolutional kernels, together with biologically inspired perceptrons, belong to the most widespread types of units used in neurocomputing. Radial convolutional kernels with varying widths form RBF (radial-basis-function) networks, and these kernels with fixed widths are used in the SVM (support vector machine) algorithm. We investigate the suitability of various convolutional kernel units for function approximation. We show that properties of Fourier transforms of convolutional kernels determine whether sets of input-output functions of networks with kernel units are large enough to be universal approximators. We compare these properties with conditions guaranteeing positive semidefiniteness of convolutional kernels.

… widths parameters) benefit from geometrical properties of reproducing kernel Hilbert spaces (RKHS) generated by these kernels. These properties allow an extension of the maximal margin classification from finite-dimensional spaces also to sets of data which are not linearly separable, by embedding them into infinite-dimensional spaces [2]. Moreover, symmetric positive semidefinite kernels generate stabilizers in the form of norms on RKHSs suitable for modeling generalization in terms of regularization [6] and enable characterizations of theoretically optimal solutions of learning tasks [3, 19, 11].

Arguments proving the universal approximation property of RBF networks using sequences of scaled kernels …
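For concreteness, the display below recalls a standard example of the objects just mentioned: a radial convolutional (Gaussian) kernel with a width parameter and the regularized learning functional whose stabilizer is the RKHS norm. It is an illustrative sketch only; the symbols $K_a$, $a$, $\gamma$, $m$, $\mathcal{H}_{K_a}$ and the sample $(x_i, y_i)$ are notation introduced just for this example, not taken from the paper's later sections.

\[
K_a(x,y) = e^{-\|x-y\|^2/a^2}, \qquad
\min_{f \in \mathcal{H}_{K_a}} \; \frac{1}{m}\sum_{i=1}^{m}\bigl(f(x_i)-y_i\bigr)^2 \;+\; \gamma\,\|f\|_{\mathcal{H}_{K_a}}^{2},
\]

where $x, y \in \mathbb{R}^d$, $a > 0$ is the width, $\gamma > 0$ is the regularization parameter, and $\mathcal{H}_{K_a}$ is the RKHS generated by $K_a$. In this setting the RKHS norm plays the role of the stabilizer, and the Representer Theorem guarantees that the minimizer can be written as a finite linear combination $\sum_{i=1}^{m} c_i K_a(\cdot, x_i)$ of kernel units centered at the data points.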