Journal of Machine Learning Research 17 (2016) 1-48    Submitted 11/15; Revised 7/16; Published 9/16

Input Output Kernel Regression: Supervised and Semi-Supervised Structured Output Prediction with Operator-Valued Kernels

Céline Brouard
[email protected]
Helsinki Institute for Information Technology HIIT, Department of Computer Science, Aalto University, 02150 Espoo, Finland
IBISC, Université d'Évry Val d'Essonne, 91037 Évry cedex, France

Marie Szafranski
[email protected]
ENSIIE & LaMME, Université d'Évry Val d'Essonne, CNRS, INRA, 91037 Évry cedex, France
IBISC, Université d'Évry Val d'Essonne, 91037 Évry cedex, France

Florence d'Alché-Buc
[email protected]
LTCI, CNRS, Télécom ParisTech, Université Paris-Saclay, 46 rue Barrault, 75013 Paris, France
IBISC, Université d'Évry Val d'Essonne, 91037 Évry cedex, France

Editor: Koji Tsuda

Abstract

In this paper, we introduce a novel approach, called Input Output Kernel Regression (IOKR), for learning mappings between structured inputs and structured outputs. The approach belongs to the family of Output Kernel Regression methods devoted to regression in a feature space endowed with some output kernel. In order to take the structure of the input data into account and benefit from kernels in the input space as well, we use the theory of Reproducing Kernel Hilbert Spaces for vector-valued functions. We first recall the ridge solution for supervised learning and then study the regularized hinge loss-based solution used in Maximum Margin Regression. Both models are also developed in the semi-supervised setting. In addition, we derive an extension of Generalized Cross Validation for model selection in the case of the least-squares model.
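To fix ideas, the display below sketches the kind of closed-form ridge solution alluded to in the abstract, under the simplifying assumption of a decomposable operator-valued kernel $K_x(x, x') = k_x(x, x')\, I$ with scalar input kernel $k_x$ and output feature map $\psi$ associated with an output kernel $k_y$; the notation ($\lambda$, $K_X$, $\kappa_X$, $\kappa_Y$, $\Psi$) is introduced here only for illustration and need not match the notation adopted later in the paper.

% Minimal sketch of output kernel ridge regression, assuming a decomposable
% operator-valued kernel K_x(x,x') = k_x(x,x') I and training pairs (x_i, y_i), i = 1..n.
\[
  h(x) \;=\; \Psi\,(K_X + \lambda I_n)^{-1} \kappa_X(x),
  \qquad
  f(x) \;=\; \arg\max_{y \in \mathcal{Y}} \; \langle \psi(y), h(x) \rangle
         \;=\; \arg\max_{y \in \mathcal{Y}} \; \kappa_Y(y)^{\top} (K_X + \lambda I_n)^{-1} \kappa_X(x),
\]
where $\Psi = \big(\psi(y_1), \dots, \psi(y_n)\big)$ stacks the output feature vectors, $(K_X)_{ij} = k_x(x_i, x_j)$ is the input Gram matrix, $\kappa_X(x) = \big(k_x(x, x_1), \dots, k_x(x, x_n)\big)^{\top}$, and $\kappa_Y(y) = \big(k_y(y, y_1), \dots, k_y(y, y_n)\big)^{\top}$. The first expression is the regularized least-squares estimate in the output feature space; the second is the decoding (pre-image) step, which only requires kernel evaluations and not $\psi$ itself.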