Personalized Automatic Estimation of Self-reported Pain Intensity from Facial Expressions

Daniel Lopez Martinez1,2, Ognjen (Oggi) Rudovic2 and Rosalind Picard2
1 Harvard-MIT Division of Health Sciences and Technology, Massachusetts Institute of Technology
2 MIT Media Lab, Massachusetts Institute of Technology, Cambridge, MA, USA
{dlmocdm,orudovic,picard}@mit.edu

Abstract

Pain is a personal, subjective experience that is commonly evaluated through visual analog scales (VAS). While this is often convenient and useful, automatic pain detection systems can reduce pain score acquisition efforts in large-scale studies by estimating it directly from the participants' facial expressions. In this paper, we propose a novel two-stage learning approach for VAS estimation: first, our algorithm employs Recurrent Neural Networks (RNNs) to automatically estimate Prkachin and Solomon Pain Intensity (PSPI) levels from face images. The estimated scores are then fed into the personalized Hidden Conditional Random Fields (HCRFs), used to estimate the VAS, provided by

of pain, the patient's self-report provides the most valid measure of this fundamentally subjective experience [28]. Various pain rating scales have been developed to capture the patient's self-report of pain intensity [16, 14]. The Visual Analogue Scale (VAS), Numerical Rating Scale (NRS), Verbal Rating Scale (VRS), and Faces Pain Scale-Revised (FPS-R) are among the most commonly used scales. While evidence supports the reliability and validity of each of these measures across many populations [11], each measure has strengths and weaknesses that make it more appropriate for different applications [49]. In the research setting, for example, the VAS is usually preferred since it is statistically the most robust, as it can provide ratio-level data [49, 32], allowing parametric tests to be used in statistical analysis.
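The two-stage pipeline summarized in the abstract (frame-level PSPI estimation followed by sequence-level VAS estimation) can be illustrated with a minimal sketch. This is not the authors' implementation: the first stage is a toy Elman RNN with random weights standing in for the trained RNN, and the second stage replaces the personalized HCRF with a simple statistic-based linear mapping from the PSPI score sequence to a single VAS value; all dimensions and variable names are illustrative.

```python
import numpy as np

# Illustrative sketch only (not the authors' method): stage 1 maps per-frame
# face features to frame-level PSPI scores with a tiny Elman RNN; stage 2
# stands in for the personalized HCRF with a linear map over sequence
# statistics, producing one VAS value on the 0-10 scale.

rng = np.random.default_rng(0)

def rnn_pspi(frames, Wx, Wh, w_out):
    """Stage 1: one PSPI estimate per frame from a sequence of face features."""
    h = np.zeros(Wh.shape[0])
    scores = []
    for x in frames:
        h = np.tanh(Wx @ x + Wh @ h)      # recurrent hidden-state update
        scores.append(float(w_out @ h))   # scalar PSPI estimate for this frame
    return np.array(scores)

def vas_from_pspi(pspi_seq, w_seq, b):
    """Stage 2 stand-in: sequence statistics -> a single VAS score in [0, 10]."""
    feats = np.array([pspi_seq.mean(), pspi_seq.max(), pspi_seq.std()])
    return float(np.clip(w_seq @ feats + b, 0.0, 10.0))

# Toy dimensions: 12 frames of 16-D face features, 8 hidden units (assumed).
T, D, H = 12, 16, 8
frames = rng.normal(size=(T, D))
Wx = rng.normal(size=(H, D))
Wh = rng.normal(size=(H, H))
w_out = rng.normal(size=H)
w_seq, b = rng.normal(size=3), 5.0

pspi = rnn_pspi(frames, Wx, Wh, w_out)   # frame-level scores, shape (12,)
vas = vas_from_pspi(pspi, w_seq, b)      # one sequence-level VAS estimate
print(pspi.shape, vas)
```

In the paper's actual pipeline, the RNN is trained to regress PSPI from face images and the HCRF models the temporal dynamics of the score sequence with person-specific parameters; the sketch above only conveys the data flow between the two stages.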