The Entropy-Regularized Wasserstein Distance as a Metric for Machine Learning Based Post-Processing of Structural MR Images of the Brain
Master thesis for attainment of the academic degree of Master of Science
Westfälische Wilhelms-Universität Münster
Department of Mathematics and Informatics
Institute for Applied Mathematics

Submitted by: Juliane Braunsmann
Thesis supervised by: Prof. Dr. Benedikt Wirth, Prof. Dr. Xiaoyi Jiang, PD Dr. Tim Hahn
Münster, 30th September 2018

Abstract

This thesis treats the Wasserstein distance and its applicability as a metric between magnetic resonance images of the brain, represented as densities on a voxel grid. We aim to use the Wasserstein distance for dimensionality reduction of MR images by using it as a loss function in autoencoders. This requires a computationally feasible approach to calculating a large number of distances, one which can further be incorporated into machine learning frameworks. For this reason, we consider entropy regularization. We introduce this regularization for probability measures on compact metric spaces and explain why it suits these requirements. We further introduce an extension of the Wasserstein distance to measures with arbitrary mass, to which the same regularization approach can be applied. To evaluate the suitability of the Wasserstein distance for MR images, we incorporate it in kernel support vector machines and kernel principal component analysis to predict gender and depression, using a dataset provided by the Department of Psychiatry. Finally, we show that the Wasserstein distance can indeed be used to train autoencoders. However, we also address the question whether the application of the Wasserstein distance as a loss function for comparing MR images is advisable in practice.

Declaration of Academic Integrity

I hereby confirm that this thesis on "The Entropy-Regularized Wasserstein Distance as a Metric for Machine Learning Based Post-Processing of Structural MR Images of the Brain" is solely my own work and that I have used no sources or aids other than the ones stated. All passages in my thesis for which other sources, including electronic media, have been used, be it direct quotes or content references, have been acknowledged as such and the sources cited.

(date and signature of student)

I agree to have my thesis checked in order to rule out potential similarities with other works and to have my thesis stored in a database for this purpose.

(date and signature of student)

Acknowledgements

I would like to express my gratitude to everyone who made this thesis possible through their support:

• Prof. Dr. Benedikt Wirth for taking time to help me work out the mathematical problems that arose during the process of writing this thesis, for answering all my questions and for supporting my pursuit of a dual master's degree in mathematics and computer science.
• Prof. Dr. Xiaoyi Jiang for offering me an interesting and challenging topic that combines mathematics and computer science, for establishing the cooperation with the Department of Psychiatry and for making time for group meetings.
• PD Dr. Tim Hahn for introducing me to the topic of machine learning in psychiatry, for many interesting discussions, for offering me a space to work and for making the computational resources of the department and the MRI data available to me.
• The AI and Machine Learning in Psychiatry Group for helping me get started and for being pleasant office co-workers, always willing to answer my questions. I would especially like to thank Claas Kähler for always being approachable and Kelvin Sarink for being good at his job of looking after and fixing technical things.
• Dr. Bernhard Schmitzer for giving me advice regarding the implementation and derivation of the algorithm and for providing results obtained from his personal implementation of the algorithm for comparison.
• My parents for their support throughout my studies and especially during the last months of my thesis.
• Last but not least, I would like to thank my friends in Münster for always having a sympathetic ear, cheering me up and providing me with distraction when I struggled, and of course for proofreading my thesis.

Contents

Introduction
1 Machine Learning and Neural Networks
  1.1 Feedforward Neural Networks
  1.2 Autoencoders
  1.3 Support Vector Machines
  1.4 Principal Component Analysis
2 Mathematical Background
  2.1 The Space of Measures and its Dual
  2.2 Convex Analysis and Nonlinear Optimization
3 A Metric between Measures based on Optimal Transport Theory
  3.1 The Kantorovich Problem and the Wasserstein Distance
  3.2 An Extension of the Wasserstein Distance for Arbitrary Measures
    3.2.1 The Kantorovich Extension
    3.2.2 The Unbalanced Mass Transportation Problem
    3.2.3 Connection between UMTP and K-Norm
  3.3 The Entropy-Regularized Kantorovich Problem
    3.3.1 Introduction of the Entropy-Regularized Kantorovich Problem
    3.3.2 A Dual Formulation
    3.3.3 Solving the Regularized Kantorovich Problem using Alternate Projections
  3.4 The Entropy-Regularized Wasserstein Distance and its Properties
4 Application to MR Images
  4.1 A Summary of the Results for Finite Spaces
  4.2 The Setting of MR Images and Numerical Improvements
5 Connecting the Wasserstein Distance and Machine Learning
  5.1 Using the Wasserstein Distance for the Construction of Kernels
  5.2 Using the Wasserstein Distance as a Loss Function
6 Implementation
  6.1 Libraries
  6.2 Implementation of the Wasserstein Distance Algorithm
  6.3 Implementation of SVM, PCA and Autoencoder
    6.3.1 SVM and PCA
    6.3.2 Autoencoder
7 Experiments
  7.1 Description of the Data Set
  7.2 Evaluation of the Wasserstein Distance as a Metric Between MR Images
    7.2.1 Calculating the Wasserstein Distance with Different Parameters
    7.2.2 Classification
    7.2.3 Summary
  7.3 Using the Wasserstein Distance as a Loss Function in Autoencoders
    7.3.1 Fully Connected Autoencoder for Amygdala
    7.3.2 Convolutional Autoencoder for Amygdala
    7.3.3 Convolutional Autoencoder and Fully Connected Autoencoder for Hippocampus
    7.3.4 Summary
8 Conclusion and Outlook
Bibliography

1 Introduction

The human brain is one of the most important organs of the human body, and there has been much research into neuroimaging techniques such as Magnetic Resonance Imaging (MRI), which simplifies the (early) diagnosis of conditions such as brain tumors and aneurysms as well as chronic conditions such as multiple sclerosis. The obtained MR images are subsequently analyzed by a specifically trained professional and a report is sent to the treating physician.

Professionals are, however, not yet able to use neuroimaging techniques to diagnose a very common condition of the brain, which is one of the leading causes of disability worldwide: depression.
It can affect people independently of their age, gender, wealth and location, and it has a large impact on the life of the affected individual as well as on their family and friends. The worst consequence of depression is suicide, which makes depression a disease with significant mortality. Despite depression being such a major issue, a reliable diagnosis of depression as well as the identification of a suitable treatment for each individual case remain difficult. To date, depression is diagnosed on the basis of behavioral symptoms, which means that the diagnosis depends on the patient's cooperation and perception. This makes the distinction between major depressive disorder and bipolar disorder an especially hard challenge: while bipolar patients pass through manic as well as depressive episodes, they only perceive depressive episodes as unusual. Time limitations and patient denial are additional factors that complicate diagnosis. For these reasons, the interest in a diagnosis based on neurobiological markers has grown in recent years.

Using high-resolution structural magnetic resonance imaging, researchers have found that structural changes are present in the brains of patients with depression. These findings motivate the quest for an automated system that could assist medical specialists in their decision-making. Such a system could be obtained by making use of machine learning models. Some research has been carried out in the direction of supervised learning, the dominant classification method being support vector machines (SVMs). Since the introduction of SVMs in the 1960s, significant progress has been made in the domain of supervised learning with neural networks (NNs) and convolutional neural networks (CNNs). While they were introduced not much later than SVMs, their usage only became feasible recently thanks to advanced hardware, notably high-performance graphics processing units (GPUs), and software developed for use with GPUs, like Tensorflow. While CNNs have been very successful, for example in object classification, there have been no comparable successes in the field of psychiatry. This has various reasons, one of them being the lack of large labeled datasets. As a reference, the largest database of labeled images, ImageNet, consists of over 14 million images, while datasets acquired from psychiatric studies usually consist of fewer than one hundred subjects. The data set made available for this thesis by the Department of Psychiatry consists of almost two thousand labeled MR images; while this is still not much compared to ImageNet, the data set is bound to grow further in the following years.

Much of the success of deep learning in the field of computer vision can be attributed to the fact that it renders the manual selection of suitable features for the task at hand unnecessary. Instead, "raw" images are fed into the network and suitable features are extracted automatically. In order to exploit this advantage for the classification of depression, raw MR data has to be fed into the network.
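To give a first impression of the central computational tool named in the abstract, the entropy-regularized Wasserstein distance is computed in later chapters by alternate projections (the Sinkhorn iterations of Section 3.3.3). The following is a minimal, self-contained NumPy sketch of that idea on a toy one-dimensional grid; the function name, grid, densities and parameter values are purely illustrative assumptions, and this is not the implementation developed in Chapter 6.

```python
import numpy as np

def regularized_transport_cost(mu, nu, cost, eps=0.01, n_iter=500):
    """Toy sketch: entropy-regularized transport cost between two
    probability vectors mu and nu for a given cost matrix, computed
    via Sinkhorn-style alternate projections."""
    K = np.exp(-cost / eps)                # Gibbs kernel of the cost matrix
    u = np.ones_like(mu)
    v = np.ones_like(nu)
    for _ in range(n_iter):                # alternately match the two marginals
        u = mu / (K @ v)
        v = nu / (K.T @ u)
    plan = u[:, None] * K * v[None, :]     # approximate optimal coupling
    return np.sum(plan * cost)             # transport cost of that coupling

# Hypothetical example: two densities on a 5-point grid, squared distance cost.
x = np.linspace(0.0, 1.0, 5)
cost = (x[:, None] - x[None, :]) ** 2
mu = np.array([0.1, 0.2, 0.4, 0.2, 0.1])
nu = np.array([0.3, 0.3, 0.2, 0.1, 0.1])
print(regularized_transport_cost(mu, nu, cost))
```

In such a sketch the regularization parameter eps trades accuracy against speed and numerical stability: smaller values approximate the unregularized Wasserstein distance more closely but require more iterations, which is one of the practical considerations the later chapters address for the much larger voxel grids of MR images.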