This Document Is the Result of Extensive Work Approved by the Defense Jury and Made Available to the Wider University Community

Total pages: 16

File type: PDF, size: 1020 KB

NOTICE

This document is the result of extensive work approved by the defense jury and made available to the wider university community. It is subject to the intellectual property rights of its author. This implies an obligation to cite and reference the document when it is used. Furthermore, any counterfeiting, plagiarism or illicit reproduction is liable to criminal prosecution.

Contact: [email protected]

LINKS

Code de la Propriété Intellectuelle, article L 122.4
Code de la Propriété Intellectuelle, articles L 335.2 - L 335.10
http://www.cfcopies.com/V2/leg/leg_droi.php
http://www.culture.gouv.fr/culture/infos-pratiques/droits/protection.htm

THESIS

presented by NGUYEN Manh Cuong for the degree of Doctor of the Université de Lorraine (ministerial decree of 7 August 2006), speciality: Computer Science.

LA PROGRAMMATION DC ET DCA POUR CERTAINES CLASSES DE PROBLÈMES EN APPRENTISSAGE ET FOUILLE DE DONNÉES
(DC programming and DCA for some classes of problems in machine learning and data mining)

Defended on 19 May 2014 before the jury composed of:

Reviewer: BENNANI Younès, Professor, Université Paris 13
Reviewer: RAKOTOMAMONJY Alain, Professor, Université de Rouen
Examiner: GUERMEUR Yann, Research Director, LORIA, Nancy
Examiner: HEIN Matthias, Professor, Saarland University
Examiner: PHAM DINH Tao, Professor Emeritus, INSA de Rouen
Thesis supervisor: LE THI Hoai An, Professor, Université de Lorraine
Co-supervisor: CONAN-GUEZ Brieuc, Associate Professor (MCF), Université de Lorraine

Thesis prepared at the Université de Lorraine, Metz, France, within the LITA laboratory.

Acknowledgments

This thesis was prepared at the Laboratoire d'Informatique Théorique et Appliquée (LITA) of the Université de Lorraine, France, under the joint supervision of Professor LE THI Hoai An, Director of the Laboratoire d'Informatique Théorique et Appliquée (LITA), Université de Lorraine, and Associate Professor CONAN-GUEZ Brieuc, Institut Universitaire de Technologie (IUT), Université de Lorraine, Metz.

I express all my gratitude to Professor LE THI Hoai An and Associate Professor CONAN-GUEZ Brieuc, who agreed to supervise this thesis, for their help, their patience, their constant support, their precious advice, their encouragement and their availability throughout this work.

I particularly wish to thank Professor PHAM DINH Tao, Professor at INSA de Rouen, for his advice and for following my research, and to express my gratitude for his kindness and for the very interesting discussions he led in order to suggest research directions to me.

I express my sincere thanks to Professor BENNANI Younès, Université Paris 13, and Professor RAKOTOMAMONJY Alain, Université de Rouen, for doing me the honour of accepting to act as reviewers of my thesis and for taking part in evaluating my work.

I also thank Mr GUERMEUR Yann, Research Director, LORIA, Nancy, and Professor HEIN Matthias, Saarland University, for taking part in evaluating my work.

I particularly thank Dr LE Hoai Minh for his invaluable help, his collaboration, and the very interesting discussions we have had.
I thank all the French and Vietnamese colleagues I met in Metz and in Rouen for the pleasant moments during my stay in France, and in particular all my colleagues at LITA, Université de Lorraine, for everything we shared at work and in life. My thanks also go to the Vietnamese Government, which funded my studies during this thesis. I must not forget to thank the whole staff of the Hanoi University of Industry for their support. I wish to express my affection and my gratitude to my family for the sacrifices they made to support me through the difficult moments. Finally, to all those who supported me from near or far, and to all those who, even unintentionally, pushed me to do better, please find here the expression of my deep gratitude.

Résumé

Classification (supervised, unsupervised and semi-supervised) is an important topic in data mining. In this thesis, we focus on developing optimization approaches for solving certain classes of problems arising in data classification. First, we studied and developed algorithms for two classical problems in unsupervised learning: modularity maximization for community detection in complex networks, and self-organizing maps. Second, for semi-supervised learning, we propose efficient algorithms for the feature selection problem in semi-supervised Support Vector Machines. Finally, in the last part of the thesis, we consider the feature selection problem in multi-class Support Vector Machines. All of these optimization problems are non-convex and of very large dimension in practice. The methods we propose are based on DC (Difference of Convex functions) programming and DCA (DC Algorithms), which are recognized as powerful optimization tools. The problems in question were reformulated as DC programs in order to solve them with DCA. Moreover, taking the structure of the problems into account, we propose different DC decompositions as well as different initialization strategies for solving the same problem. All the proposed algorithms were tested on real-world datasets from biology, social networks and computer security.

Abstract

Classification (supervised, unsupervised and semi-supervised) is one of the important research topics of data mining and has many applications in various fields. In this thesis, we focus on developing optimization approaches for solving some classes of optimization problems in data classification. Firstly, for unsupervised learning, we considered and developed algorithms for two well-known problems: modularity maximization for community detection in complex networks, and the data visualization problem with Self-Organizing Maps. Secondly, for semi-supervised learning, we investigated effective algorithms for the feature selection problem in semi-supervised Support Vector Machines. Finally, for supervised learning, we are interested in the feature selection problem in multi-class Support Vector Machines. All of these problems are large-scale non-convex optimization problems. Our methods are based on DC Programming and DCA, which are well known as powerful tools in optimization.
The problems considered were reformulated as DC programs, and DCA was then used to obtain the solutions. Also, taking into account the structure of the problems considered, we provide appropriate DC decompositions and relevant strategies for choosing the initial points of DCA in order to improve its efficiency. All the proposed algorithms have been tested on real-world datasets from biology, social networks and computer security.

Curriculum Vitae

PERSONAL INFORMATION
First name: NGUYEN
Last name: Manh Cuong
Date and place of birth: 18/3/1978, Thai Binh, Vietnam
Gender: Male
Nationality: Vietnamese
Personal address: Résidence Universitaire du Saulcy, 57010 Metz, France
Professional address: LITA, University of Lorraine, Ile du Saulcy, 57045 Metz, France
Phone number: 06 73 71 59 89
Email: [email protected]

EDUCATION
University of Lorraine, PhD student, 2010-2014. Subject of the thesis: La programmation DC et DCA pour certaines classes de problèmes en apprentissage et fouille de données.
Hanoi University of Technology, MSc student, 2001-2003. Subject of the thesis: "Mixed Similarity Measure and Its Applications". Grade: Good.
Hanoi University of Science, BSc student, 1996-2000. Subject of the thesis: "The Graph Algorithms for the Problems of Traffic in the City". Grade: Good.

TEACHING EXPERIENCE
Hanoi University of Industry, Vietnam: Lecturer, 2003-2010.

LANGUAGES
French, English, Vietnamese.

Publications

Refereed international journal papers
[1] Le Thi, H.A., Nguyen, M.C.: Efficient Algorithms for Feature Selection in Multi-class Support Vector Machine. Submitted.
[2] Le Hoai, M., Nguyen, M.C., Le Thi, H.A.: DCA Based Algorithm for Feature Selection in Semi-Supervised Support Vector Machines. Submitted.
[3] Le Thi, H.A., Nguyen, M.C., Pham Dinh, T.: A DC Programming Approach for Finding Communities in Networks. Submitted to the Neural Computation journal (NECO).
[4] Le Thi, H.A., Nguyen, M.C.: Self-Organizing Maps by Difference of Convex Functions Optimization. Revised version submitted to the Data Mining and Knowledge Discovery journal (DAMI).
[5] Conan-Guez, B., Nguyen, M.C.: Mod-Müllner: An Efficient Algorithm for Hierarchical Community Analysis in Large Networks. Submitted to the International Journal of Social Network Mining (IJSNM).

Refereed papers in books / refereed international conference papers
[1] Phan, D.N., Nguyen, M.C., Le Thi, H.A.: A DC Programming Approach for Sparse Linear Discriminant Analysis. In Advances in Intelligent Systems and Computing, Volume 282, pp. 65-74, Springer (2014).
[2] Le Thi, H.A., Nguyen, M.C.: Efficient Algorithms for Feature Selection in Multi-class Support Vector Machine. In Advanced Computational Methods for Knowledge Engineering, Studies in Computational Intelligence, Volume 479, pp. 41-52, Springer (2013).
[3] Le Hoai, M., Le Thi, H.A., Nguyen, M.C.: DCA Based Algorithm for Feature Selection in Semi-Supervised Support Vector Machines. In Machine Learning and Data Mining in Pattern Recognition, Lecture Notes in Computer Science, Volume 7988, pp. 528-542, Springer (2013).
[4] Le, A.V., Le Thi, H.A., Nguyen, M.C., Zidna, A.: Network Intrusion Detection Based on Multi-Class Support Vector Machine.
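The DC programming and DCA machinery that the abstract and the publications above build on can be summarized in a few lines: to minimize f = g − h with g and h convex, DCA alternates between picking a subgradient of h and minimizing the resulting convex majorant. The sketch below is a minimal, generic illustration; the toy decomposition g = 0.5‖·‖², h = ‖·‖₁ and all names are illustrative choices, not taken from the thesis.

```python
import numpy as np

def dca(grad_h, argmin_g_linear, x0, max_iter=100, tol=1e-8):
    """Generic DCA for minimizing f(x) = g(x) - h(x), with g and h convex.

    Each iteration picks y_k in the subdifferential of h at x_k, then
    minimizes the convex majorant g(x) - <y_k, x> to obtain x_{k+1}.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        y = grad_h(x)                 # y_k in the subdifferential of h at x_k
        x_new = argmin_g_linear(y)    # x_{k+1} = argmin_x g(x) - <y_k, x>
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Toy DC decomposition: f(x) = 0.5*||x||^2 - ||x||_1, i.e. g = 0.5*||.||^2, h = ||.||_1.
# Here sign(x) is a subgradient of h and argmin_x g(x) - <y, x> = y,
# so each DCA step reduces to x <- sign(x).
x_star = dca(grad_h=np.sign, argmin_g_linear=lambda y: y, x0=np.array([0.3, -2.0, 1.5]))
print(x_star)  # [ 1. -1.  1.], a critical point (here a global minimizer) of f
```

In the thesis the same scheme is specialized, with problem-dependent DC decompositions and initialization strategies, to modularity maximization, Self-Organizing Maps, and feature selection in multi-class and semi-supervised SVMs.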
Recommended publications
  • Lipschitz Continuity of Convex Functions
    Lipschitz Continuity of Convex Functions. Bao Tran Nguyen, Pham Duy Khanh. November 13, 2019. arXiv:1911.04886 [math.FA].
    Abstract: We provide some necessary and sufficient conditions for a proper lower semicontinuous convex function, defined on a real Banach space, to be locally or globally Lipschitz continuous. Our criteria rely on the existence of a bounded selection of the subdifferential mapping and on the intersections of the subdifferential mapping and the normal cone operator to the domain of the given function. Moreover, we also point out that the Lipschitz continuity of the given function on an open and bounded (not necessarily convex) set can be characterized via the existence of a bounded selection of the subdifferential mapping on the boundary of the given set and, as a consequence, it is equivalent to the local Lipschitz continuity at every point on the boundary of that set. Our results are applied to extend a Lipschitz and convex function to the whole space and to study the Lipschitz continuity of its Moreau envelope functions.
    Keywords: convex function, Lipschitz continuity, calmness, subdifferential, normal cone, Moreau envelope function. Mathematics Subject Classification (2010): 26A16, 46N10, 52A41.
    1. Introduction. Lipschitz continuous and convex functions play a significant role in convex and nonsmooth analysis. It is well known that if the domain of a proper lower semicontinuous convex function defined on a real Banach space has a nonempty interior, then the function is continuous over the interior of its domain [3, Proposition 2.111] and, as a consequence, it is subdifferentiable (its subdifferential is a nonempty set) and locally Lipschitz continuous at every point in the interior of its domain [3, Proposition 2.107].
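The link between a bounded selection of the subdifferential and a global Lipschitz constant can be illustrated numerically. The snippet below is only an illustration, not taken from the paper: for f(x) = ‖x‖₁ every subgradient lies in [−1, 1]ⁿ, so the Euclidean norm of any subgradient is at most √n, and the estimate |f(x) − f(y)| ≤ √n ‖x − y‖₂ can be checked on random points.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
f = lambda x: np.abs(x).sum()   # convex; its subgradients lie in [-1, 1]^n
L = np.sqrt(n)                  # bound on the Euclidean norm of any subgradient

# The bounded subgradient selection yields a global Lipschitz constant:
# |f(x) - f(y)| <= L * ||x - y||_2 for all x, y.
for _ in range(10_000):
    x, y = rng.normal(size=n), rng.normal(size=n)
    assert abs(f(x) - f(y)) <= L * np.linalg.norm(x - y) + 1e-12
print("no violation of the Lipschitz estimate found")
```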
  • Note on the Continuity of M-Convex and L-Convex Functions in Continuous Variables
    Journal of the Operations Research Society of Japan, 2008, Vol. 51, No. 4, 265-273. Note on the Continuity of M-Convex and L-Convex Functions in Continuous Variables. Kazuo Murota (University of Tokyo), Akiyoshi Shioura (Tohoku University). (Received March 27, 2008.) Abstract: M-convex and L-convex functions in continuous variables constitute subclasses of convex functions with nice combinatorial properties. In this note we give proofs of the fundamental facts that closed proper M-convex and L-convex functions are continuous on their effective domains. Keywords: combinatorial optimization, convex function, continuity, submodular function, matroid.
    1. Introduction. Two kinds of convexity concepts, called M-convexity and L-convexity, play primary roles in the theory of discrete convex analysis [6]. They were originally introduced for functions in integer variables by Murota [4, 5], and then for functions in continuous variables by Murota-Shioura [8, 10]. M-convex and L-convex functions in continuous variables constitute subclasses of convex functions with additional combinatorial properties such as submodularity and diagonal dominance (see, e.g., [6-11]). Fundamental properties of M-convex and L-convex functions are investigated in [9], such as equivalent axioms, subgradients, directional derivatives, etc. The conjugacy relationship between M-convex and L-convex functions under the Legendre-Fenchel transformation is shown in [10]. Subclasses of M-convex and L-convex functions are investigated in [8] (polyhedral M-convex and L-convex functions) and in [11] (quadratic M-convex and L-convex functions). As variants of M-convex and L-convex functions, the concepts of M♮-convex and L♮-convex functions were also introduced by Murota-Shioura [8, 10], where "M♮" and "L♮" should be read "M-natural" and "L-natural", respectively.
  • Chapter 5 Convex Optimization in Function Space 5.1 Foundations of Convex Analysis
    Chapter 5: Convex Optimization in Function Space. 5.1 Foundations of Convex Analysis. Let V be a vector space over R and ‖·‖ : V → R be a norm on V. We recall that (V, ‖·‖) is called a Banach space if it is complete, i.e., if any Cauchy sequence {v_k}, v_k ∈ V, k ∈ N, converges to an element v ∈ V (‖v_k − v‖ → 0 as k → ∞).
    Examples: Let Ω be a domain in R^d, d ∈ N. Then the space C(Ω) of continuous functions on Ω is a Banach space with the norm ‖u‖_{C(Ω)} := sup_{x∈Ω} |u(x)|. The spaces L^p(Ω), 1 ≤ p < ∞, of (in the Lebesgue sense) p-integrable functions are Banach spaces with the norms ‖u‖_{L^p(Ω)} := (∫_Ω |u(x)|^p dx)^{1/p}. The space L^∞(Ω) of essentially bounded functions on Ω is a Banach space with the norm ‖u‖_{L^∞(Ω)} := ess sup_{x∈Ω} |u(x)|.
    The (topologically and algebraically) dual space V* is the space of all bounded linear functionals μ : V → R. Given μ ∈ V*, for μ(v) we often write ⟨μ, v⟩, with ⟨·,·⟩ denoting the dual product between V* and V. We note that V* is a Banach space equipped with the norm ‖μ‖ := sup_{v ∈ V\{0}} |⟨μ, v⟩| / ‖v‖.
    Examples: The dual of C(Ω) is the space M(Ω) of Radon measures μ with ⟨μ, v⟩ := ∫_Ω v dμ, v ∈ C(Ω). The dual of L^1(Ω) is the space L^∞(Ω). The dual of L^p(Ω), 1 < p < ∞, is the space L^q(Ω) with q being conjugate to p, i.e., 1/p + 1/q = 1.
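As a quick sanity check of the norms and the duality pairing recalled above, the following sketch discretizes Ω = (0, 1) and verifies Hölder's inequality |⟨v, u⟩| ≤ ‖u‖_{L^p(Ω)} ‖v‖_{L^q(Ω)} for a conjugate pair 1/p + 1/q = 1; the functions u, v and the exponent p are arbitrary illustrative choices, not from the chapter.

```python
import numpy as np

# Discretize Omega = (0, 1) and approximate integrals by Riemann sums.
x = np.linspace(0.0, 1.0, 10_001)
dx = x[1] - x[0]
u = np.sin(3 * np.pi * x)
v = np.exp(-x)

p = 3.0
q = p / (p - 1.0)                                  # conjugate exponent: 1/p + 1/q = 1
norm_p = (np.sum(np.abs(u) ** p) * dx) ** (1 / p)  # ||u||_{L^p}
norm_q = (np.sum(np.abs(v) ** q) * dx) ** (1 / q)  # ||v||_{L^q}
pairing = np.sum(u * v) * dx                       # <v, u>, the duality pairing

print(abs(pairing) <= norm_p * norm_q)             # True (Holder's inequality)
```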
  • Set Optimization with Respect to Variable Domination Structures
    Set optimization with respect to variable domination structures. Dissertation for the degree of Doctor of Natural Sciences (Dr. rer. nat.), Faculty of Natural Sciences II (Chemistry, Physics and Mathematics), Martin-Luther-Universität Halle-Wittenberg, submitted by Le Thanh Tam, born 05.05.1985 in Phu Tho. Reviewers: Prof. Dr. Christiane Tammer (Martin-Luther-Universität Halle-Wittenberg) and Prof. Dr. Akhtar Khan (Rochester Institute of Technology). Date of defense: 02.10.2018.
    List of Figures: 2.1 Set relations for t ∈ {l, u, pl, pu, cl, cu} in the sense of Definition 2.2.5; 4.1 Illustration for Example 4.2.4; 4.2 Illustration for Theorems 4.3.1, 4.3.2 and 4.3.5; 4.3 Illustration for Theorems 4.3.7 and 4.3.8; 5.1 Illustration for Example 5.1.7; 5.2 Illustration for Example 5.1.10; 5.3 Illustration for Example 5.1.11; 5.4 Illustration for Example 5.1.15; 5.5 A model of location optimization w.r.t. variable domination structures; 6.1 Illustration for Example 6.1.1; 6.2 Illustration for Example 6.1.2; 8.1 Dose response curves in lung cancer treatment; 8.2 Discretization of patient into voxels and of beam into bixels [32]; 8.3 Visualization of two outcome sets A, B ⊂ R² of an uncertain bicriteria optimization problem with undesired elements.
    Contents: 1 Introduction; 2 Preliminaries; 2.1 Binary Relations; 2.1.1 Linear Spaces, Topological Vector Spaces and Normed Spaces; 2.1.2 Order Relations on Linear Spaces.
  • arXiv:2104.13510v2 [math.FA]: Fenchel-Rockafellar Theorem in Infinite Dimensions via Generalized Relative Interiors
    Fenchel-Rockafellar Theorem in Infinite Dimensions via Generalized Relative Interiors. D. V. Cuong, B. S. Mordukhovich, N. M. Nam, G. Sandine. Abstract: In this paper we provide further studies of the Fenchel duality theory in the general framework of locally convex topological vector (LCTV) spaces. We prove the validity of the Fenchel strong duality under some qualification conditions via generalized relative interiors imposed on the epigraphs and the domains of the functions involved. Our results directly generalize the classical Fenchel-Rockafellar theorem on strong duality from finite dimensions to LCTV spaces. Key words: relative interior, quasi-relative interior, intrinsic relative interior, quasi-regularity, Fenchel duality. AMS subject classifications: 49J52, 49J53, 90C31.
    1. Introduction. Duality theory has a central role in optimization theory and its applications. From a primal optimization problem with the objective function defined in a primal space, a dual problem in the dual space is formulated with the hope that the new problem is easier to solve, while having a close relationship with the primal problem. Its important role in optimization makes the duality theory attractive for extensive research over the past few decades; see, e.g., [1, 2, 5, 6, 7, 8, 9, 10, 11, 13, 14, 18, 19, 20, 25, 26, 28] and the references therein. Given a proper convex function f and a proper concave function g defined on R^n, the classical Fenchel-Rockafellar theorem asserts that
    inf{ f(x) − g(x) | x ∈ R^n } = sup{ g_*(x*) − f*(x*) | x* ∈ R^n },   (1.1)
    under the assumption that ri(dom(f)) ∩ ri(dom(g)) ≠ ∅; see [27, Theorem 31.1].
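A brute-force, one-dimensional illustration of identity (1.1) can be useful; in the sketch below (not from the paper) f(x) = 0.5x² and g(x) = −|x|, the conjugates f*(x*) = sup_x{⟨x*, x⟩ − f(x)} and g_*(x*) = inf_x{⟨x*, x⟩ − g(x)} are approximated on a grid, and both sides of (1.1) come out equal to 0.

```python
import numpy as np

# f(x) = 0.5*x^2 is proper convex, g(x) = -|x| is proper concave.
x = np.linspace(-5.0, 5.0, 2001)
y = np.linspace(-5.0, 5.0, 2001)          # candidate dual variables x*

f = 0.5 * x**2
g = -np.abs(x)

primal = np.min(f - g)                    # inf_x { f(x) - g(x) }

# Conjugates by brute force on the grid: pairing[i, j] = y_i * x_j.
pairing = np.outer(y, x)
f_star = np.max(pairing - f, axis=1)      # f^*(y)  = sup_x <y, x> - f(x)
g_star = np.min(pairing - g, axis=1)      # g_*(y)  = inf_x <y, x> - g(x)

dual = np.max(g_star - f_star)            # sup_y { g_*(y) - f^*(y) }
print(primal, dual)                       # 0.0 0.0: strong duality as in (1.1)
```

Here dom(f) = dom(g) = R, so the qualification ri(dom(f)) ∩ ri(dom(g)) ≠ ∅ holds trivially and strong duality is expected.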
  • The Quadratic Sub-Lagrangian of Prox-Regular Functions
    University of Alberta. The Quadratic Sub-Lagrangian of Prox-Regular Functions. Warren L. Hare. A thesis submitted to the Faculty of Graduate Studies and Research in partial fulfillment of the requirements for the degree of Master of Science in Mathematics, Department of Mathematics, Edmonton, Alberta, Spring 2000.
    National Library of Canada / Bibliothèque nationale du Canada, Acquisitions and Bibliographic Services, 395 Wellington Street, Ottawa ON K1A 0N4, Canada. The author has granted a non-exclusive licence allowing the National Library of Canada to reproduce, loan, distribute or sell copies of this thesis in microform, paper or electronic formats. The author retains ownership of the copyright in this thesis. Neither the thesis nor substantial extracts from it may be printed or otherwise reproduced without the author's permission.
    Abstract: Given a finite valued convex function f, the U-Lagrangian, as defined by Lemaréchal, Oustry and Sagastizábal, provides an envelope of f that maintains these properties. By selecting the appropriate UV decomposition on R^n, one can also ensure that the U-Lagrangian is not only continuous at the origin, but differentiable there.
  • Lecture Slides on Convex Analysis and Optimization
    Lecture Slides on Convex Analysis and Optimization, based on 6.253 class lectures at the Mass. Institute of Technology, Cambridge, Mass., Spring 2014, by Dimitri P. Bertsekas (http://web.mit.edu/dimitrib/www/home.html). Based on the books (1) "Convex Optimization Theory," Athena Scientific, 2009, and (2) "Convex Optimization Algorithms," Athena Scientific, 2014 (in press). Supplementary material (solved exercises, etc.) at http://www.athenasc.com/convexduality.html
    Lecture 1: An Introduction to the Course. Lecture outline: the role of convexity in optimization; duality theory; algorithms and duality; course organization.
    History and prehistory. Prehistory, early 1900s - 1949: Caratheodory, Minkowski, Steinitz, Farkas; properties of convex sets and functions. Fenchel-Rockafellar era, 1949 - mid 1980s: duality theory; minimax/game theory (von Neumann); (sub)differentiability, optimality conditions, sensitivity. Modern era (paradigm shift), mid 1980s - present: nonsmooth analysis (a theoretical/esoteric direction); algorithms (a practical/high-impact direction); a change in the assumptions underlying the field.
    Optimization problems. Generic form: minimize f(x) subject to x ∈ C, with cost function f : R^n → R and constraint set C, e.g., C = X ∩ {x | h_1(x) = 0, ..., h_m(x) = 0} ∩ {x | g_1(x) ≤ 0, ..., g_r(x) ≤ 0}. Continuous vs discrete problem distinction. Convex programming problems are those for which f and C are convex: they are continuous problems; they are nice, and have beautiful and intuitive structure. However, convexity permeates
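To make the generic form concrete, here is a tiny convex instance, purely illustrative and not from the slides: minimize f(x) = ‖x − a‖² over the box C = [0, 1]ⁿ, solved with a few projected-gradient steps.

```python
import numpy as np

# minimize f(x) = ||x - a||^2  subject to  x in C = [0, 1]^n
a = np.array([1.5, -0.3, 0.4])
grad_f = lambda x: 2.0 * (x - a)
project_C = lambda x: np.clip(x, 0.0, 1.0)   # Euclidean projection onto the box

x = np.zeros_like(a)
for _ in range(200):
    x = project_C(x - 0.25 * grad_f(x))      # step 0.25 <= 1/L, with L = 2
print(x)                                     # [1.  0.  0.4], the projection of a onto C
```

Because both f and C are convex, any limit point of this iteration is a global minimizer, which is the "beautiful and intuitive structure" the slides allude to.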
  • Discrete Convexity and Log-Concave Distributions in Higher Dimensions
    Discrete Convexity and Log-Concave Distributions in Higher Dimensions. 1. Basic definitions and notation.
    1. R^n: the usual vector space of real n-tuples.
    2. Convex set: a subset C of R^n is said to be convex if λx + (1 − λ)y ∈ C for any x, y ∈ C and 0 < λ < 1.
    3. Convex hull: let S ⊆ R^n. The convex hull, denoted by conv(S), is the intersection of all convex sets containing S (or the set of all convex combinations of points of S).
    4. Convex function: a function f from C ⊆ R^n to (−∞, ∞], where C is a convex set (for example C = R^n). Then f is convex on C iff f((1 − λ)x + λy) ≤ (1 − λ)f(x) + λf(y), 0 < λ < 1.
    5. Log-concave: a function f is said to be log-concave if log f is concave (or −log f is convex).
    6. Epigraph of a function: let f : S ⊆ R^n → R ∪ {±∞}. The set {(x, μ) | x ∈ S, μ ∈ R, μ ≥ f(x)} is called the epigraph of f (denoted by epi(f)).
    7. The effective domain (denoted by dom(f)) of a convex function f on S is the projection on R^n of epi(f): dom(f) = {x | ∃μ, (x, μ) ∈ epi(f)} = {x | f(x) < ∞}.
    8. Lower semi-continuity: an extended real-valued function f given on a set S ⊂ R^n is said to be lower semi-continuous at a point x ∈ S if f(x) ≤ lim inf_{i→∞} f(x_i) for every sequence x_1, x_2, ... ∈ S such that x_i → x and the limit of f(x_1), f(x_2), ... exists in [−∞, ∞].
    2. Convex Analysis Preliminaries. In this section, we recall some basic concepts and results of Convex Analysis.
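The defining inequalities in items 4 and 5 can be verified numerically. The sketch below is illustrative only: it checks convexity of g(x) = 0.5‖x‖² and log-concavity of p(x) = exp(−g(x)) on random samples.

```python
import numpy as np

rng = np.random.default_rng(1)
g = lambda x: 0.5 * np.dot(x, x)     # convex, so p = exp(-g) is log-concave
p = lambda x: np.exp(-g(x))

# convexity:     g((1-l)x + l*y) <= (1-l)*g(x) + l*g(y)
# log-concavity: p((1-l)x + l*y) >= p(x)**(1-l) * p(y)**l
for _ in range(10_000):
    x, y = rng.normal(size=3), rng.normal(size=3)
    lam = rng.uniform()
    z = (1 - lam) * x + lam * y
    assert g(z) <= (1 - lam) * g(x) + lam * g(y) + 1e-9
    assert p(z) >= p(x) ** (1 - lam) * p(y) ** lam - 1e-12
print("both defining inequalities hold on all samples")
```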
  • Convex Analysis
    Convex Analysis. Taught by Professor Adrian Lewis. Notes by Mateo Díaz ([email protected]), Cornell University, Spring 2018. Last updated May 18, 2018. The latest version is online here. Notes template by David Mehrle.
    Contents: 1 Convex Sets. 2 Convex Functions: 2.1 Convexity and calculus (2.1.1 Gradients; 2.1.2 Recognizing smooth convexity); 2.2 Continuity of convex functions; 2.3 Subgradients. 3 Duality: 3.1 Fenchel duality; 3.2 Conic programs; 3.3 Convex calculus; 3.4 Lagrangian duality. 4 Algorithms: 4.1 Augmented Lagrangian method; 4.2 Proximal point method and fixed points; 4.3 Smooth minimization; 4.4 Splitting problems and alternating projections; 4.5 Proximal gradient; 4.6 Douglas-Rachford (4.6.1 Consensus optimization).
  • Lecture 13 Introduction to Quasiconvex Analysis
    Lecture 13: Introduction to Quasiconvex Analysis. Didier Aussel, Univ. de Perpignan. Outline of Lecture 13: I. Introduction. II. Normal approach: (a) first definitions; (b) adjusted sublevel sets and normal operator. III. Quasiconvex optimization: (a) optimality conditions; (b) convex constraint case; (c) nonconvex constraint case.
    Quasiconvexity. A function f : X → R ∪ {+∞} is said to be quasiconvex on K if, for all x, y ∈ K and all t ∈ [0, 1], f(tx + (1 − t)y) ≤ max{f(x), f(y)}; equivalently, if for all λ ∈ R the sublevel set S_λ = {x ∈ X : f(x) ≤ λ} is convex. A function f : X → R ∪ {+∞} is said to be semistrictly quasiconvex on K if f is quasiconvex and, for any x, y ∈ K, f(x) < f(y) implies f(z) < f(y) for all z ∈ [x, y[.
    I. Introduction. For f differentiable: f is quasiconvex iff df is quasimonotone, i.e., df(x)(y − x) > 0 ⇒ df(y)(y − x) ≥ 0. More generally, f is quasiconvex iff ∂f is quasimonotone, i.e., ∃ x* ∈ ∂f(x) : ⟨x*, y − x⟩ > 0 ⇒ ∀ y* ∈ ∂f(y), ⟨y*, y − x⟩ ≥ 0. Why not a subdifferential for quasiconvex programming? Because there is no (upper) semicontinuity of ∂f if f is not assumed to be Lipschitz, and there is no sufficient optimality condition: x̄ ∈ S_str(∂f, C) does not imply x̄ ∈ arg min_C f.
    II. Normal approach of quasiconvex analysis. (a) First definitions. A first approach. Sublevel sets: S_λ = {x ∈ X : f(x) ≤ λ} and the strict sublevel set {x ∈ X : f(x) < λ}. Normal operator: define N_f : X → 2^(X*) by N_f(x) = N(S_{f(x)}, x) = {x* ∈ X* : ⟨x*, y − x⟩ ≤ 0 for all y ∈ S_{f(x)}}.
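A small numerical illustration of these definitions (not from the lecture): f(x) = √|x| on R is quasiconvex, since its sublevel sets are intervals and f(tx + (1 − t)y) ≤ max{f(x), f(y)}, yet it is not convex; that gap between the two notions is exactly what quasiconvex analysis addresses.

```python
import numpy as np

rng = np.random.default_rng(2)
f = lambda x: np.sqrt(abs(x))   # quasiconvex on R (sublevel sets are intervals), not convex

quasi_violations, convexity_violations = 0, 0
for _ in range(10_000):
    x, y, t = rng.normal(), rng.normal(), rng.uniform()
    z = t * x + (1 - t) * y
    if f(z) > max(f(x), f(y)) + 1e-12:            # quasiconvexity inequality
        quasi_violations += 1
    if f(z) > t * f(x) + (1 - t) * f(y) + 1e-9:   # convexity inequality
        convexity_violations += 1
print(quasi_violations, convexity_violations)     # 0 quasiconvexity violations, many convexity violations
```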
  • Nonparametric Estimation of Multivariate Convex-Transformed Densities. DOI: 10.1214/10-AOS840SUPP
    The Annals of Statistics, 2010, Vol. 38, No. 6, 3751-3781. DOI: 10.1214/10-AOS840. © Institute of Mathematical Statistics, 2010. Nonparametric Estimation of Multivariate Convex-Transformed Densities. By Arseni Seregin and Jon A. Wellner, University of Washington.
    We study estimation of multivariate densities p of the form p(x) = h(g(x)) for x ∈ R^d, for a fixed monotone function h and an unknown convex function g. The canonical example is h(y) = e^{-y} for y ∈ R; in this case, the resulting class of densities P(e^{-y}) = {p = exp(−g) : g is convex} is well known as the class of log-concave densities. Other functions h allow for classes of densities with heavier tails than the log-concave class. We first investigate when the maximum likelihood estimator p̂ exists for the class P(h) for various choices of monotone transformations h, including decreasing and increasing functions h. The resulting models for increasing transformations h extend the classes of log-convex densities studied previously in the econometrics literature, corresponding to h(y) = exp(y). We then establish consistency of the maximum likelihood estimator for fairly general functions h, including the log-concave class P(e^{-y}) and many others. In a final section, we provide asymptotic minimax lower bounds for the estimation of p and its vector of derivatives at a fixed point x_0 under natural smoothness hypotheses on h and g. The proofs rely heavily on results from convex analysis.
    1. Introduction and background. 1.1. Log-concave and r-concave densities.
  • Convex Analysis and Nonlinear Optimization: Theory and Examples
    Convex Analysis and Nonlinear Optimization: Theory and Examples. Jonathan M. Borwein, Centre for Experimental and Constructive Mathematics, Department of Mathematics and Statistics, Simon Fraser University, Burnaby, B.C., Canada V5A 1S6 ([email protected], http://www.cecm.sfu.ca/~jborwein), and Adrian S. Lewis, Department of Combinatorics and Optimization, University of Waterloo, Waterloo, Ont., Canada N2L 3G1 ([email protected], http://orion.uwaterloo.ca/~aslewis). To our families.
    Contents: 0.1 Preface. 1 Background: 1.1 Euclidean spaces; 1.2 Symmetric matrices. 2 Inequality constraints: 2.1 Optimality conditions; 2.2 Theorems of the alternative; 2.3 Max-functions and first order conditions. 3 Fenchel duality: 3.1 Subgradients and convex functions; 3.2 The value function; 3.3 The Fenchel conjugate. 4 Convex analysis: 4.1 Continuity of convex functions; 4.2 Fenchel biconjugation; 4.3 Lagrangian duality. 5 Special cases: 5.1 Polyhedral convex sets and functions; 5.2 Functions of eigenvalues; 5.3 Duality for linear and semidefinite programming; 5.4 Convex process duality. 6 Nonsmooth optimization: 6.1 Generalized derivatives; 6.2 Nonsmooth regularity and strict differentiability; 6.3 Tangent cones; 6.4 The limiting subdifferential. 7 The Karush-Kuhn-Tucker theorem: 7.1 An introduction to metric regularity; 7.2 The Karush-Kuhn-Tucker theorem; 7.3 Metric regularity and the limiting subdifferential; 7.4 Second order conditions. 8 Fixed points: 8.1 Brouwer's fixed point theorem.