Using Path Spectra to Direct Function Cloning
Tom Way    Lori Pollock

Department of Computer and Information Sciences
University of Delaware, Newark, DE 19716
(302) 831-1953    (302) 831-8458 (Fax)
{way,pollock}@cis.udel.edu

Abstract

While function cloning can improve the precision of interprocedural analysis, and thus the opportunity for optimization, by changing the structure of the call graph, its successful application relies on the cloning decisions. This paper explores the use of program spectra comparisons for guiding cloning decisions. Our hypothesis is that this approach provides a good heuristic for determining which calls contribute different dynamic interprocedural information and thus suggests good candidates for cloning for the purpose of improving optimization.

Keywords: function cloning, path spectra, profile-guided optimization

1 Introduction

As modular and object-oriented programming become the norm, and architectures offer more available fine-grain parallelism, the importance of analysis and optimization across function boundaries continues to increase. While function inlining allows each function to be optimized within the separate context of each call site, its well-known limitation is the potentially exponential code growth and associated increase in compile time. The most common alternative is interprocedural data flow analysis, which avoids the code growth but is limited by the program's original calling structure, in particular by the conservative assumptions it must make at the convergence of paths in the call graph. Function cloning seeks to play an intermediary role by partitioning the calls in the program and creating multiple copies of the function body, one for each set of "closely related" calls.

This graph restructuring can create the opportunity to compute more precise information during interprocedural analysis, because each node representing a copy of the function has fewer incoming edges on which information must be merged during analysis; cloning also seeks to avoid the potentially exponential code growth of inlining.

This paper explores the use of path spectra to guide cloning decisions. By instrumenting the program, a profiler can report the frequency of execution of individual paths executed during a program run [2].¹ For a particular program run, the path spectrum is the set of paths executed along with their execution frequencies [14]. As an indication of program behavior, path spectra have been used successfully for program optimization [1, 8, 9, 10] as well as software maintenance and testing [14].

Our approach to making cloning decisions is based on comparing the program spectra for the execution of the same function by different calls to that function. Our hypothesis is that this approach provides a good heuristic for determining which calls contribute different dynamic interprocedural information and are thus good candidates for cloning. Path profiles indicate run-time control flow, which affects the actual dynamic data flow and data dependencies within the callee; path profiles for different calls to the same function indicate differences in the data values flowing into the function through parameters or via global variables. In addition, profile-guided optimization and path-qualified data flow analysis within the cloned functions may benefit from creating clones based on similar run-time paths through the function. The technique of using program spectra comparison for cloning decisions is different from, but could be used in combination with, the use of program profiling to uncover the most frequently executed calls to guide inlining decisions.

The objective of cloning decisions is threefold: (1) increase program performance, (2) manage compile time, and (3) control code growth. We have been evaluating our technique in terms of its effectiveness in making intelligent cloning decisions that will lead to improved optimization, as well as its associated overhead.

Our preliminary experiments have focused on comparing this approach to goal-directed cloning [11], which is based on static interprocedural data flow analysis directed toward particular optimizations. Thus far in our limited study, we have found that cloning based on path spectra comparisons can produce results comparable to those of goal-directed cloning for constant propagation, with compile-time and run-time analysis that can be reused for cloning performed for multiple optimization goals.

We first describe the related work on cloning, profiling, and path spectra comparisons. A simple motivating example for our approach is presented, followed by our profile-directed cloning algorithm. We describe our path spectra comparison technique, which is used to partition call sites for cloning. Finally, this method is compared to goal-directed cloning decisions on an example code, and we make some concluding remarks and summarize our future work.

¹Actually, the length of the paths that are profiled must be limited, as there can be an unbounded number of paths for general flow graphs and an exponential number for directed acyclic graphs (i.e., flow graphs without loop backedges included).

2 Related Work

Goal-directed cloning [6, 11] first solves a forward interprocedural data flow problem, with slight modification, in order to compute a set of cloning vectors for the particular data flow problem of interest at each call graph node. Cloning vectors which produce equivalent effects on the optimization of interest are merged, and finally the cloning is performed until the program size reaches some threshold. Cooper et al. [6] presented an experiment on the matrix300 code from release one of the SPEC benchmark suite, in which they showed that significant improvement in code quality could be attained by using this method to expose sufficient information to perform inlining and unroll-and-jam. The approach requires either knowledge of which optimizations would most likely benefit from cloning, in order to focus on that forward data flow problem, or a separate clone decision-making phase for each optimization of interest, followed by some heuristic to select among the clones suggested by each phase. Only goal-directed cloning for constant propagation has been investigated. It is not clear how easy it would be to construct a goal-directed partitioning algorithm for other optimizations, how much goal-directed cloning could improve the opportunity for other optimizations, or, even more challenging, how it would handle multiple simultaneous optimization goals.

A general framework for selective specialization, the equivalent of cloning for object-oriented languages, combines static analysis and profile data to identify the most profitable specializations [7]. This goal-directed specialization eliminates dynamic dispatches, providing significant improvements in performance and reduced code growth over customization [5], the previous state-of-the-art specialization technique. In dynamic compilation environments, cloning is sometimes performed on the fly as a statement is executed the first time [13]. To our knowledge, none of these techniques has used path spectra comparison in its cloning decisions.

Path profiling has been used successfully in compiler optimization [1, 9, 10]. Other basic types of control flow profiling are edge profiling, which measures the execution frequency of each individual flow graph edge, and basic block profiling, which measures how many times each basic block is executed. Edge profiles are good predictors of frequently executed paths (hot paths) for programs with a large amount of definite flow relative to total flow, while path profiles are better when there is less definite flow [3]. Path profiles can be collected efficiently and provide more accurate information than edge profiles [2]. In particular, different path profiles can result in the same edge profile, making it impossible to accurately compute the execution frequency of paths from edge profile information alone. In this paper, we use path profiling rather than edge profiling because we are interested in the differences, or rather the similarities, in the run-time behavior of different invocations of a given function. Differences in the path spectra obtained from two different calls to a function with different parameter values indicate differences in the execution states, and therefore in the function's behavior, due to the differences in the parameter values. Edge or basic block profiling could be used, but comparisons of spectra created from path profiles generally give better predictions of differences in run-time behavior.

When variables cannot be conservatively identified as constants at compile time, value profiling [4] can be used to determine whether they exhibit a high degree of invariant behavior at run time. Value profiling records information about the invariance of a variable, typically the top N values for an instruction and the number of occurrences of each of those values. In addition to other compilation and optimization decisions, this information can be used for duplicating and specializing code conditioned on particular high-frequency values. While value profiling will certainly aid cloning decisions aimed at specializing on constant values, we believe that cloning that utilizes path spectra differences, as proposed in this paper, may provide a more general, or at least complementary, approach to judging the run-time behavioral differences between separate calls to the same function, and enable cloning decisions.
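To make the notion of a path spectrum concrete, the sketch below collects one from a basic-block trace of a single run: each acyclic path (terminated at a loop backedge, in the style of Ball-Larus path profiling [2]) is recorded with its execution frequency. The trace format, block names, and function name are illustrative assumptions, not taken from the paper.

```python
from collections import Counter

def path_spectrum(block_trace, loop_heads):
    """Compute a path spectrum: the set of acyclic paths executed in a
    run, with their execution frequencies.  A path is terminated each
    time control returns to a loop head (a backedge), keeping the number
    of distinct paths bounded."""
    spectrum = Counter()
    path = []
    for block in block_trace:
        if block in loop_heads and path:
            spectrum[tuple(path)] += 1   # backedge taken: emit finished path
            path = []
        path.append(block)
    if path:
        spectrum[tuple(path)] += 1       # final partial path at exit
    return spectrum

# One run of a function whose flow graph loops back to block "B1":
trace = ["B0", "B1", "B2", "B4", "B1", "B2", "B4", "B1", "B3", "B4"]
spec = path_spectrum(trace, loop_heads={"B1"})
# The B2 side of the branch was taken twice, the B3 side once.
```

The resulting `Counter` maps each path to its frequency, which is exactly the per-run spectrum the comparison technique operates on.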
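The call-site partitioning step described above can be sketched as follows. The paper defers the details of its spectra comparison, so the distance metric here (a normalized frequency difference over the union of paths) and the greedy grouping threshold are purely illustrative assumptions; each resulting group of call sites would share one clone.

```python
from collections import Counter

def spectrum_distance(s1, s2):
    """Normalized difference between two path spectra: 0.0 when relative
    path frequencies match exactly, 1.0 when the path sets are disjoint."""
    n1, n2 = sum(s1.values()), sum(s2.values())
    paths = set(s1) | set(s2)
    return sum(abs(s1.get(p, 0) / n1 - s2.get(p, 0) / n2) for p in paths) / 2

def partition_call_sites(spectra, threshold=0.25):
    """Greedily group call sites whose spectra fall within `threshold`
    of a group representative; each group becomes one clone."""
    groups = []  # list of (representative spectrum, [call sites])
    for site, spec in spectra.items():
        for rep, members in groups:
            if spectrum_distance(rep, spec) <= threshold:
                members.append(site)
                break
        else:
            groups.append((spec, [site]))
    return [members for _, members in groups]

# Hypothetical spectra for three calls to the same function: the first
# two exercise the function similarly, the third behaves differently.
spectra = {
    "call1": Counter({("B1", "B2"): 9, ("B1", "B3"): 1}),
    "call2": Counter({("B1", "B2"): 8, ("B1", "B3"): 2}),
    "call3": Counter({("B1", "B3"): 10}),
}
groups = partition_call_sites(spectra)
```

Here `call1` and `call2` land in one group (distance 0.1) while `call3` gets its own clone (distance 0.9 from the first group's representative).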
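The claim above that different path profiles can collapse to the same edge profile is easy to demonstrate on a hypothetical flow graph with two successive branches: one profile correlates the branch outcomes, the other anti-correlates them, yet every edge receives identical counts.

```python
from collections import Counter

def edge_profile(path_profile):
    """Project a path profile down to an edge profile by charging each
    path's frequency to every edge along the path."""
    edges = Counter()
    for path, count in path_profile.items():
        for src, dst in zip(path, path[1:]):
            edges[(src, dst)] += count
    return edges

# Flow graph A -> {B, C} -> D -> {E, F} -> G with two successive branches.
# Profile 1 correlates the branches; profile 2 anti-correlates them.
p1 = {("A", "B", "D", "E", "G"): 5, ("A", "C", "D", "F", "G"): 5}
p2 = {("A", "B", "D", "F", "G"): 5, ("A", "C", "D", "E", "G"): 5}
# Both profiles induce exactly the same edge counts, so the edge profile
# cannot recover which paths actually ran -- the ambiguity noted in [2].
```

Given only the edge profile, the frequencies of the four individual paths are unrecoverable, which is why spectra built from path profiles discriminate run-time behavior better than edge counts.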
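The value-profiling idea summarized above (recording the top N values per instruction and their occurrence counts [4]) can be sketched as a small profiler; the class, its method names, and the instruction identifiers are hypothetical, intended only to illustrate the invariance measurement.

```python
from collections import Counter

class ValueProfiler:
    """Record, per instruction, the most frequently observed operand
    values and the operand's invariance: the fraction of executions
    that saw the single most common value."""
    def __init__(self, top_n=4):
        self.top_n = top_n
        self.counts = {}  # instruction id -> Counter of observed values

    def record(self, instr_id, value):
        self.counts.setdefault(instr_id, Counter())[value] += 1

    def top_values(self, instr_id):
        return self.counts[instr_id].most_common(self.top_n)

    def invariance(self, instr_id):
        c = self.counts[instr_id]
        return c.most_common(1)[0][1] / sum(c.values())

# A parameter load that sees the value 8 on 9 of 10 executions:
vp = ValueProfiler()
for v in [8, 8, 8, 8, 8, 8, 8, 8, 8, 3]:
    vp.record("load_r1", v)
```

A high invariance score (here 0.9 for the value 8) is what would justify specializing a clone on that value, whereas path spectra comparison captures behavioral differences even when no single value dominates.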