ClangJIT: Enhancing C++ with Just-in-Time Compilation

Hal Finkel
Leadership Computing Facility, Argonne National Laboratory, Lemont, IL 60439, USA

David Poliakoff
Sandia National Laboratories, Albuquerque, NM 87123, USA

Jean-Sylvain Camier and David F. Richards
Lawrence Livermore National Laboratory, Livermore, CA 94550, USA

Abstract—The C++ programming language is not only a keystone of the high-performance-computing ecosystem but has proven to be a successful base for portable parallel-programming frameworks. As is well known, C++ programmers use templates to specialize algorithms, thus allowing the compiler to generate highly-efficient code for specific parameters, data structures, and so on. This capability has been limited to those specializations that can be identified when the application is compiled, and in many critical cases, compiling all potentially-relevant specializations is not practical. ClangJIT provides a well-integrated C++ language extension allowing template-based specialization to occur during program execution. This capability has been implemented for use in large-scale applications, and we demonstrate that just-in-time-compilation-based dynamic specialization can be integrated into applications, often requiring minimal changes (or no changes) to the applications themselves, providing significant performance improvements, programmer-productivity improvements, and decreased compilation time.

Index Terms—C++, Clang, LLVM, Just-in-Time, Specialization

I. INTRODUCTION AND RELATED WORK

The C++ programming language is well-known for its design doctrine of leaving no room below it for another portable programming language [1]. As compiler technology has advanced, however, it has become clear that using just-in-time (JIT) compilation to provide runtime specialization can provide performance benefits practically unachievable with purely ahead-of-time (AoT) compilation. Some of the benefits of runtime specialization can be realized using aggressive multiversioning, both manual and automatic, along with runtime dispatch across these pre-generated code variants (this technique has been applied for many decades; e.g., [2]). This technique, however, comes with a combinatorial compile-time cost and, as a result, is practically limited. Moreover, most of this compile-time cost from multiversioning is wasted when only a small subset of the specialized variants is actually used (as is often the case in practice). This paper introduces ClangJIT, an extension to the Clang C++ compiler which integrates just-in-time compilation into the otherwise-ahead-of-time-compiled C++ programming language. ClangJIT allows programmers to take advantage of their existing body of C++ code but, critically, to defer the generation and optimization of template specializations until runtime using a relatively-natural extension to the core C++ programming language.
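As a concrete, purely illustrative sketch of the multiversioning-plus-dispatch approach described above (the kernel and the chosen set of variants are assumptions, not taken from the paper), consider a kernel specialized on a compile-time extent. Every variant that might be needed must be compiled ahead of time and selected by runtime dispatch, and the number of variants, and hence the compile-time cost, grows combinatorially with each additional specialization dimension:

// Illustrative AoT multiversioning: a kernel specialized on a
// compile-time extent, with runtime dispatch across the variants
// that were pre-generated at compile time.
template <int N>
void scale(double *out, const double *in) {
  for (int i = 0; i < N; ++i)
    out[i] = 2.0 * in[i];
}

void scale_dispatch(int n, double *out, const double *in) {
  switch (n) {
  case 4:  scale<4>(out, in);  break;
  case 8:  scale<8>(out, in);  break;
  case 16: scale<16>(out, in); break;
  default: // any extent not anticipated at compile time falls back to generic code
    for (int i = 0; i < n; ++i) out[i] = 2.0 * in[i];
  }
}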
A significant design requirement for ClangJIT is that the runtime-compilation process not explicitly access the file system (only loading data from the running binary is permitted), which allows for deployment within environments where file-system access is either unavailable or prohibitively expensive. In addition, this requirement maintains the redistributability of binaries using the JIT-compilation features (i.e., they can run on systems where the source code is unavailable). For example, on large HPC deployments, especially on supercomputers with distributed file systems, extreme care is required whenever many simultaneously-running processes might access many files on the file system, and ClangJIT should elide these concerns altogether.

Moreover, as discussed below, ClangJIT achieves another important design goal of maximal, incremental reuse of the state of the compiler. As is well known, compiling C++ code, especially when many templates are involved, can be a slow process. ClangJIT does not start this process afresh for the entire translation unit each time a new template instantiation is requested. Instead, it starts with the state generated by the AoT compilation process and, from there, adds to it incrementally as new instantiations are required. This minimizes the time associated with the runtime-compilation process.

The hypothesis of this work is that, for production-relevant C++ libraries used for this kind of specialization, incremental JIT compilation can produce performance benefits while simultaneously decreasing compilation time. The initial evaluations presented in Section V confirm this hypothesis, and future work will explore applicability and benefits in more detail.

A. Related Work

Clang, and thus ClangJIT, are built on top of the LLVM compiler infrastructure [3]. The LLVM compiler infrastructure has been specifically designed to support both AoT and JIT compilation, and has been used to implement JIT compilers for a variety of languages, both general purpose (e.g., Java [4], Haskell [5], Lua [6], Julia [7]) and domain specific (e.g., TensorFlow/XLA [8]). In addition, C++ libraries implemented using LLVM to provide runtime specialization for specific domains are not uncommon (e.g., TensorFlow/XLA, Halide [9]).

Several existing projects have been developed to add dynamic capabilities to the C++ programming language [10]. A significant number of these rely on running the compiler as a separate process in order to generate a shared library, and that shared library is then dynamically loaded into the running process (e.g., [11], [12]). Unfortunately, not only do such systems require being able to execute a compiler with access to the relevant source files at runtime, but careful management is required to constrain the cost of the runtime-compilation processes, because each time the compiler is spawned it must start processing its inputs afresh. NVIDIA's CUDA Toolkit contains NVRTC [13], allowing the compilation of CUDA code provided as a string at runtime. The Easy::JIT project [14] adds to Clang a limited JIT-based runtime-specialization capability for C++ lambda functions, but this specialization is limited to parameter values, because the types are fixed during AoT compilation (a fork of Easy::JIT known as atJIT [15] adds some autotuning capabilities). LambdaJIT [16] also provides for the dynamic compilation of lambda functions, and also their automated parallelization for GPUs. NativeJIT [17] provides an in-process JIT for a subset of C for x86_64. The closest known work to ClangJIT is CERN's Cling [18] project. Cling also implements a JIT for C++ code using a modified version of Clang, but Cling's goals are very different from the goals of ClangJIT. Cling effectively turns C++ into a JIT-compiled scripting language, including a REPL interface. ClangJIT, on the other hand, is designed for high-performance, incremental compilation of template instantiations using only information embedded in the hosting binary. Cling provides a dynamic compiler for C++ code, while ClangJIT provides a language extension for embedded JIT compilation, and as a result, the two serve different use cases and have significantly-different implementations.

Compared to previous work, ClangJIT embodies the following technical advances: First, ClangJIT provides a C++ JIT-compilation feature that naturally integrates with the C++ language, thus providing a "single source" solution, while still allowing for dynamic type creation. Previous work relies on providing C++ source code directly to the JIT-compilation engine. Second, ClangJIT provides [...]
The user can enable JIT-compilation support in the compiler simply by using the command-line flag -fjit. Using this flag, both when compiling and when linking, is all that should be necessary to make using the JIT-compilation features possible. By itself, however, the command-line flag does not enable any use of JIT compilation. To do that, function templates can be tagged for JIT compilation by using the C++ attribute [[clang::jit]]. An attributed function template provides additional features and restrictions. These features are:

• Instantiations of this function template will not be constructed at compile time; rather, calling a specialization of the template, or taking the address of a specialization of the template, will trigger the instantiation and compilation of the template at runtime.

• Non-constant expressions may be provided for the non-type template parameters, and these values will be used at runtime to construct the type of the requested instantiation. See Listing 1 for a simple example.

• Type arguments to the template can be provided as strings. If the argument is implicitly convertible to a const char *, then that conversion is performed, and the result is used to identify the requested type. Otherwise, if an object is provided, and that object has a member function named c_str(), and the result of that function can be converted to a const char *, then the call and conversion (if necessary) are performed in order to get a string used to identify the type. The string is parsed and analyzed to identify the type in the declaration context of the parent of the function triggering the instantiation. Whether types defined after the point in the source code that triggers the instantiation are available is not specified. See Listing 2 for a demonstration of this functionality, with Listing 3 showing some example output.

Listing 1: A JIT "hello world".

#include <iostream>
#include <cstdlib>

template <int x>
[...]
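Since the listing above is cut off after its template header, the following is a rough sketch of how a complete [[clang::jit]] "hello world" might look, based on the features described above; the function name, message, and argument handling are illustrative assumptions rather than the paper's exact code.

#include <cstdlib>
#include <iostream>

template <int x>
[[clang::jit]] void hello() {
  // Inside the specialization, x behaves as a compile-time constant,
  // even though its value was only known when the program ran.
  std::cout << "Hello from a runtime-compiled specialization, x = " << x << "\n";
}

int main(int argc, char *argv[]) {
  // With ClangJIT, a non-constant value may be used as a non-type
  // template argument; hello<a> is instantiated and compiled the
  // first time this call executes.
  int a = (argc > 1) ? std::atoi(argv[1]) : 3;
  hello<a>();
  return 0;
}

Per the description above, such a program would be built with -fjit at both compile and link time. Similarly, a type argument could be supplied at runtime as a string naming the type (for example, via a std::string with a c_str() member), as demonstrated by Listing 2.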
