SWIFT FOR TENSORFLOW: A PORTABLE, FLEXIBLE PLATFORM FOR DEEP LEARNING

Brennan Saeta 1, Denys Shabalin 1, Marc Rasi 1, Brad Larson 1, Xihui Wu 1, Parker Schuh 1, Michelle Casbon 1, Daniel Zheng 1, Saleem Abdulrasool 1, Aleksandr Efremov 1, Dave Abrahams 1, Chris Lattner 2, Richard Wei 2

1 Google Research, Brain. 2 Work done at Google Research, Brain. Correspondence to: Brennan Saeta <[email protected]>, Denys Shabalin <[email protected]>. Proceedings of the 4th MLSys Conference, San Jose, CA, USA, 2021. Copyright 2021 by the author(s).

ABSTRACT

Swift for TensorFlow is a deep learning platform that scales from mobile devices to clusters of hardware accelerators in data centers. It combines a language-integrated automatic differentiation system and multiple Tensor implementations within a modern ahead-of-time compiled language oriented around mutable value semantics. The resulting platform has been validated through use in over 30 deep learning models and has been employed across data center and mobile applications.

1 INTRODUCTION

Deep learning has demonstrated remarkable gains in performance across diverse tasks including game playing (Silver et al., 2016; Vinyals et al., 2019), image understanding (He et al., 2016; Szegedy et al., 2016), natural language understanding (Devlin et al., 2018), and beyond. Training modern deep learning models from scratch—a requirement during the development of new neural network architectures—requires enormous amounts of compute (Amodei et al., 2018). In practice, neural networks are trained on clusters containing up to thousands of hardware accelerators spread across a supporting data center.

Deploying machine-learned models on edge devices can deliver functionality and low latency without internet connectivity. The popularity of modern edge devices has motivated research on models for limited hardware (compute & memory capacity) and energy constraints (Howard et al., 2017; Tan & Le, 2019). Additionally, many models can be fine-tuned directly on a user's device without copying personal data over a network. While datacenter-scale training focuses on peak throughput, mobile apps optimize for startup and execution time to deliver fluid user experiences.

Day to day, machine learning practitioners are likely to develop their models on a single machine with a dedicated GPU for hardware acceleration. Thanks to advancements in transfer learning, recent models (e.g. BERT (Devlin et al., 2018)) have been explicitly designed with pre-training in mind. By starting from a pre-trained checkpoint, effective models can be trained on one desktop GPU.

Today, tools for training deep neural networks at datacenter scale are often embedded in dynamically typed languages (Abadi et al., 2016; Paszke et al., 2019; Bradbury et al., 2020; Innes et al., 2019) and rely on Just-In-Time (JIT) compilation to obtain optimal performance. On the other end of the spectrum, libraries for neural network inference on mobile devices are compiled before installation. Deploying neural networks onto devices often involves a translation step from the datacenter system to the on-device execution engine.
In this paper, we present Swift for TensorFlow, a platform for machine learning. The combination of Swift's Ahead-of-Time (AOT) compilation, support for mutable value semantics, and convenient syntax extended with language-integrated automatic differentiation yields a surprisingly effective platform that scales from mobile phones to distributed accelerator clusters. This work demonstrates the value of exploring beyond dynamically typed languages for deep learning. More concretely, our contributions are:

• Language-integrated automatic differentiation (AD) (Section 2). We extend Swift with a source-to-source compile-time transformation to automatically generate derivatives for arbitrary Swift functions. The AD system is not coupled with the Tensor implementation; it can be used with any type that conforms to the Differentiable protocol and any @differentiable annotated function.

• Mutable value semantics (Section 4). We explore mutable value semantics—a combination of value semantics with in-place mutation that supports local reasoning and referential transparency—in the context of machine learning applications. Swift for TensorFlow's APIs serve as a case study of how this approach can be applied to provide simple yet powerful APIs.
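The mutable value semantics named in the second contribution can be illustrated with a small sketch (not taken from the paper): assigning a plain Swift struct copies the value, so mutating one copy in place never affects another, which is what enables local reasoning.

```swift
// Sketch: mutable value semantics with an ordinary Swift struct.
// `Point` is a hypothetical type introduced for illustration.
struct Point {
    var x: Double
    var y: Double
}

var a = Point(x: 0, y: 0)
var b = a       // assignment copies the whole value
b.x = 10        // in-place mutation of `b` only
print(a.x)      // 0.0 -- `a` is unaffected by the mutation of `b`
print(b.x)      // 10.0
```

Because `b` is an independent value rather than a reference to shared state, a reader can reason about each variable locally, without tracking aliases.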
We provide a broad evaluation (Section 5) of how the same programming model can scale across a wide range of environments from datacenter supercomputers to low-power mobile devices.

2 AUTOMATIC DIFFERENTIATION

The Swift for TensorFlow project extended the Swift language to support compile-time source-to-source automatic differentiation (AD) (Wei et al., 2019). The language extensions allow library authors to define "differential operators," which are ordinary Swift higher order functions that compute derivatives of passed-in functions. For example, we added a gradient function to the Swift standard library that evaluates the gradient of a scalar-valued function at a given point. When a call to gradient(at: 0, in: f) is compiled, a compiler stage synthesizes a function that can be evaluated at runtime to return the derivative of f, which gradient calls. Additionally, we support differentiation of arbitrary user-defined types so long as they satisfy a few requirements (Figure 1). Our language extensions have been contributed upstream and have been successfully merged into the main branch of the language & compiler.

    protocol Differentiable {
      associatedtype TangentVector: AdditiveArithmetic
      mutating func move(along direction: TangentVector)
    }

Figure 1. Differentiable protocol definition.

    func gradient<A: Differentiable>(
      at x: A,
      in f: @differentiable (A) -> Float
    ) -> A.TangentVector

Figure 2. Gradient function declaration.
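To make the workflow concrete, here is a hedged sketch of calling the gradient operator declared in Figure 2. It assumes a toolchain on which the paper's language extensions and the standard-library gradient function are available; the argument labels follow Figure 2.

```swift
// Differentiates f(x) = x*x + 3*x at x = 2 using the gradient
// operator from Figure 2. The unannotated closure is implicitly
// promoted to a @differentiable function value.
// Analytically, d/dx (x*x + 3*x) = 2*x + 3, which is 7 at x = 2.
let g = gradient(at: 2.0 as Float, in: { x in x * x + 3 * x })
// g has type Float.TangentVector (== Float) and holds 7.0
```

Here A is Float, whose TangentVector is Float itself, so the result is an ordinary scalar derivative.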
2.1 Language extensions

The key language building blocks for differentiation are:

• The Differentiable protocol,1 which encodes requirements on the parameter and return types of differentiable functions (Figure 1), and

• The @differentiable (A) -> B function type family, which bundles information about how to compute the function's derivative. We call instances of these types differentiable function values.

1 Swift protocols are similar to Haskell typeclasses.

Every Differentiable type has an associated TangentVector type, inspired by differential geometry. Differentiable values represent points on differentiable manifolds, and TangentVector values represent vectors in the tangent spaces of those manifolds. Differential geometry defines derivatives between manifolds as linear functions between the tangent spaces of these manifolds. Consider the function f: (A) -> Float. The gradient of f at a point is a vector in A.TangentVector. When A is the flat manifold R^n, A.TangentVector is R^n and we recover the gradient from multivariable calculus.

A Differentiable type also requires a move method that moves a value by the distance in the direction indicated by a TangentVector. This is known as the "exponential map" in differential geometry.

To improve ergonomics, we automatically promote functions and closures to their @differentiable counterparts based on their use within their surrounding module. When type checking encounters a function value in a context that requires a differentiable function value, it inserts an implicit conversion and flags the original function for compile-time differentiation. For example, users can pass an unannotated closure to the standard library gradient operator shown in Figure 2.

A differentiable function value is a bundle containing the original function value and Jacobian-vector product (JVP) and vector-Jacobian product (VJP) "derivative function" values (Figure 3). Each derivative function returns a pair of the computed value together with a closure called a differential or pullback, respectively. These derivative functions are taken from previous work (Maclaurin, 2016) and are particularly close to the JVP and VJP functions in (Bradbury et al., 2020; Innes et al., 2019).
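As a sketch of the requirements in Figure 1, a user-defined type can opt into differentiation by declaring conformance to Differentiable. For a simple struct whose stored properties are themselves differentiable, the TangentVector type and move(along:) method can be synthesized by the compiler, as the paper's reference to "arbitrary user-defined types" suggests. The Perceptron type below is a hypothetical example, not taken from the paper.

```swift
// Hypothetical user-defined differentiable type. The compiler
// synthesizes TangentVector (a struct with `weight` and `bias`
// tangent components) and move(along:) from the stored properties.
struct Perceptron: Differentiable {
    var weight: Float
    var bias: Float

    // Marked @differentiable so callers can take derivatives of
    // the model with respect to its parameters and its input.
    @differentiable
    func callAsFunction(_ input: Float) -> Float {
        weight * input + bias
    }
}
```

A gradient of a loss computed through callAsFunction would then be a Perceptron.TangentVector carrying one tangent component per parameter.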
The JVP implements forward mode differentiation, and the VJP implements reverse mode differentiation.

The AD code transformation transforms a function f into derivative functions that are implemented in terms of derivative functions of f's callees. The transformation recursively transforms the callees to get their derivative functions. This recursion requires a base case of known derivative functions. Our approach allows fully-customizable base derivative functions via a @derivative(of:) attribute. Users write custom derivative functions with this attribute to register the base case, and the code transformation terminates the recursion whenever it encounters a function with a user-specified custom derivative.

    Original function:      (A) -> B
    JVP (Forward mode AD):  (A) -> (B, (A.TangentVector) -> B.TangentVector)
    VJP (Reverse mode AD):  (A) -> (B, (B.TangentVector) -> A.TangentVector)

Figure 3. The three elements of a differentiable function value and their types.

2.2 Code transformation

The differentiation code transformation operates on the Swift Intermediate Language (SIL), an intermediate representation (IR) in static single assignment form.
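The base-case registration described above can be sketched as follows. A hypothetical mySquare function is given a hand-written VJP whose pullback has the shape given for VJPs in Figure 3, (B.TangentVector) -> A.TangentVector, here simply (Float) -> Float.

```swift
// Hypothetical primitive whose derivative we register by hand.
func mySquare(_ x: Float) -> Float {
    x * x
}

// Custom VJP registered with @derivative(of:). When the recursive
// code transformation reaches mySquare, it uses this derivative
// instead of transforming the body. The pullback scales the
// incoming tangent by d/dx x*x = 2x.
@derivative(of: mySquare)
func mySquareVJP(_ x: Float)
    -> (value: Float, pullback: (Float) -> Float) {
    (value: mySquare(x), pullback: { v in 2 * x * v })
}
```

Returning the original value alongside the pullback matches the pair structure described above: the forward result plus a closure that maps output tangents back to input tangents.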
SIL is designed specifically for Swift's compiler transformations. The AD code transformation at AOT-compile time was first explored in Swift for TensorFlow. Enzyme (Moses & Churavy, 2020) is a successor system that shares a similar set of the advantages, only targeting LLVM IR.
