Fashionable Modelling with Flux

Michael J Innes, Julia Computing, Inc., Edinburgh, UK
Elliot Saba, Julia Computing, Inc., Cambridge, MA, USA
Keno Fischer, Julia Computing, Inc., Cambridge, MA, USA
Dhairya Gandhi, Julia Computing, Inc., Bangalore, India
Marco Concetto Rudilosso, University College London, London, UK
Neethu Mariya Joy, Birla Institute of Technology and Science, Pilani, India
Tejan Karmali, National Institute of Technology, Goa, India
Avik Pal, Indian Institute of Technology, Kanpur, India
Viral B. Shah, Julia Computing, Inc., Cambridge, MA, USA
Abstract

Machine learning as a discipline has seen an incredible surge of interest in recent years due in large part to a perfect storm of new theory, superior tooling, and renewed interest in its capabilities. We present in this paper a framework named Flux that shows how further refinement of the core ideas of machine learning, built upon the foundation of the Julia programming language, can yield an environment that is simple, easily modifiable, and performant. We detail the fundamental principles of Flux as a framework for differentiable programming, give examples of models that are implemented within Flux to display many of the language and framework-level features that contribute to its ease of use and high productivity, display internal compiler techniques used to enable the acceleration and performance that lies at the heart of Flux, and finally give an overview of the larger ecosystem that Flux fits inside of.
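To give a flavour of the style the abstract describes, the following is a minimal, illustrative sketch of defining and applying a model in Flux; the layer sizes and input data are hypothetical and chosen only for demonstration, and models in the paper's own examples may differ.

```julia
using Flux

# A small multilayer perceptron assembled from Flux's layer primitives.
# Chain, Dense, relu and softmax are ordinary Julia values and functions.
model = Chain(Dense(10, 32, relu), Dense(32, 2), softmax)

# Applying the model is plain Julia function application.
x = rand(Float32, 10)   # hypothetical 10-dimensional input
y = model(x)             # 2-element vector of class probabilities
```

Because the model is just Julia code, it composes with the language's existing tooling and can be differentiated and run on accelerators, which the body of the paper explains in detail.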