Deep Learning with the Theano Python Library


Hands-on Lab: Deep Learning with the Theano Python Library

Frédéric Bastien
Montreal Institute for Learning Algorithms, Université de Montréal, Montréal, Canada
[email protected]
Presentation prepared with Pierre Luc Carrier and Arnaud Bergeron
GTC 2016

Slides
- PDF of the slides: https://goo.gl/z0tynd
- GitHub repo of this presentation: https://github.com/nouiz/gtc2016/

Outline
- Introduction
- Theano: Compiling/Running, Modifying expressions, GPU, Debugging
- Models: Logistic Regression, Convolution
- Exercises
- End

High level

Python <- {NumPy/SciPy/libgpuarray} <- Theano <- {...}

- Python: object-oriented coding language
- NumPy: n-dimensional array object and scientific computing toolbox
- SciPy: sparse matrix objects and more scientific computing functionality
- libgpuarray: GPU n-dimensional array object in C for CUDA and OpenCL
- Theano: compiler/symbolic graph manipulation (not a machine learning framework/software)
- {...}: many libraries built on top of Theano

What Theano provides
- Lazy evaluation for performance
- GPU support
- Symbolic differentiation
- Automatic speed and stability optimization

High level
Many machine learning libraries are built on top of Theano:
- Keras
- Blocks
- Lasagne
- sklearn-theano
- PyMC 3
- theano-rnn
- Morb
- ...

Goal of the stack
- Fast to develop
- Fast to run

Some models built with Theano
- Neural networks
- Convolutional neural networks: AlexNet, OverFeat, GoogLeNet
- RNN, CTC, LSTM, GRU
- NADE, RNADE, MADE
- Autoencoders
- Generative Adversarial Nets
- SVMs
- Many variations of the above models, and more

Project status?
- Mature: Theano has been developed and used since January 2008 (8 years old)
- Has driven hundreds of research papers
- Good user documentation
- Active mailing list with worldwide participants
- Core technology for Silicon Valley start-ups
- Many contributors (some from outside our institute)
- Used to teach many university classes
- Has been used for research at big companies
- Theano 0.8 released March 21st, 2016

Theano: deeplearning.net/software/theano/
Deep Learning Tutorials: deeplearning.net/tutorial/

Python
- General-purpose, high-level, object-oriented, interpreted language
- Emphasizes code readability
- Comprehensive standard library
- Dynamic typing and memory management
- Easily extensible with C
- Slow execution
- Popular in the web development and scientific communities

NumPy/SciPy
- NumPy provides an n-dimensional numeric array in Python:
  - Perfect for high-performance computing
  - Slices of arrays are views (no copying)
  - Elementwise computations
  - Linear algebra, Fourier transforms
  - Pseudorandom number generators (many distributions)
- SciPy provides lots more, including:
  - Sparse matrices
  - More linear algebra
  - Solvers and optimization algorithms
  - MATLAB-compatible I/O
  - I/O and signal processing for images and audio

Description

High-level domain-specific language for numeric computation.
- Syntax as close to NumPy as possible
- Compiles most common expressions to C for CPU and/or GPU
- Limited expressivity means more opportunities for optimizations
- Strongly typed -> compiles to C
- Array oriented -> easy parallelism
- Support for looping and branching in expressions
- No subroutines -> global optimization
- Automatic speed and numerical stability optimizations

Description (2)
- Symbolic differentiation and R op (Hessian-free optimization)
- Can reuse other technologies for best performance: CUDA, cuBLAS, cuDNN, BLAS, SciPy, PyCUDA, Cython, Numba, ...
- Sparse matrices (CPU only)
- Extensive unit-testing and self-verification
- Extensible (you can create new operations as needed)
- Works on Linux, OS X and Windows
- Multi-GPU
- New GPU back-end: float16 support (needs CUDA 7.5), multiple dtypes, multi-GPU support in the same process

Simple example

    import theano

    # declare symbolic variable
    a = theano.tensor.vector("a")
    # build symbolic expression
    b = a + a ** 10
    # compile function
    f = theano.function([a], b)
    # execute with numerical values
    print(f([0, 1, 2]))  # prints array([0, 2, 1026])

Overview of library

Theano is many things:
- Language
- Compiler
- Python library

Scalar math

Some examples of scalar operations:

    import theano
    from theano import tensor as T

    x = T.scalar()
    y = T.scalar()
    z = x + y
    w = z * x
    a = T.sqrt(w)
    b = T.exp(a)
    c = a ** b
    d = T.log(c)

Vector math

    from theano import tensor as T

    x = T.vector()
    y = T.vector()
    # scalar math applied elementwise
    a = x * y
    # vector dot product
    b = T.dot(x, y)
    # broadcasting (as in NumPy, very powerful)
    c = a + b

Matrix math

    from theano import tensor as T

    x = T.matrix()
    y = T.matrix()
    a = T.vector()
    # matrix-matrix product
    b = T.dot(x, y)
    # matrix-vector product
    c = T.dot(x, a)

Tensors

Using Theano:
- Dimensionality is defined by the length of the broadcastable argument
- Can add (or apply other elementwise ops to) two tensors with the same dimensionality
- Tensors are duplicated along broadcastable axes to make sizes match

    from theano import tensor as T

    tensor3 = T.TensorType(
        broadcastable=(False, False, False),
        dtype='float32')
    x = T.tensor3()

Reductions

    from theano import tensor as T

    tensor3 = T.TensorType(
        broadcastable=(False, False, False),
        dtype='float32')
    x = tensor3()
    total = x.sum()
    marginals = x.sum(axis=(0, 2))
    mx = x.max(axis=1)

Dimshuffle

    from theano import tensor as T

    tensor3 = T.TensorType(
        broadcastable=(False, False, False))
    x = tensor3()
    y = x.dimshuffle((2, 1, 0))
    a = T.matrix()
    b = a.T
    # same as b
    c = a.dimshuffle((1, 0))
    # adding to a larger tensor
    d = a.dimshuffle((0, 1, 'x'))
    e = a + d

Indexing

As in NumPy!
This means slices and index selection return views.

    # return views, supported on GPU
    a_tensor[int]
    a_tensor[int, int]
    a_tensor[start:stop:step, start:stop:step]
    a_tensor[::-1]  # reverse the first dimension

    # advanced indexing, returns copies
    a_tensor[an_index_vector]  # supported on GPU
    a_tensor[an_index_vector, an_index_vector]
    a_tensor[int, an_index_vector]
    a_tensor[an_index_tensor, ...]

Compiling and running expressions
- theano.function
- shared variables and updates
- compilation modes

theano.function

    >>> from theano import tensor as T
    >>> x = T.scalar()
    >>> y = T.scalar()
    >>> from theano import function
    >>> # first arg is the list of SYMBOLIC inputs
    >>> # second arg is the SYMBOLIC output
    >>> f = function([x, y], x + y)
    >>> # call it with NUMERICAL values
    >>> # get a NUMERICAL output
    >>> f(1., 2.)
    array(3.0)

Shared variables
- It's hard to do much with purely functional programming
- Shared variables add just a little bit of imperative programming
- A shared variable is a buffer that stores a numerical value for a Theano variable
- You can write to as many shared variables as you want, once each, at the end of the function
- Values can be modified outside the Theano function with the get_value() and set_value() methods

Shared variable example

    >>> from theano import shared
    >>> x = shared(0.)
    >>> updates = [(x, x + 1)]
    >>> f = function([], updates=updates)
    >>> f()
    >>> x.get_value()
    1.0
    >>> x.set_value(100.)
    >>> f()
    >>> x.get_value()
    101.0

Compilation modes
- Can compile in different modes to get different kinds of programs
- Can specify these modes very precisely with arguments to theano.function
- Can use a few quick presets with environment variable flags

Example preset compilation modes
- FAST_RUN: the default. Fastest execution, slowest compilation
- FAST_COMPILE: fastest compilation, slowest execution. No C code
- DEBUG_MODE: adds lots of checks. Raises error messages in situations other modes regard as fine
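In practice these presets are usually selected through the THEANO_FLAGS environment variable rather than in code; a minimal sketch (the script name train.py is illustrative, not from the slides):

```shell
# FAST_RUN is the default; shown here for completeness
THEANO_FLAGS=mode=FAST_RUN python train.py

# fastest compilation while iterating on a model; no C code generated
THEANO_FLAGS=mode=FAST_COMPILE python train.py

# heavy self-checking; very slow, but helps pinpoint graph bugs
THEANO_FLAGS=mode=DEBUG_MODE python train.py
```

The same choice can be made permanent with a `mode = FAST_RUN` line in the [global] section of a ~/.theanorc file.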

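Because Theano's tensor semantics deliberately track NumPy, the symbolic examples above can be sanity-checked eagerly with plain NumPy; a sketch (the concrete array values are illustrative):

```python
import numpy as np

# the "Simple example": what f = theano.function([a], a + a ** 10) computes
a = np.array([0., 1., 2.])
b = a + a ** 10                   # array([0., 2., 1026.])

# the "Reductions" slide on a concrete 3-tensor
x = np.arange(24, dtype='float32').reshape(2, 3, 4)
total = x.sum()                   # sum over all elements: 276.0
marginals = x.sum(axis=(0, 2))    # shape (3,)
mx = x.max(axis=1)                # shape (2, 4)

# dimshuffle((2, 1, 0)) corresponds to an axis transpose in NumPy
y = np.transpose(x, (2, 1, 0))    # shape (4, 3, 2)
```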