
Introduction to Parallel Computing with MATLAB

Elwin Chan

© 2015 The MathWorks, Inc.

Who Uses Parallel Computing?

- Optimizing JIT Steel Manufacturing Schedule: cut simulation time from 1 hour to 5 minutes
- Heart Transplant Studies: 3-4 weeks reduced to 5 days
- Flight Test Data Analysis: 16x faster
- Mobile Communications Technology: simulation time reduced from weeks to hours, 5x more scenarios
- Hedge Fund Portfolio Management: simulation time reduced from 6 hours to 1.2 hours

Demo: Getting Data from a Web API
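A minimal serial sketch of the demo's idea, using webread; the endpoint and query parameter below are placeholders, not the API used in the live demo:

    % Fetch JSON from a web API for a list of query values, one request at a time.
    ids = 1:10;
    results = cell(size(ids));
    for k = 1:numel(ids)
        results{k} = webread('https://example.com/api/items', 'id', ids(k));  % placeholder URL
    end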

Demo: Getting Data from a Web API using parfor
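The same loop parallelized with parfor (again with a placeholder endpoint); iterations are distributed across the workers of a parallel pool, which opens automatically if one is not already running:

    % Each worker issues its own independent requests.
    ids = 1:10;
    results = cell(size(ids));
    parfor k = 1:numel(ids)
        results{k} = webread('https://example.com/api/items', 'id', ids(k));  % placeholder URL
    end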

Demo: Getting Data from a Web API using parfeval
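A sketch of the same work with parfeval: each request is submitted to the pool asynchronously and results are collected as they finish, so the client stays responsive (placeholder endpoint again):

    ids = 1:10;
    futures(1:numel(ids)) = parallel.FevalFuture;          % preallocate the future array
    for k = 1:numel(ids)
        futures(k) = parfeval(@webread, 1, 'https://example.com/api/items', 'id', ids(k));
    end
    results = cell(size(ids));
    for k = 1:numel(ids)
        [idx, value] = fetchNext(futures);                 % grab whichever request finishes next
        results{idx} = value;
    end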

Going Parallel: Multicore

Diagram: a MATLAB client on a multicore desktop sending work to several MATLAB workers.
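A hedged sketch of using the desktop's cores: open a local pool (the worker count here is just an example; by default MATLAB chooses roughly one worker per physical core), run parallel code, then shut the pool down:

    pool = parpool('local', 4);        % 4 workers on the multicore desktop (example count)
    parfor k = 1:8
        fprintf('iteration %d ran on a worker\n', k);
    end
    delete(pool);                      % release the workers when finished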

Going Parallel: Clusters

Diagram: a MATLAB client sending work to MATLAB workers on a computer cluster.
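Scaling the same code to a cluster is mostly a matter of selecting a different profile; 'MyHPCCluster' below is a hypothetical profile for a cluster running the cluster-side MATLAB worker product:

    c = parcluster('MyHPCCluster');    % hypothetical cluster profile
    pool = parpool(c, 32);             % request 32 workers on the cluster (example count)
    % parfor / parfeval / spmd code now runs on the cluster workers
    delete(pool);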

Going Parallel: Cloud

Diagram: a MATLAB client sending work to a cluster running in the cloud.

Going Parallel: GPUs

Diagram: a MATLAB client offloading computation to the GPU cores and device memory of an NVIDIA GPU.

Independent Tasks or Iterations: parfor, parfeval, jobs and tasks

- Examples: parameter sweeps, Monte Carlo simulations
- No dependencies or communication between tasks

Diagram: the MATLAB client farms independent iterations out to MATLAB workers, shortening the overall run time.
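A small sketch of such an independent-iterations workload: a Monte Carlo estimate of pi repeated over several sample sizes, i.e. a simple parameter sweep in which no iteration depends on another:

    sampleSizes = [1e4 1e5 1e6 1e7];
    piEstimates = zeros(size(sampleSizes));
    parfor k = 1:numel(sampleSizes)
        xy = rand(sampleSizes(k), 2);                   % random points in the unit square
        piEstimates(k) = 4 * mean(sum(xy.^2, 2) <= 1);  % fraction landing inside the unit circle
    end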

Speed-up using GPUs

Ideal problems:
- Massively parallel and/or vectorized operations
- Computationally intensive
- Consist of supported functions

- 300+ GPU-enabled MATLAB functions
- Additional GPU-enabled toolboxes: Neural Networks, Image Processing, Communications, and more
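A minimal GPU sketch: move data into device memory with gpuArray, apply GPU-enabled functions, and gather the result back to the client (requires a supported NVIDIA GPU):

    x = gpuArray(rand(1e6, 1, 'single'));   % data now lives in device memory
    y = abs(fft(x)).^2;                     % fft and element-wise math run on the GPU cores
    p = gather(y);                          % copy the result back to client memory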

Too Much Data for One Machine? Use Distributed Arrays

Diagram: data held in the RAM of multiple computers, with the MATLAB client using their combined memory.

Distributed Arrays

- Distributed arrays hold data remotely on workers running on a cluster
- Manipulate them directly from the client MATLAB session (desktop)
- 200+ MATLAB functions are overloaded for distributed arrays
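A hedged sketch with a cluster pool already open: the distributed array's data lives in the workers' combined memory, yet the client code looks like ordinary MATLAB (sizes below are illustrative):

    A = distributed.rand(10000);        % partitioned across the workers' RAM
    b = distributed.ones(10000, 1);
    x = A \ b;                          % overloaded backslash solves on the cluster
    xLocal = gather(x);                 % bring only the small result back to the client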

Supported functions for distributed arrays are listed in the documentation.

Too Much Data for Your Cluster? Use Datastore

Diagram: a MATLAB datastore accessing data stored in HDFS across the data nodes of a Hadoop cluster.
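A minimal datastore sketch, assuming a collection of CSV files (a local folder or an hdfs:// location; the path below is hypothetical): the data is read one chunk at a time, so the whole set never has to fit in memory:

    ds = datastore('hdfs://myserver/data/*.csv');   % hypothetical location
    totalRows = 0;
    while hasdata(ds)
        chunk = read(ds);                           % one block of records as a table
        totalRows = totalRows + height(chunk);      % process each chunk without loading everything
    end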

Too Much Data for Your Cluster? Use Datastore, MapReduce and tall

Diagram: MATLAB pushes the map (map.m) and reduce (reduce.m) phases out to the Hadoop nodes holding the HDFS data, reading it through a datastore.
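A hedged sketch of the tall-array route: the same datastore backs a tall table, operations on it are deferred, and gather triggers the underlying MapReduce-style evaluation; the file location and column name are assumptions:

    ds = datastore('hdfs://myserver/data/*.csv');   % hypothetical location
    t = tall(ds);                                   % tall table over out-of-memory data
    avgDelay = mean(t.ArrivalDelay, 'omitnan');     % deferred; 'ArrivalDelay' is an assumed column
    avgDelay = gather(avgDelay);                    % the computation actually runs here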

Offload and Scale Computations with batch

Diagram: the MATLAB desktop (client) calls batch(…) to send work to workers on a cluster and later retrieves the result.
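A minimal batch sketch: submit work to a worker, keep using the desktop, and fetch the outputs later; myAnalysis is a hypothetical function returning one output:

    job = batch(@myAnalysis, 1, {rand(1000)}, 'Pool', 3);  % run with a pool of 3 additional workers
    % the desktop stays free while the job runs elsewhere
    wait(job);                                             % block only when the result is needed
    out = fetchOutputs(job);                               % out{1} is the first output of myAnalysis
    delete(job);                                           % clean up the job when done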

Going Parallel: Other MATLAB® Toolboxes

Enable parallel computing support by setting a flag or preference.

- Image Processing: batch image processing, GPU-enabled functions
- Statistics and Machine Learning: resampling methods, k-means clustering, GPU-enabled functions
- Neural Networks: deep learning, neural network training and simulation
- Computer Vision: parallel-enabled functions in the bag-of-words workflow
- Optimization: parallel estimation of gradients
- Signal Processing and Communications: GPU-enabled FFT filtering, cross correlation, BER simulations

Other parallel-enabled toolboxes are listed in the documentation.
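For many of these toolboxes the change really is a single flag. A hedged Optimization Toolbox example: with a pool open, UseParallel makes fmincon estimate its finite-difference gradients in parallel (the objective and start point below are arbitrary):

    opts = optimoptions('fmincon', 'UseParallel', true);
    objective = @(x) sum(x.^2) + prod(cos(x));             % arbitrary smooth objective
    x0 = [1 2 3];
    xBest = fmincon(objective, x0, [],[],[],[],[],[],[], opts);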

Going Parallel: Simulink®

Enable parallel computing support by setting a flag or preference.

- Simulink Design Optimization: response optimization, sensitivity analysis, parameter estimation
- Simulink Control Design: frequency response estimation
- Communications System Toolbox: GPU-based System objects for simulation acceleration
- Simulink/Embedded Coder: generating and building code
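A hedged Simulink sketch (for recent MATLAB releases): parsim distributes a sweep of simulations across pool workers; 'myModel' and the tuned 'Gain' variable are hypothetical:

    gains = 1:4;
    in(1:numel(gains)) = Simulink.SimulationInput('myModel');   % hypothetical model name
    for k = 1:numel(gains)
        in(k) = in(k).setVariable('Gain', gains(k));            % hypothetical tuned variable
    end
    out = parsim(in);                                           % one simulation per worker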

Other parallel-enabled toolboxes are listed in the documentation.

Why use Parallel Computing Toolbox?

- Reduce computation time by
  – Using more cores
  – Accessing graphics processing units (GPUs)

- Overcome memory limitations by
  – Distributing data to available hardware
  – Using datastore, MapReduce and tall

- Offload computations to a cluster to
  – Free up your desktop
  – Access better hardware
