Machine Learning and Deep Learning with the Wolfram Language
Jérôme Louradour - Wolfram Research
[email protected]

Wolfram Language

http://reference.wolfram.com/language/

In[]:=

◼ 5000+ functions
◼ High-level and Coherent
◼ Interactive notebook
◼ Polished documentation
◼ Knowledgebase access

In[]:= blurSingleFaceimage_, face_ := ImageComposeimage, Blurface["Image"], 20, face"Position";

BlurFacesimage_ := FoldblurSingleFace, image, FindFacesimage, "Position", "Image";

In[]:= BlurFaces 

Out[]= 4 2018-10-13_AIUkraine.nb

In[]:=

In[]:= What are the notable people from Kiev?

(interpreted as: Kyiv [city] → notable people born in city)

Out[]= {Mila Kunis → Sun 14 Aug 1983, Milla Jovovich → Wed 17 Dec 1975, Andriy Shevchenko → Wed 29 Sep 1976,
  Golda Meir → Sun 15 May 1898, Kazimir Malevich → Sun 23 Feb 1879, Vladimir Horowitz → Thu 1 Oct 1903,
  John Demjanjuk → Sat 3 Apr 1920, Mikhail Bulgakov → Fri 15 May 1891, Vaslav Nijinsky → Wed 12 Mar 1890,
  Max Levchin → 1975, …}   (several dozen people in total, each paired with a birth date)

In[]:= Hold @ Entity"City", "Kiev", "Kiev", ""EntityProperty"City","PeopleBornInCity"

Out[]= notable people born in city Hold   2018-10-13_AIUkraine.nb 5

Machine Learning in the Wolfram Language

Tools to Train, Evaluate and Deploy models

◼ Supervised
  ◼ Classification → Classify
  ◼ Regression → Predict
◼ Unsupervised
  ◼ Clustering → FindClusters
  ◼ Dimensionality reduction → DimensionReduce
  ◼ Density estimation → LearnDistribution, AnomalyDetection
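A minimal usage sketch of these functions (not from the original slides; toy data invented here):

In[]:= c = Classify[{1 → "A", 2 → "A", 8 → "B", 9 → "B"}];   (* supervised classification *)
       c[7]                                                   (* predicted class for a new input *)

In[]:= p = Predict[{1 → 1.1, 2 → 1.9, 3 → 3.2}];              (* regression *)
       p[4]                                                   (* predicted numeric value *)

In[]:= FindClusters[{1, 2, 3, 10, 11, 12}]                    (* unsupervised clustering *)

In[]:= DimensionReduce[RandomReal[1, {20, 10}], 2]            (* reduce 10-dimensional vectors to 2 dimensions *)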

Model Zoo

◼ Big Neural Network Repository

High-level Applications

◼ Computer Vision
◼ Natural Language Processing
◼ Audio Signal Processing

Applications: Computer Vision

Object Recognition

In[]:= ImageIdentify 

Out[]= Easter egg 2018-10-13_AIUkraine.nb 7

Semantic Feature Extraction

In[]:= FacialFeatures 

Out[]= happiness anger Image → , Age → 26, Gender → Male, Emotion → , Image → , Age → 37, Gender → Male, Emotion → ,

happiness neutral Image → , Age → 44, Gender → Male, Emotion → , Image → , Age → 30, Gender → Male, Emotion → ,

neutral anger neutral Image → , Age → 25, Gender → Male, Emotion → , Image → , Age → 32, Gender → Male, Emotion → , Image → , Age → 26, Gender → Male, Emotion → ,

happiness happiness Image → , Age → 43, Gender → Male, Emotion → , Image → , Age → 28, Gender → Male, Emotion → ,

happiness neutral Image → , Age → 34, Gender → Male, Emotion → , Image → , Age → 30, Gender → Male, Emotion →  8 2018-10-13_AIUkraine.nb

Art

In[]:= ImageRestyle , 

Out[]= 2018-10-13_AIUkraine.nb 9

Applications: Natural Language Processing

Question Answering

In[]:= StringTake[WikipediaData["Sergei Polunin"], 1000]

Out[]= Sergei Vladimirovich Polunin (Ukrainian: Сергій Володимирович́ Полу́нін, Serhiy Volodymyrovych Polunin; Russian: Сергей́ Владимирович́ Полунин́ , Sergey Vladimirovich Polunin; born 20 November 1989) is a Ukrainian , actor and model. As a freelance principal dancer, Polunin is guest artist at various theaters worldwide such as Royal Ballet, Sadler's Wells Theatre, , Stanislavski and Nemirovich-Danchenko Moscow Academic Music Theatre, Theatre, Teatro San Carlo and is currently permanent guest artist for the Bayerisches Staatsballet.

== Life and career == Sergei Polunin was born in , Ukrainian SSR. From the age of four to eight, he trained at a gymnastics academy, and then spent another four years at the Kiev State Choreographic Institute. His mother, Galina, moved with him to Kiev, while his father, Vladimir Polunin, worked in to support them.After Polunin graduated from the Kyiv Choreographic Academy (КДХУ) he joined the British Roy

In[]:= FindTextualAnswer[ WikipediaData["Sergei Polunin"], "What is the nationality of Sergei Polunin?", 3, "HighlightedSentence"] // Column

Out[]= Sergei Polunin was born in Kherson, Ukrainian SSR.
       Polunin also holds Serbian citizenship.
       Sergei Vladimirovich Polunin (Ukrainian: Сергій Володимирович Полунін, Serhiy Volodymyrovych Polunin; …

Entity Recognition (and more...)

In[]:= TextContents "The flag of Ukraine is blue and yellow. In 1934 Kiev became the capital of Soviet Ukraine. The city has a density of 3,299 people/km², with a population of 2,887,974 people in July 2015 and an area of 839 km²(324 sq mi)."

Out[]= (table of detected entities; each row also carries a HighlightedSnippet showing the match in context)

  String             Type                    Position    Probability  Interpretation
  Ukraine            Country                 {13, 19}    0.926602     Ukraine
  blue               Color                   {24, 27}    0.97199      (color swatch)
  yellow             Color                   {33, 38}    0.989897     (color swatch)
  1934               Date                    {44, 47}    0.860395     1934
  Kiev               AdministrativeDivision  {49, 52}    0.956012     Kiev, Ukraine
  Kiev               City                    {49, 52}    0.94606      Kiev, Kiev, Ukraine
  Ukraine            Country                 {83, 89}    0.785337     Ukraine
  3,299 people/km²   Quantity                {118, 133}  0.8          3299 people/km²
  2,887,974 people   Quantity                {157, 172}  0.9          2 887 974 people
  July 2015          Date                    {177, 185}  0.934119     Jul 2015
  839 km²            Quantity                {202, 208}  0.8          839 km²
  324 sq mi          Quantity                {210, 218}  0.8          324 mi²

In[]:= TextContents["I have a dog. I eat an hot dog."]

Out[]= (table of detected entities)

  String    Type     Position  Probability  Interpretation
  dog       Species  {10, 12}  0.541379     Infraspecies: Canis lupus familiaris
  hot dog   Food     {24, 30}  0.8          Entity["Food", … "Frankfurter" …]

In[]:= notablePeople = TextCases[ WikipediaData["Kiev"], "Person" → "Interpretation"]

Out[]= Abraham Ortelius Joseph M. Marshall III Aung San Suu Kyi Paul Sefchek Natalia Khoreva Aung San Suu Kyi Ptolemy Andrew Aung San Suu Kyi  , , , , , , , , ,

Paul Sefchek , Natalia Khoreva , Batu Khan , Taras Shevchenko , Cyril of Alexandria , Josef Stalin , Vitali Klitschko , Vitali Klitschko , Tsar Nicholas I , Lenin ,

Josef Stalin , Vladimir the Great , Aung San Suu Kyi , Natalia Khoreva , Mikhail Bulgakov , Viktor Yanukovych , Shakira , Mikhail Bulgakov , Valentin Boreyko ,

Martin Luther King Vladimir Horowitz Milla Jovovich Kazimir Malevich Golda Meir Alexander Markowich Ostrowski Nicholas Pritzker II Andriy Shevchenko Igor Sikorsky , , , , , , , ,  2018-10-13_AIUkraine.nb 11

In[]:= notablePeopleBornInKiev = DeleteDuplicates@SelectnotablePeople, # place of birth  === Kyiv CITY &

Out[]= Mikhail Bulgakov Vladimir Horowitz Milla Jovovich Kazimir Malevich Golda Meir Andriy Shevchenko  , , , , , 

In[]:= AssociationThreadnotablePeopleBornInKiev → EntityValuenotablePeopleBornInKiev, occupation 

Out[]= Mikhail Bulgakov { } Vladimir Horowitz { } Milla Jovovich { } Kazimir Malevich { } Golda Meir { } Andriy Shevchenko { }  → author , → pianist , → actor , → painter , → politician , → soccer player 

In[]:= TextCases WikipediaData["Kiev"],

 Mikhail Bulgakov PERSON , Vladimir Horowitz PERSON , Milla Jovovich PERSON , Kazimir Malevich PERSON , Golda Meir PERSON , Andriy Shevchenko PERSON  → "HighlightedSnippet"

Out[]= Mikhail Bulgakov  → Andrew's Church; the home of Kiev born writer, Mikhail Bulgakov ; the monument to Yaroslav the Wise, the Grand, Mikhail Bulgakov , Russian writer,

Vladimir Horowitz Milla Jovovich →  Vladimir Horowitz , classical pianist, →  Milla Jovovich , American actress,

Kazimir Malevich - →  Kazimir Malevich , pioneer of geometric abstract art and the originator of the avant garde Suprematist movement,

Golda Meir Andriy Shevchenko →  Golda Meir , Israeli politician, the fourth Prime Minister of Israel, →  Andriy Shevchenko , Ukrainian footballer

locations = TextCasesWikipediaData["Sergei Polunin"], "Location" → #String → #Interpretation &;

In[]:= locationsStats = Map[Counts, GroupBy[locations, Last → First]]  [{ }] [{ }] [{ }] Out[ ]= GeoPosition 49., 32. → Ukrainian → 3, GeoPosition 60., 100. → Russian → 5, → 1, GeoPosition 55.7603, 37.6186 → Bolshoi Theatre → 1, [{ }] [{ }] [{ }] [{ }] GeoPosition 55.75, 37.62 → Moscow → 1, GeoPosition 45.4678, 9.18861 → La Scala Theatre → 1, GeoPosition 46.63, 32.6 → Kherson → 1, GeoPosition 50.4499, 30.5507 → Kiev → 3, [{ - }] [{ }] [{ - }] [{ - }] GeoPosition 39.5, 8. → Portugal → 1, GeoPosition 50.43, 30.52 → Kyiv → 1, GeoPosition 43.0442, 88.2578 → Academy → 1, GeoPosition 51.5009, 0.177436 → Royal Albert Hall → 2, [{ - }] [{ }] [{ }] [{ }] GeoPosition 38., 97. → American → 2, GeoPosition 44.8167, 20.4594 → National Museum of Serbia → 1, GeoPosition 44., 21. → Serbian → 1, GeoPosition 55.04, 82.93 → Novosibirsk → 2, [{ }] [{ }] [{ }] [{ - }] GeoPosition 55.6432, 37.6662 → Moscow → 2, GeoPosition 49.9563, 14.5891 → Bohemian → 1, GeoPosition 46.52, 6.62 → Lausanne → 1, GeoPosition 54., 2. → British → 1, → 1

In[2]:= GeoBubbleChart[Map[Total, locationsStats], ChartLabels → Values@Map[First@*Keys, locationsStats], GeoProjection → "Equirectangular", ImageSize → Scaled[0.7]]

Out[2]= (geo bubble chart of the location mentions)

Model Zoo: Built-in Classifiers

In[]:= Classify["NotablePerson"] 

Out[]= Milla Jovovich

In[]:= Classify["Spam"][{ "Hi Bob, I'll travel to Kiev!", "You won a free travel to Kiev!" }]

Out[]= {False, True}

In[]:= Classify["Sentiment"][{ "I have a new computer", "I had to reinstall my new computer" }]

Out[]= {Positive, Negative} 2018-10-13_AIUkraine.nb 13

Model Zoo: Neural Net Repository

◼ https://resources.wolframcloud.com/NeuralNetRepository
◼ 75+ networks, growing
◼ Demo of typical use

Images

In[]:= NetModel["ResNet-101 Trained on YFCC100m Geotagged Data"]

Input port: image Out[]= NetChain Output port: class  Number of layers: 43 

In[]:= position = NetModel["ResNet-101 Trained on YFCC100m Geotagged Data"] 

Out[]= GeoPosition[{50.4537, 30.5197}]

In[]:= GeoGraphics[position]

Out[]= 14 2018-10-13_AIUkraine.nb

In[]:= GeoBubbleChart[
  NetModel["ResNet-101 Trained on YFCC100m Geotagged Data"][ , {"TopProbabilities", 30}]]

Out[]= 2018-10-13_AIUkraine.nb 15

In[]:= colorizeimg_Image := Image Prepend ArrayResample NetModel"Colorful Image Colorization Trained on ImageNet Competition Data"img, PrependReverse@ImageDimensions@img,2 , ImageDataColorSeparateimg,"L" , Interleaving → False, ColorSpace → "LAB" 

In[]:= colorize /@  , 

Out[]= (the two images, colorized)

Text: Word Embeddings

In[]:= animals = {"Alligator", "Ant", "Bear", "Bee", "Bird", "Camel", "Cat", "Cheetah", "Chicken", "Chimpanzee", "Cow", "Crocodile", "Deer", "Dog", "Dolphin", "Duck", "Eagle", "Elephant", "Fish", "Fly"}; fruits = {"Apple", "Apricot", "Avocado", "Banana", "Blackberry", "Blueberry", "Cherry", "Coconut", "Cranberry", "Grape", "Turnip", "Mango", "Melon", "Papaya", "Peach", "Pineapple", "Raspberry", "Strawberry", "Ribes", "Fig"};

In[]:= FeatureSpacePlot[ Join[animals, fruits], FeatureExtractor → NetModel["GloVe 100-Dimensional Word Vectors Trained on Wikipedia and Gigaword 5 Data"]]

Out[]= (2-D feature-space plot: the animal words and the fruit words form two separate clusters)

Text: Contextual Word Embeddings

In[]:= NetModel["ELMo Contextual Word Representations Trained on 1B Word Benchmark"]

Out[]= NetGraph[…]   (Input: expression; Outputs: ContextualEmbedding/1, ContextualEmbedding/2 and Embedding, each a matrix with 1024 columns)

In[]:= MatrixPlot /@ NetModel["ELMo Contextual Word Representations Trained on 1B Word Benchmark"]["Hello world!"]

Out[]= <|ContextualEmbedding/1 → (matrix plot), ContextualEmbedding/2 → (matrix plot), Embedding → (matrix plot)|>   (each matrix has 1024 columns)

In[]:= averagedElmo = Withelmo = NetModel"ELMo Contextual Word Representations Trained on 1B Word Benchmark", NetFlatten @ NetGraphelmo, ThreadingLayer[(#1+#2+#3)/3&], MapNetPort[{1,#}]&, NetInformationelmo,"OutputPortNames"→2 

Out[]= NetGraph[…]   (Input: expression; Output: matrix of size n × 1024)

sentences = { "Apple makes laptops", "Apple pie is delicious", "Apple juice is full of sugar", "Apple baked with cinnamon is scrumptious", "Apple reported large quarterly profits", "Apple is a large company"};

In[]:= FeatureSpacePlotsentences, FeatureExtractor → First@averagedElmo[#] &, LabelingFunction → Callout

Out[]= (feature-space plot with callouts: the sentences about Apple the company cluster apart from the sentences about apple the fruit)

Automated Machine Learning

Example: Training a Classifier

In[]:= scrapeImages[string_] := Thread[WebImageSearch[string, "Thumbnails", MaxItems → 40] → string]

In[]:= classes = {"Bortsch", "Kapusniak", "Solianka"};

In[]:= images = Union @@ Map[scrapeImages, classes];

In[]:= {training, test} = TakeList[RandomSample[images], {80, 40}];

In[]:= RandomSample[training, 5]

 Out[ ]=  → Solianka, → Bortsch, → Bortsch, → Kapusniak, → Bortsch

In[]:= classifier = Classify[training, TimeGoal → Quantity[20, "Seconds"]]

Input type: Image Out[]= ClassifierFunction  Classes: Bortsch, Kapusniak, Solianka 


In[]:= cm = ClassifierMeasurements[classifier, test]

Classifier: LogisticRegression Out[]= ClassifierMeasurementsObject  Number of test examples: 40 


In[]:= cm["ConfusionMatrixPlot"] Bortsch Kapusniak Solianka

Bortsch 11 0 2 13

Out[]= Kapusniak 0 13 2 15 actual class

Solianka 3 0 9 12 14 13 13

predicted class 20 2018-10-13_AIUkraine.nb

In[]:= cm["WorstClassifiedExamples" → 5]

 Out[ ]=  → Bortsch, → Solianka, → Solianka, → Solianka, → Kapusniak

In[]:= form = FormFunction[{"image" → "Image"}, classifier[#image, "TopProbabilities"] &]

Out[]= FormFunction[…]   (web form with an "image" upload field and a Submit button)

In[]:= url = CloudDeploy[form, Permissions → "Public"]

Out[]= // / / - - - - CloudObjecthttps: www.wolframcloud.com objects 560b9dbb 96fb 44b2 8f5d f982fe9406e8

In[]:= URLShorten[url]

Out[]= https://wolfr.am/yjJdU3Wl 2018-10-13_AIUkraine.nb 21

Automated Machine Learning

Feature Extraction

(diagram of the data types handled by automatic feature extraction: NominalBag, NominalSequence, NominalVector, Text, BooleanVector,
 NumericalVector, NumericalSequence, NumericalBag, NumericalTensorSequence, NumericalVectorSequence, ComplexVector, Location, Image, Image3D, Audio, Color)
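For illustration (a sketch, not from the slides), FeatureExtraction builds such an extractor automatically from heterogeneous columns; the toy rows below are invented:

In[]:= fe = FeatureExtraction[{
          {"cat", 3.2, "likes fish"},
          {"dog", 1.5, "likes bones"},
          {"cat", 2.8, "hates water"}}];   (* nominal, numerical and text columns are detected automatically *)

In[]:= fe[{"dog", 2.0, "likes fish"}]       (* returns a numeric feature vector *)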

Hyperparameter tuning

◼ Initial set of configurations (models + hyperparameters)
◼ Experiments on small datasets
◼ Most promising configurations trained on larger datasets
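The automatic search can also be steered by hand; a hedged sketch using standard Classify options and the training set from the soup-classifier example above:

In[]:= Classify[training, Method → "RandomForest", PerformanceGoal → "Quality", TimeGoal → 60]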

In[]:= 22 2018-10-13_AIUkraine.nb

In[]:= mnist = RandomSample[ResourceData["MNIST"], 30 000];

In[]:= digitClassifier = Classify[mnist, TimeGoal → 45]

Input type: Image Out[]= ClassifierFunction  Number of classes: 10 

In[]:= ClassifierInformation[digitClassifier]

Out[]= Classifier information

  Data type                Image
  Number of classes        10
  Accuracy                 90.9% ± 0.53%
  Method                   LogisticRegression
  Single evaluation time   1.77 ms/example
  Batch evaluation speed   38.1 examples/ms
  Loss                     0.339 ± 0.019
  Model memory             373 kB
  Training examples used   30 000 examples
  Training time            1 min 27 s

  (learning curve: loss vs. number of training examples used, from 50 to 10 000)

Interactivity and user-friendliness

◼ Progress bar
◼ Interruptibility
◼ Training time specification
◼ Measurements & learning curves
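For example (a sketch reusing objects defined earlier in this notebook): the training budget is fixed with TimeGoal, and measurements are then queried from a ClassifierMeasurements object.

In[]:= Classify[mnist, TimeGoal → Quantity[2, "Minutes"]]   (* specify the training-time budget up front *)

In[]:= cm["Accuracy"]                                       (* accuracy, F-score, etc. are available as properties *)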

Neural Networks framework

Polished high-level framework without sacrificing performance

◼ User-friendly
◼ Interactive
◼ Automatic support of variable-length sequences (see the sketch after the example below)
◼ Repository of pretrained networks
◼ Easy "network surgery"
◼ Pre- and post-processing inside the network
◼ Constraint checks with human-readable error messages

In[3]:= LongShortTermMemoryLayer[5, "Input" → 10]

LongShortTermMemoryLayer: Specification 10 is not compatible with port "Input", which must be a n×k matrix.

Out[3]= $Failed
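As a small illustration of the variable-length-sequence support and in-network pre/post-processing listed above (a sketch, not from the slides; the toy vocabulary and classes are invented):

In[]:= net = NetChain[{
          EmbeddingLayer[16],              (* token ids → 16-dimensional vectors *)
          LongShortTermMemoryLayer[32],    (* handles input sequences of any length *)
          SequenceLastLayer[],
          LinearLayer[2], SoftmaxLayer[]},
         "Input" → NetEncoder[{"Tokens", {"good", "bad", "great", "movie", "plot"}}],
         "Output" → NetDecoder[{"Class", {"negative", "positive"}}]];

In[]:= NetInitialize[net]["great great movie"]   (* strings are tokenized and encoded automatically *)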

◼ Performance
  ◼ MXNet back-end
  ◼ Multi-GPU and TensorCore support (mixed precision)
◼ Documentation
  ◼ https://reference.wolfram.com/language/tutorial/NeuralNetworksOverview.html
◼ Wolfram Support
◼ CloudDeploy

Network Graph Visualisation

In[4]:= NetModel["ELMo Contextual Word Representations Trained on 1B Word Benchmark"]

Out[4]= NetGraph[…]   (interactive graph of the ELMo network; selecting a node shows its details, e.g. the "cnn" NetChain:
  Input 3-tensor (n × 50 × 16) → ConvolutionLayer → AggregationLayer (Max over level 2) → Output matrix (n × 32))

In[]:= NetModel["Wolfram FindTextualAnswer Net for WL 11.3 (Raw Model)"]

Out[]= NetGraph  M 

GR

LSTM M S C D C + D GR SR + MX SR LSTM SR D LSTM C M S

GR + SR LSTM SR

GR SR M

Inputs Outputs WordMatch: matrix (size: n1 × 3) End: matrix (size: n1 × 1) Question: string StartActivation: matrix (size: n1 × 2) Context: string EndActivation: matrix (size: n1 × 2) Start: matrix (size: n1 × 1) 26 2018-10-13_AIUkraine.nb

Example: Transfer Learning

In[]:= inception = NetModel["Inception V3 Trained on ImageNet Competition Data"]

Input port: image Out[]= NetChain Output port: class  Number of layers: 33 

In[]:= extractor = NetTake[inception, 30]

image  Out[ ]= NetChain Input 3-tensor (size: 3 × 299 × 299)  conv_conv2d ConvolutionLayer 3-tensor (size: 32 × 149 × 149) conv_batchnorm BatchNormalizationLayer 3-tensor (size: 32 × 149 × 149) conv_relu Ramp 3-tensor (size: 32 × 149 × 149) conv_1_conv2d ConvolutionLayer 3-tensor (size: 32 × 147 × 147) conv_1_batchnorm BatchNormalizationLayer 3-tensor (size: 32 × 147 × 147) conv_1_relu Ramp 3-tensor (size: 32 × 147 × 147) conv_2_conv2d ConvolutionLayer 3-tensor (size: 64 × 147 × 147) conv_2_batchnorm BatchNormalizationLayer 3-tensor (size: 64 × 147 × 147) conv_2_relu Ramp 3-tensor (size: 64 × 147 × 147) pool PoolingLayer 3-tensor (size: 64 × 73 × 73) conv_3_conv2d ConvolutionLayer 3-tensor (size: 80 × 73 × 73) conv_3_batchnorm BatchNormalizationLayer 3-tensor (size: 80 × 73 × 73) conv_3_relu Ramp 3-tensor (size: 80 × 73 × 73) conv_4_conv2d ConvolutionLayer 3-tensor (size: 192 × 71 × 71) conv_4_batchnorm BatchNormalizationLayer 3-tensor (size: 192 × 71 × 71) conv_4_relu Ramp 3-tensor (size: 192 × 71 × 71) pool1 PoolingLayer 3-tensor (size: 192 × 35 × 35) Inception1 NetGraph (23 nodes) 3-tensor (size: 256 × 35 × 35) Inception2 NetGraph (23 nodes) 3-tensor (size: 288 × 35 × 35) Inception3 NetGraph (23 nodes) 3-tensor (size: 288 × 35 × 35) Inception4 NetGraph (14 nodes) 3-tensor (size: 768 × 17 × 17) Inception5 NetGraph (32 nodes) 3-tensor (size: 768 × 17 × 17) Inception6 NetGraph (32 nodes) 3-tensor (size: 768 × 17 × 17) Inception7 NetGraph (32 nodes) 3-tensor (size: 768 × 17 × 17) Inception8 NetGraph (32 nodes) 3-tensor (size: 768 × 17 × 17) Inception9 NetGraph (20 nodes) 3-tensor (size: 1280 × 8 × 8) Inception10 NetGraph (29 nodes) 3-tensor (size: 2048 × 8 × 8) Inception11 NetGraph (29 nodes) 3-tensor (size: 2048 × 8 × 8) global_pool PoolingLayer 3-tensor (size: 2048 × 1 × 1) flatten FlattenLayer vector (size: 2048) Output vector (size: 2048)

In[]:= trainingPreprocessed = extractor[training[[All, 1]], TargetDevice → "GPU"] → training[[All, 2]];

In[]:= head = NetChain[<| "dropout" → DropoutLayer[], "lin" → LinearLayer[], "softmax" → SoftmaxLayer[] |>, "Output" → NetDecoder[{"Class", classes}] ]

Out[]= NetChain[uninitialized: dropout (DropoutLayer) → lin (LinearLayer, 3 outputs) → softmax (SoftmaxLayer) → Output: class]

In[]:= trained = NetTrain[head, trainingPreprocessed, MaxTrainingRounds → Quantity[20, "Seconds"], TargetDevice → "GPU"]

Out[]= NetChain[Input: vector (size 2048) → dropout → lin → softmax → Output: class]

In[]:= netClassifier = NetJoin[extractor, trained]

image  Out[ ]= NetChain Input 3-tensor (size: 3 × 299 × 299)  conv_conv2d ConvolutionLayer 3-tensor (size: 32 × 149 × 149) conv_batchnorm BatchNormalizationLayer 3-tensor (size: 32 × 149 × 149) conv_relu Ramp 3-tensor (size: 32 × 149 × 149) conv_1_conv2d ConvolutionLayer 3-tensor (size: 32 × 147 × 147) conv_1_batchnorm BatchNormalizationLayer 3-tensor (size: 32 × 147 × 147) conv_1_relu Ramp 3-tensor (size: 32 × 147 × 147) conv_2_conv2d ConvolutionLayer 3-tensor (size: 64 × 147 × 147) conv_2_batchnorm BatchNormalizationLayer 3-tensor (size: 64 × 147 × 147) conv_2_relu Ramp 3-tensor (size: 64 × 147 × 147) pool PoolingLayer 3-tensor (size: 64 × 73 × 73) conv_3_conv2d ConvolutionLayer 3-tensor (size: 80 × 73 × 73) conv_3_batchnorm BatchNormalizationLayer 3-tensor (size: 80 × 73 × 73) conv_3_relu Ramp 3-tensor (size: 80 × 73 × 73) conv_4_conv2d ConvolutionLayer 3-tensor (size: 192 × 71 × 71) conv_4_batchnorm BatchNormalizationLayer 3-tensor (size: 192 × 71 × 71) conv_4_relu Ramp 3-tensor (size: 192 × 71 × 71) pool1 PoolingLayer 3-tensor (size: 192 × 35 × 35) Inception1 NetGraph (23 nodes) 3-tensor (size: 256 × 35 × 35) Inception2 NetGraph (23 nodes) 3-tensor (size: 288 × 35 × 35) Inception3 NetGraph (23 nodes) 3-tensor (size: 288 × 35 × 35) Inception4 NetGraph (14 nodes) 3-tensor (size: 768 × 17 × 17) Inception5 NetGraph (32 nodes) 3-tensor (size: 768 × 17 × 17) Inception6 NetGraph (32 nodes) 3-tensor (size: 768 × 17 × 17) Inception7 NetGraph (32 nodes) 3-tensor (size: 768 × 17 × 17) Inception8 NetGraph (32 nodes) 3-tensor (size: 768 × 17 × 17) Inception9 NetGraph (20 nodes) 3-tensor (size: 1280 × 8 × 8) Inception10 NetGraph (29 nodes) 3-tensor (size: 2048 × 8 × 8) Inception11 NetGraph (29 nodes) 3-tensor (size: 2048 × 8 × 8) global_pool PoolingLayer 3-tensor (size: 2048 × 1 × 1) flatten FlattenLayer vector (size: 2048) dropout DropoutLayer vector (size: 2048) lin LinearLayer vector (size: 3) softmax SoftmaxLayer vector (size: 3) Output class

In[]:= netClassifier , "TopProbabilities"

Out[]= { } Solianka → 0.462191, Kapusniak → 0.41293, Bortsch → 0.124879

In[]:= cmNet = ClassifierMeasurements[netClassifier, test]; cmNet["ConfusionMatrixPlot"] Bortsch Kapusniak Solianka

Bortsch 12 0 1 13

Out[]= Kapusniak 0 15 0 15 actual class

Solianka 1 1 10 12 13 16 11

predicted class 28 2018-10-13_AIUkraine.nb

Neural Network Surgery and Inspection

In[]:= net = NetModel["Wolfram ImageIdentify Net for WL 11.1"]

Input port: image Out[]= NetChain Output port: class  Number of layers: 24 

In[]:= visualizeFeaturesimg_, level_ := Image /@ NetTakeNetModel"Wolfram ImageIdentify Net for WL 11.1", levelimg;

In[]:= visualizeFeatures , 5

Out[]= (grid of feature-map images extracted after layer 5)

In[]:= AnimatevisualizeFeatures , level, {level, Range[22]} 30 2018-10-13_AIUkraine.nb

In[]:= filterDisplay= Image3DMapThreadImageMultiply, ColorSeparateImage#, Interleaving→False, Red,Green,Blue, ImageSize→Tiny&;

In[]:= filterDisplay /@ NetExtract[NetModel["Wolfram ImageIdentify Net for WL 11.1"], {"conv_1", "Weights"}]

Out[]= (grid of the first-layer convolution filters, each rendered as a small 3D color stack)

What’s next

Automatic Machine Learning

◼ Reversible Generative Models
◼ Few-Shot Learning

Neural Networks

Take-away messages

◼ The power of Transfer Learning, or why you should not need to design your network from scratch

◼ The best solutions for building applications: a smart combination of Machine Learning and knowledge
◼ The Grail: mapping visual/textual/audio entities into a single semantic space

Дякую — Thank You   (word cloud of "thank you" in many languages)

Related links

◼ Blog posts
  ◼ http://blog.wolfram.com/2017/10/10/building-the-automated-data-scientist-the-new-classify-and-predict/
  ◼ http://blog.wolfram.com/2018/02/15/new-in-the-wolfram-language-findtextualanswer/
  ◼ http://blog.wolfram.com/2018/06/14/launching-the-wolfram-neural-net-repository/
◼ Design of the Wolfram Language on Twitch
  ◼ https://www.twitch.tv/stephen_wolfram/videos/all