Artificial Intelligence & the Future of Data-Driven Advertising

Dr. Jochen Schlosser, Chief Strategy Officer, Adform ([email protected]) (slide 1)

About the speaker
Owner of Adform's strategy, with dedicated responsibility for Product & Technology.
• Broad paid / rented / owned media experience
• Trained data and computer scientist with a minor degree in psychology
• Happy to have survived 10+ years in advertising

General Introduction

What is Artificial Intelligence? (slide 4)
A computer program which has the ability to think, solve problems, learn, set goals, and create. "It is the science and engineering of making intelligent machines, especially intelligent computer programs." - John McCarthy, 2002

Deep, Machine, Intelligence ... Learning? (slide 5)
• Artificial Intelligence (highly simplified): a program that can "sense", reason, act, and adapt.
• Machine Learning (a subarea of AI): algorithms that enable machines to make decisions and improve by learning over time.
• Deep Learning (a subarea of ML): multi-layered neural networks learn based on high volumes of data.

Brief Timeline (slide 6)
• 1805-1957: Algorithmic groundwork and the first self-learning algorithm.
• 1965: "Birth" of deep learning (Ukraine): Alexey Ivakhnenko's principle of automatic model generation, inconclusive decisions, and selection by external criteria.
• Since the 2000s, the game changer: the cost of computation and storage drops.

Keeping the Perspective (slide 7)
What we mostly do today is NARROW AI: the accomplishment of one or a very few goals.

SUPERVISED learning (slide 8) and UNSUPERVISED learning (slide 9)
The two basic training paradigms; a minimal sketch of both follows below.
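The deck leaves supervised and unsupervised learning as headline slides. As a concrete and purely illustrative reference point, the following minimal Python sketch contrasts the two, assuming scikit-learn is available; the toy dataset and the model choices are not from the presentation.

```python
# Minimal illustration of supervised vs. unsupervised learning (not from the deck).
# Assumes scikit-learn is installed; data and model choices are arbitrary toy assumptions.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

# Toy dataset: 200 samples, 4 features, 2 classes.
X, y = make_classification(n_samples=200, n_features=4, random_state=0)

# Supervised: learn a mapping from inputs X to known labels y.
clf = LogisticRegression().fit(X, y)
print("supervised accuracy on training data:", clf.score(X, y))

# Unsupervised: no labels are given; the algorithm groups similar inputs on its own.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print("unsupervised cluster assignments (first 10):", km.labels_[:10])
```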
Some examples of how that works today
What exactly is this about? It is about GO and MOVE 37. AI vs. HI: humans are still able to win!

A Major Challenge: an example on TRUST building

The TRANSPARENCY problem
Even if the outcome is correct, the path to it might not be the one you expected. The more complex the algorithm, the more difficult it is to explain in human terms, and the less we initially trust it. (Example: HIV prediction for medication combinatorics, A,B,C or D,A,Z.)

The TRAINING problem
There is no such thing as a free lunch: the data, plus "the lab" to be trained in, needs to be funded. If the lab cannot be simulated (as in gaming), the question of money is, to no surprise, connected to real budgets.

Trust me, I am a scientist (algorithm)
Prof. Dr. Dr. Thomas Lengauer (Dr. degree in Mathematics, FU Berlin; Ph.D. in Computer Science, Stanford): more than 1 million predictions for HIV drug resistance since ... The AI (r)evolution will take time.

To the real world and some opportunities
• GET BETTER: insurance example (slide 23)
• INNOVATION: irrelevant treatments / leakage detection (slide 24)

AI in Advertising
The Internet delivers trillions of signals. But how is this sensor data turned into something useful?
Adform's Artificial Intelligence covers: trading algorithms, forecasting, dynamic ads, product retargeting, audience extension, cross-device, fraud detection, smart throttling.

How Adform Trading Algorithms use AI (slide 28, working draft)
Billions of signals are mined for consumer interaction patterns, and each interaction influences the maximum bid price. The example below shows an exemplary bid change based on a random consumer pattern, following a customer journey from the publisher's website (impressions, clicks) through product search, price comparison, product details and product information to the advertiser's website (offer, sign-up, sale). Positive deltas mark value-adding signals; negative deltas mark non-value-adding signals.

Consumer journey event(1)          Conv. likelihood, %   Delta, pp   Bid (CPM)
Impression (1 sec, 53%)                  1.03               +1.03     0.10 EUR
Impression (4 sec, 35%)                  2.31               +1.28     0.23 EUR
Video view (completed)                  10.32               +8.01     1.03 EUR
Campaign click                          18.43               +8.11     1.84 EUR
Time on site (90 sec)                   21.48               +3.05     2.15 EUR
Search click (product search)           43.98              +22.50     4.40 EUR
Time on site (3 sec)                    36.98               -7.00     3.70 EUR
Impression (1 sec, 15%)                 34.65               -2.33     3.47 EUR
Visited comparison site                 56.67              +22.02     5.67 EUR
Social click                            60.16               +3.49     6.02 EUR
Sign up for offers                      94.60              +34.44     9.46 EUR
Video view (completed)                  94.98               +0.38     9.50 EUR
In-banner engagement                    97.83               +2.85     9.78 EUR

(1) Hundreds of parameters, or combinations of them, are used to determine the conversion likelihood, e.g. domain, URL, time of day, date, weekday, category, browser, format, banner interactions, viewability, time on site, channel, campaign, media type, …
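To make the mechanism behind this table concrete, here is a minimal, hypothetical Python sketch of likelihood-driven bidding: each interaction adds or subtracts a delta to the estimated conversion likelihood, and the maximum bid scales with that likelihood. The delta values loosely mirror the table above, but the bid ceiling, the event names, and the simple linear mapping are assumptions for illustration only, not Adform's actual trading algorithm.

```python
# Hypothetical sketch of likelihood-driven bidding (illustrative only, not Adform's model).
# Each observed interaction adds a delta (in percentage points) to the estimated
# conversion likelihood; the maximum bid scales linearly with that likelihood.

MAX_CPM_EUR = 10.0  # assumed bid ceiling at 100% conversion likelihood

# Assumed deltas per event type, loosely mirroring the table above.
EVENT_DELTAS = {
    "impression_1s": 1.03,
    "video_view_completed": 8.01,
    "campaign_click": 8.11,
    "search_click": 22.50,
    "time_on_site_3s": -7.00,
    "sign_up_for_offers": 34.44,
}

def update_bid(journey):
    """Walk a consumer journey and return (likelihood %, max bid in EUR) after each event."""
    likelihood = 0.0
    steps = []
    for event in journey:
        likelihood += EVENT_DELTAS.get(event, 0.0)
        likelihood = max(0.0, min(100.0, likelihood))  # clamp to a valid percentage
        bid = MAX_CPM_EUR * likelihood / 100.0
        steps.append((event, round(likelihood, 2), round(bid, 2)))
    return steps

if __name__ == "__main__":
    journey = ["impression_1s", "video_view_completed", "campaign_click",
               "search_click", "time_on_site_3s", "sign_up_for_offers"]
    for event, likelihood, bid in update_bid(journey):
        print(f"{event:>22}: likelihood {likelihood:6.2f}% -> max bid {bid:.2f} EUR")
```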
What if MORE data is fed into the system? (slide 29)
Slide 29 repeats the same journey with exactly that question overlaid. Slide 30 shows how the estimates shift once additional data, such as social clicks, is taken into account:

Consumer journey event(1)          Conv. likelihood, %   Delta, pp   Bid (CPM)
Impression (1 sec, 53%)                  1.03               +1.03     0.10 EUR
Impression (4 sec, 35%)                  2.31               +1.28     0.23 EUR
Video view (completed)                  10.32               +8.01     1.03 EUR
Campaign click                          18.43               +8.11     1.84 EUR
Time on site (90 sec)                   21.48               +3.05     2.15 EUR
Search click (product search)           33.98              +12.50     3.40 EUR
Time on site (3 sec)                    26.98               -7.00     2.70 EUR
Impression (1 sec, 15%)                 24.65               -2.33     2.47 EUR
Visited comparison site                 36.67              +12.02     3.67 EUR
Social click                            60.16              +23.49     5.02 EUR
Sign up for offers                      92.60              +34.44     8.46 EUR
Video view (completed)                  92.98               +0.38     8.50 EUR
In-banner engagement                    97.83               +2.85     8.78 EUR

What's next? The Future

The New Marketing Human (slide 33)
Natural language processing, voice synthesis, classification & offer optimization, computer vision, emotional intelligence.

Adform POC with Amazon Alexa (slide 34)
• API access to the user's microphone; HTTPS, security and consent must be in place for cam[era] access.
• A cloud speech API enables Alexa functionality in ads.
• The Alexa environment test interface is pulled in through an iFrame with added events.

NEWS from the past … (slide 36)
The Intelligent Personal Assistant will be compatible with other digital voice assistants, thereby providing a link to other rapidly growing ecosystems.

An assistant-driven shopping dialogue (slide 37)
"Good morning, you are driving by a local VF store with free parking in 5 minutes. Would you like to pick up your new iPhone?" "Done ..." "While being there, you can also upgrade your contract and pick up the new GC which you showed interest in some weeks ago." "Exactly this one!" "There is a 4-week free trial which gives you access to it during your vacation in 2 weeks."

Machine-to-Machine Marketing

Which leaves us … the value of human intelligence. Let's CREATE something NEW.
[email protected]