The Oxford Spinout Company Using AI to Diagnose Heart Disease by Stuart Gillespie
The University of Oxford's Guide to Artificial Intelligence

Welcome
By Chas Bountra, Pro-Vice-Chancellor (Innovation) and Professor of Translational Medicine, Nuffield Department of Medicine

Ask our academics why Oxford is at the forefront of AI research and they'll invariably say one of two things: opportunity for collaboration, and diversity of thought. Not only do our researchers come from all over the world and from a variety of backgrounds, they're also working across the full spectrum of the transformative field we call artificial intelligence.

And amid the undoubted hype, these technologies genuinely will be transformative. In this publication you'll read about how AI and machine learning techniques are allowing clinicians to diagnose conditions such as heart disease more accurately; how the banking and finance sectors are being revolutionised by algorithms; and how we're moving towards a world in which vehicles are able to drive themselves.

You'll also read about the fundamental scientific research underpinning these world-changing applications – research carried out by mathematicians, statisticians, computer scientists and information engineers.

Finally, you'll hear the views of key voices in the ethical, social and legal debates that inevitably arise alongside rapid technological advancement. How do we know the algorithms making decisions about our lives aren't biased? What is the likely impact of automation on jobs? Will we ever see the day when machines can truly think like humans?

And this selection only scratches the surface of Oxford's work in these areas. Working across disciplines, together with industry and government, with funders, third sector partners and colleagues at other universities, Oxford's world-leading researchers are perfectly placed to tackle the challenges and exploit the opportunities of the AI revolution.

Discover #OxfordAI here.
Contents

INTRODUCING AI
The two routes to AI
Analysing the shape of data
Automating science

APPLICATIONS
Making healthcare smarter
A game changer for drug discovery
Biomedical intelligence
Diagnosing heart disease
Autonomous vehicles
Transforming the finance sector
Saving the planet

ETHICS, SOCIETY AND GOVERNANCE
Robots thinking fast and slow
Automation and the future of work
Think about ethics now
Outwitting the 'junk news' nation
A flourishing future
The need for a legal framework
Are algorithms biased?
The Futuremakers podcast

For further information on Oxford's AI research or working with Oxford, visit ox.ac.uk/ai or contact [email protected]

INTRODUCING AI

The two routes to artificial intelligence
By Mike Wooldridge

'An early English to Russian translation program is said to have translated the sentence "The spirit was willing but the flesh was weak" as "The vodka was good but the meat was bad".'

AI is hard – I hope we can all agree about that. AI is hard for many reasons, but one of the most important is that it is a subject which is hard to approach directly. We have evidence that intelligent behaviour is possible – we provide that evidence – but the processes that lead to intelligent behaviour are hidden from view, inside our brains. We can't examine these processes directly, and so when we try to create intelligent behaviour, we have to start from a blank slate.

So, how do AI researchers go about building systems capable of intelligent behaviour? There are basically two types of approach, and one of these has been shown to be dramatically successful over recent years.

Let's suppose we want to write a program that can translate texts from English to French. Not very long ago, programs that could do this competently were firmly in the realm of science fiction, and progress in automated translation was so slow that it was something of a cruel inside joke for the AI community. Famously, an early English to Russian translation program is said to have translated the sentence 'The spirit was willing but the flesh was weak' as 'The vodka was good but the meat was bad'. Whether or not the story is true (it isn't), it has an inner truth: machine translation programs were plagued with problems, routinely making blunders in translation that a child would not make.

The main approach to machine translation, which was the dominant approach until this century, was what we might call model-based. With this approach, what we do is try to come up with a model of the behaviour we are trying to reproduce, and to give that model to a computer so that it can use it directly. For English to French translation, the models in question would be models of the English and French languages. First, we would define the structure of sentences (technically, the 'syntax' – what makes a grammatically acceptable English and French sentence and text). We then use that syntax to understand the structure of the text for translation, and hopefully from that we can derive the meaning of the text (the 'semantics'). Once we have that meaning, we can again go back to our model of the target language and construct a corresponding text from the meaning.

This approach to natural language understanding requires us to be able to come up with rules defining the grammar, how to extract the meaning from the structure of the text, and then how to generate a text from the meaning. So, researchers busily worked on all of these problems – for decades. Ever more elaborate ways of capturing text structure and meaning were developed, and there was progress. But translation using these approaches never achieved human-level or anything like it. The problem is, human languages are complicated and messy – they simply resist precise attempts to define their syntax and semantics, and are so full of subtleties, quirks and exceptions that they are seemingly impossible to nail down.

In the 1990s, another idea began to gain prominence, called statistical machine translation. Remarkably, with this approach there is no attempt to construct any kind of model or understanding of the language in question. Instead, what we do is start with a large number of examples of what we are trying to do (translated texts), and we use statistical methods to learn the probability of particular translations. The basic maths behind this approach is simple, but to make the approach work in practice required lots of data, and lots of processing time to compute the statistical associations.

Statistical machine translation achieved remarkable successes very quickly, and the field was turned on its head. And the same ideas began to be applied in other areas. Other learning techniques were investigated and found to work – one of them being deep learning, which is the hub of all the present excitement about AI.

So there, in a nutshell, are the two basic approaches to AI. With the first, you aim to create a model of the thing you are trying to achieve, and give that model to a computer. This has the advantage of being transparent – we can look at the model and see what is going on. But for complex, real-world problems, coming up with a practically useful model may be impossible. With the second, you don't worry about a model – you simply throw lots of examples of the behaviour you are trying to create at a machine learning program, and hope that the program learns the right thing to do. And the exciting thing is, at present, there is a lot of progress with this approach. The very big disadvantage of this approach is that it is opaque. Your program may learn to do its task better than a human, but it can't tell you how it does it, and ultimately the expertise of the program is encoded in (very long) lists of numbers – and nobody has any idea how to tell what those numbers mean.

This conundrum is at the heart of contemporary AI. We can have transparency or competency, but at present it seems we can't have both. And that is one of the biggest challenges for AI research today.

Mike Wooldridge, Professor of Computer Science, Department of Computer Science

Analysing the shape of data
By Heather Harrington and Ulrike Tillmann

'Datasets such as point clouds, networks or images that are generated often have vital information encoded in the shape of complex features.'

As the world is increasingly overwhelmed with data, the simple question arises: where do we turn to find new approaches to extract information from it? Not only do we have data on an unprecedented scale, much of this data is complex, multiscale, incomplete or noisy. Advances in statistical methodologies, combined with powerful computers, have provided new avenues to harness the information held in datasets – but is it enough? Where can we find the new ideas that will allow us to build upon existing techniques and unlock vital information from vast datasets?

It is a smart move to look towards mathematics, and topology is a branch of mathematics that is teeming with exciting …

… from TDA. These approaches can be broadly divided into vectorisation methods, which build an explicit feature map, or kernel-based methods, which build kernels on the topological summary. Once such a map is constructed, either explicitly or implicitly, these frameworks are amenable to combining TDA with machine learning.

A wide variety of applications have already benefited from TDA, including image processing, text analysis, chemistry, electromagnetism, materials science, fundamental physics, neuroscience and medicine.

Heather Harrington, Royal Society University Research Fellow
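The article above describes TDA's vectorisation methods as building an explicit feature map from a topological summary. As a toy illustration only – real TDA work uses libraries such as GUDHI or Ripser, and the function names and point cloud below are invented for this sketch – here is the simplest topological summary, 0-dimensional persistence, which tracks when clusters in a point cloud merge as a scale parameter grows, followed by a naive vectorisation of that summary:

```python
from itertools import combinations
import math

def h0_barcode(points):
    """0-dimensional persistence: every point is 'born' at scale 0;
    a cluster 'dies' when it merges into another as the scale grows.
    Returns the list of death scales (the merge distances)."""
    parent = list(range(len(points)))

    def find(i):
        # Union-find with path halving.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    # Consider all pairwise distances in increasing order.
    edges = sorted(
        (math.dist(points[i], points[j]), i, j)
        for i, j in combinations(range(len(points)), 2)
    )

    deaths = []
    for d, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:  # two clusters merge at scale d
            parent[ri] = rj
            deaths.append(d)
    return deaths

def feature_vector(points, k=3):
    """A naive vectorisation: the k largest death scales, zero-padded.
    This is the 'explicit feature map' idea in miniature."""
    deaths = sorted(h0_barcode(points), reverse=True)
    return (deaths + [0.0] * k)[:k]

# Two tight clusters far apart: one large death scale (the gap between
# the clusters) dominates the resulting feature vector.
cloud = [(0.0, 0.0), (0.1, 0.0), (10.0, 0.0), (10.1, 0.0)]
print(feature_vector(cloud))
```

The feature vector captures a shape property – "two well-separated clumps" – that a single summary statistic such as the mean would miss, and it can be fed directly into any standard machine learning model.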
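Earlier, in 'The two routes to artificial intelligence', statistical machine translation is described as starting from example translated texts and learning the probability of particular translations. The following is a minimal sketch of that idea only – a word-level co-occurrence count over a three-sentence toy corpus invented for illustration, nowhere near the alignment models production systems used:

```python
from collections import defaultdict

# Toy parallel corpus of (English, French) sentence pairs.
# The data is invented purely for illustration.
corpus = [
    ("the cat", "le chat"),
    ("the dog", "le chien"),
    ("a cat", "un chat"),
]

# Count how often each English word co-occurs with each French word
# in aligned sentence pairs...
counts = defaultdict(lambda: defaultdict(float))
for english, french in corpus:
    for e in english.split():
        for f in french.split():
            counts[e][f] += 1.0

# ...then normalise the counts into conditional probabilities
# p(french_word | english_word).
prob = {
    e: {f: c / sum(fs.values()) for f, c in fs.items()}
    for e, fs in counts.items()
}

# 'cat' co-occurs twice with 'chat' but only once each with 'le' and
# 'un', so 'chat' emerges as its most probable translation.
print(prob["cat"])  # → {'le': 0.25, 'chat': 0.5, 'un': 0.25}
```

Even this crude sketch shows the article's point: no grammar, syntax or semantics is modelled anywhere – the "knowledge" is just a table of numbers learned from examples, which is exactly why such systems are effective yet opaque.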