Issue 27 | Data reported 31.03.19

THE HUMAN ADVANTAGE IN AN AI WORLD

"Most people think that AI means that humans will be totally automated out of the value chain, but that's not what we're hearing from the biggest tech companies in Silicon Valley." Leila Janah, CEO, Samasource, page 6

Photo courtesy of Samasource

Consider this: the AI era could also usher in a new era of human creativity and initiative. Contrary to what the doom and gloom headlines would have us believe, the rise of artificial intelligence does not need to be synonymous with the fall of human intelligence.

In this issue of the Review we look at the impact that AI has already started to have. While it's very clever, it does lack any common sense, and it is still critical for people to question AI's decision-making to ensure the correct goals are being achieved. In fact, behind every AI innovation is a person, and they aren't all software engineers. There's an increasing army of human data labellers who are feeding AI with the information it needs to function. In our lead interview we talk to Leila Janah, the founder and CEO of Samasource, which has created thousands of AI jobs in Africa.

Closer to home, Dr Peggie Rothe analyses how technology has affected our workday; instead of doing fewer tasks in a day, employees are doing more than ever before. The increase of technology requires new ways of working, particularly to facilitate human interaction. Increased human interaction is vital, as our Provocateur Damian Hughes points out: 'no AI system will ever help the bottom line as much as a strong team'.

AI is not the enemy, but it's also not the answer. What we do know is that creating a people-centric culture, championing empathy, collaboration and innovation, is a proven strategy for success.
Opinion | Tim Oldman, Founder & CEO, Leesman

"I'M SORRY DAVE, I'M AFRAID I CAN'T DO THAT"

In Stanley Kubrick's 1968 epic 2001: A Space Odyssey, the mission's unnervingly sentient 'Heuristically programmed ALgorithmic' computer, known as HAL, self-assuredly proclaims himself "by any practical definition of the word, foolproof and incapable of error".

Just 43% of employees globally acknowledge that 'learning from others' is an important work activity¹. Tim Minshall, Head of the Institute for Manufacturing at the University of Cambridge, describes Knowledge Transfer as a "contact sport; it works best when people meet to exchange ideas – it's all about the transfer of tangible and intellectual property, expertise, learning and skills".

HAL's declaration is of course flawed, and in his eventual downfall Kubrick and collaborator Arthur C Clarke were almost certainly exploring what we might today refer to as algorithmic or machine learning bias: a phenomenon that occurs when an algorithm produces results that are systematically prejudiced due to erroneous assumptions in the machine learning process. Fifty-one years later, the balance between science fact and fiction is no less blurred. Even an innocuous-seeming software or systems upgrade can have irritating, disruptive, or at worst, devastating effects. Take, for example, Leesman's recent experience.

Friday April 5th: an internal email from our Dr Peggie Rothe warns all client-facing team members that our client relationship management (CRM) and workflow application has developed an anomaly. Team members create a project 'card' on receipt of a client enquiry and we track its development and velocity towards becoming a live project. As a go-live project date develops greater certainty, a 'close date' is added to the card, directly informing both our workflow and cashflow analysis.

But the Doc, feared and famed for her attention to the finest numeric detail, had spotted that all new cards were suddenly including close dates seemingly of their own design. Her hypothesis was that the application's developers had added new functionality that would predict a close date based on the characteristics of the project card – client name, card value, project complexity, person creating the card, etc. – all presumably based on historic patterns.

In an application features list, this might appear a cool new bit of predictive functionality. But it left us having to investigate how to manually override or cancel something others thought we'd appreciate, as it suddenly and dramatically threw our management information dashboards off course.

The email came at the end of a bad week in the news for the designers of similar, though admittedly hugely more complex, systems. Ethical hackers at the Tencent Keen Security Lab had exposed that software flaws in Elon Musk's Tesla S – one of the most advanced cars on the road – had enabled them to confuse the vehicle's lane recognition system into thinking the straight road ahead actually curved and, in so doing, switch lanes, directing the vehicle into the path of oncoming traffic.

Most interestingly, the 'fake lane attack' was an example of a new type of incredibly low-tech hack, comprising nothing more than a series of strategically placed 'interference stickers' on the road surface, so that perversely the very image recognition systems designed to keep the vehicle in the centre of its lane would do the very opposite.

Musk's typically tactical response complimented the discovery as "solid work" that would help accelerate the advancement of such systems, then dismissed it as somewhat irrelevant, since the 'autopilot' could be overridden at any moment and the systems were never intended to offer an autonomous experience that would replace the need for an adult behind the controls of the vehicle.

"The balance between science fact and fiction is still blurred. Even an innocuous-seeming software or systems upgrade can have irritating, disruptive, or at worst, devastating effects."

But this came in the same week that Ethiopian air accident investigators announced that the captain and first officer of the Ethiopian Airlines Boeing 737 Max that had crashed three weeks prior had indeed correctly taken manual control of their aircraft and followed all of the specified emergency procedures laid out by Boeing. Despite their efforts, all 157 lives on board were lost.

Initial investigations centre on the role of the aircraft's Manoeuvring Characteristics Augmentation System (MCAS). These systems were already under industry scrutiny following the loss of Indonesia's Lion Air 737 Max flight JT610, 13 minutes after taking off from Jakarta in October 2018, killing all 189 people on board. The last moments of both flights exhibited strong similarities.

Boeing's MCAS is a software protocol unique to the 737's Max variant, introduced principally in response to the different in-flight handling characteristics of the Max resulting from the specification of new 10% more-efficient engines. These new engines display 'non-linear lift' characteristics, which will mean very little to most. At the risk of grossly oversimplifying the issue: the aircraft has a different centre of gravity, and the physical instructions the pilot is inputting through the yoke, and the feedback he is getting back from it, can put the pilot at risk of pitching the aircraft at too steep an angle during take-off.

MCAS kicks in if sensors detect that the aircraft's Angle of Attack (AOA), or climb, is too aggressive, in order to prevent the aircraft stalling. It automatically adjusts the 'elevators' at the tail of the aircraft to point the nose downward. MCAS should only engage when the autopilot is disengaged – predominantly when the pilot is 'hand flying' at take-off or landing.

The yoke is designed to shake violently if a stall is detected. However, evidence from the Lion Air flight shows this happening despite the aircraft at that point not being at risk of stall. The flight crew were also contending with incorrect altitude and airspeed readings. Flight data shows the Lion Air aircraft dipped and dropped 700ft of altitude moments later, before the pilots halted the drop. This pattern continued.

Details from the initial Ethiopian Airlines investigation suggest that, faced with immediate difficulty controlling the aircraft's initial climb, perhaps as a result of the non-linear lift characteristics, the pilot sought the support of the autopilot and engaged it almost immediately. Cockpit voice recordings are reported to confirm that the captain called out three times to "pull up", and seconds after instructed the first officer to tell Air Traffic Control that they had a flight control problem.

With the autopilot failing to help the situation, it was disengaged by the flight crew. But as the subsidiary MCAS automation appears to kick in, each time it appears to worsen the problem. Reuters was reporting that cockpit data confirmed the pilot and first officer had then both correctly followed emergency checklist protocols and manually disabled the MCAS, taking direct mechanical control of the tail stabilisers. This should have immediately brought the nose of the aircraft back level. Yet the MCAS system may have repeatedly […]

When the aircraft eventually hit the […] aircraft handling, the flight crew's ability to decipher those anomalies, the various computerised systems' ability to correct or take over situation management, or for those systems to recognise they were acting on flawed data.

"AI is designed by humans and […]"

Back at Leesman, our CRM system's 'functionality enhancement' wasn't the result of a sentient computer's insistence that it knew better, but instead the result of the insistence of a developer/engineer/data scientist somewhere that they knew better. As the sophistication of those systems increases, so too should the thoroughness of the testing and licensing of those systems, and our understanding of the unintentional bias built into them, by virtue of the individuals and/or the design/engineering systems and steps involved in creating them.

1 Leesman Index 2019
2 Photo by Movie Poster Art/Getty Images
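The MCAS behaviour described above – act only when the autopilot is off, and push the nose down when the Angle of Attack reading looks too aggressive – can be caricatured in a few lines of code. This is a deliberately simplified, hypothetical sketch, not Boeing's implementation: the function names, the 15-degree threshold, and the single-sensor read are all assumptions. Its point is the article's point: a single flawed sensor value flows straight into a control decision.

```python
# Hypothetical, grossly simplified sketch of MCAS-style logic.
# Names and the 15-degree threshold are illustrative assumptions.

STALL_AOA_DEGREES = 15.0  # assumed threshold for an "aggressive" climb


def mcas_nose_down_command(autopilot_engaged: bool, aoa_sensor_degrees: float) -> float:
    """Return a nose-down elevator adjustment (0.0 means no action)."""
    if autopilot_engaged:
        # MCAS should only act while the pilot is hand flying.
        return 0.0
    if aoa_sensor_degrees > STALL_AOA_DEGREES:
        # Trust the sensor: push the nose down to avert a stall.
        # If the sensor reading is wrong, so is this command.
        return 2.5
    return 0.0


# A faulty sensor reporting a steep climb during level hand flying
# still triggers the nose-down command:
print(mcas_nose_down_command(autopilot_engaged=False, aoa_sensor_degrees=40.0))
```

Nothing in the sketch asks whether the AOA reading is plausible given altitude and airspeed, which is precisely the cross-check the article suggests the real systems struggled with.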
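Dr Rothe's hypothesis about the CRM anomaly – that the application now infers a close date from a card's characteristics and historic patterns – amounts to a very ordinary piece of predictive modelling. A minimal sketch of the idea (entirely hypothetical; we have no sight of the vendor's actual model, and the field names are invented) might look like this:

```python
# Hypothetical sketch of the suspected close-date prediction:
# estimate days-to-close for a new project card from historic cards
# sharing its characteristics. Field names are illustrative assumptions.
from statistics import mean


def predict_days_to_close(new_card: dict, history: list) -> float:
    """Average days-to-close of historic cards from the same client,
    falling back to the overall historic average."""
    similar = [c["days_to_close"] for c in history
               if c["client"] == new_card["client"]]
    if similar:
        return mean(similar)
    return mean(c["days_to_close"] for c in history)


history = [
    {"client": "Acme", "days_to_close": 30},
    {"client": "Acme", "days_to_close": 50},
    {"client": "Globex", "days_to_close": 90},
]
print(predict_days_to_close({"client": "Acme"}, history))     # 40.0
print(predict_days_to_close({"client": "Initech"}, history))  # overall average
```

Even this toy version shows why the feature "threw the dashboards off course": it writes a confident-looking date onto every card, whether or not the historic pattern behind it is meaningful for that client.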