International Research Journal of Computer Science (IRJCS) ISSN: 2393-9842 Issue 09, Volume 07 (September 2020) https://www.irjcs.com/archives

IMPACT OF ARTIFICIAL INTELLIGENCE AND NATURAL LANGUAGE PROCESSING ON PROGRAMMING AND SOFTWARE ENGINEERING

Radha Guha, Ph.D.
Professor of Computer Science and Engineering
[email protected], [email protected]

Publication History
Manuscript Reference No: IRJCS/RS/Vol.07/Issue09/SPCS10083
Received: 29 August 2020; Accepted: 03 September 2020; Published: 14 September 2020
DOI: https://doi.org/10.26562/irjcs.2020.v0709.003
Citation: Radha Guha (2020). Impact of Artificial Intelligence and Natural Language Processing on Programming and Software Engineering. IRJCS: International Research Journal of Computer Science, Volume VII, 238-249. https://doi.org/10.26562/irjcs.2020.v0709.003
Peer-review: Double-blind peer-reviewed
Editor: Dr. A. Arul Lawrence Selvakumar, Chief Editor, IRJCS, AM Publications, India
Copyright: ©2020 This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Abstract: The beta release of OpenAI's artificial intelligence natural language processor GPT-3 in June 2020 is triggering a lot of excitement in the machine learning community as well as among software developers. GPT stands for Generative Pre-trained Transformer. GPT-3 is a general-purpose transformer-based text generator and the largest deep neural network model ever built by any artificial intelligence (AI) company. The mind-blowing capability of this transformer-based natural language processing (NLP) model is fascinating as well as awe-inspiring to the programming community. It is a language model that takes input text in natural language and predicts output text in another form. It can write fiction, generate programming code and design websites, prompted only by a natural language description. This paper investigates how advances in AI and NLP are impacting programming, the software development process, and software products and services. It also suggests how software developers will have to reposition and re-skill themselves in this disruptive technology era, when an AI language model like GPT-3 has an answer to every question posed in natural language. The paper first describes the triumph of AI over the last two decades, culminating in the latest GPT-3 model, and examines its capabilities. It then traces the evolution of programming languages and software development methodologies in the changing scenario of parallel and distributed computing of big data, cloud computing, web services and AI integration. What is observed is that computer applications and mobile apps today are integrations of a conglomerate of components accessed over the internet through application programming interfaces (APIs). Incumbent programmers are thus informed about this digital transformation and the very rapid pace of advancement in machine learning and AI NLP, so that they can prepare themselves for future software engineering needs and demands.

Keywords: Artificial Intelligence (AI), Natural Language Processing (NLP), Machine Learning, Programming Evolution, Software Engineering Evolution, GPT-3

I. INTRODUCTION: History of Artificial Intelligence

Before talking about artificial intelligence, let us first see what natural intelligence is.
Natural human intelligence incorporates perceiving and analysing visual scenes, understanding and speaking a natural language like English, computing arithmetic and logic operations, having memory, continuously learning from new experiences, and inferring for generalization and decision making. In artificial intelligence, we strive to teach a computer to mimic human intelligence. The human brain consists of approximately 100 billion neurons, connected with each other by 100 to 500 trillion synapses. The neurons process external stimuli from the nervous system and transmit information through the synapses to other neurons to carry out all the complex tasks we call natural intelligence.

The history of artificial intelligence [1], [2], [3], [4], [5], [6] is as follows. In 1943, McCulloch and Pitts theorized a computing element similar to a biological neuron of the human brain. An illustration of a biological neuron and an artificial neuron is shown in Figure 1. In the artificial neuron, the output is a function f (either sigmoid, tanh or ReLU) of the sum of n inputs (x_i) multiplied by connection strengths (w_i), given by: output = f(\sum_{i=1}^{n} w_i x_i). McCulloch and Pitts postulated that a network of these computing elements [Figure 2], with adjustable interconnect weights resembling the synapses in the human brain, would be able to solve any computable function.

Figure 1: Biological Neuron vs. Artificial Neuron
Figure 2: Multi-Layer Perceptron (MLP), a Neural Network Architecture

In the meantime, after the invention of the silicon transistor in 1947, the first digital computers were built. Since then, the advancement of computer hardware and software in tandem has been creating a computing revolution. In 1950, the British computer scientist Alan Turing proposed that a computer can be declared intelligent if it passes the 'Turing Test': if a human user, after interacting with the computer, can be fooled into believing he has interacted with another human, then the computer passes the test. In 1956, John McCarthy, later a professor at Stanford University, coined the term 'Artificial Intelligence' (AI) for the first time and founded the AI discipline. He said intelligent features like learning can also be imparted to a computing machine. In 1962, Frank Rosenblatt showed that networks of perceptrons, a mathematical model of biological neurons as in Figure 2, are capable of learning and of recognizing patterns in input data. Researchers then employed this neural network architecture to solve simple puzzles and play computer games.

Even though AI research started with much fanfare in the 1950s and 1960s, it could not make much progress because of the dearth of computing power. In the last two decades, AI research in deep neural network learning has picked up steam with the advent of more powerful computers and parallel processing hardware [7], [8], [9]. The performance (MFLOPS) of sequential processors, i.e. Intel CPUs, could be increased steadily only until the end of the previous century. Then parallel processing in the graphics processing unit (Nvidia GPU, 1999) came into existence, whose throughput is much higher than that of CPUs.
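To make the artificial-neuron formula above concrete, the following is a minimal NumPy sketch of a single neuron and of a small multi-layer perceptron like the one in Figure 2. The layer sizes, random weights and the choice of sigmoid activation are illustrative assumptions, and bias terms are omitted just as they are in the formula above; this is a sketch, not a reference implementation.

import numpy as np

def sigmoid(z):
    """Sigmoid activation: squashes any real number into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def neuron_output(x, w, activation=sigmoid):
    """One artificial neuron: output = f(sum_i w_i * x_i)."""
    return activation(np.dot(w, x))

def mlp_forward(x, weight_matrices, activation=sigmoid):
    """Forward pass through an MLP (Figure 2): each layer is a bank of
    neurons, so one matrix-vector product computes all of that layer's
    weighted sums at once before the activation is applied."""
    a = x
    for W in weight_matrices:
        a = activation(W @ a)
    return a

# Illustrative values only: 3 inputs -> 4 hidden neurons -> 1 output.
rng = np.random.default_rng(0)
x = np.array([0.5, -1.2, 3.0])
layers = [rng.normal(size=(4, 3)), rng.normal(size=(1, 4))]
print(mlp_forward(x, layers))  # a single prediction in (0, 1)

Learning, in this picture, is exactly what McCulloch and Pitts' postulate suggests: adjusting the entries of the weight matrices until the network's outputs match the desired function.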
Then Google and Yahoo came up with big data processing platforms such as Hadoop (2006), MapReduce and Spark for the distributed parallel processing of big data. Google has since created an application-specific integrated circuit (ASIC) hardware framework called the tensor processing unit (TPU, 2018) as an AI accelerator, specifically for neural network based machine learning (ML) algorithms. Machine learning and deep learning algorithms, which consist largely of matrix multiplications and additions, are parallelizable and benefit from the parallel architecture of GPUs and TPUs. To ease the implementation of complex ANN-based ML models, Google has come up with the TensorFlow software platform for the Python programming language, open source since 2015 (a brief sketch of its use appears at the end of this section). For large-scale production deployment, the Python framework PyTorch likewise exploits the power of GPUs and TPUs.

Also, at the beginning of the 1990s, the Internet opened the door to accessing information from anywhere at any time. This started a smart revolution in the world, with digital transformation and the use of Internet of Things (IoT) devices [10], [11], smart sensors, and machine learning and deep learning algorithms to build smart products and services. Since then, the amount of unstructured text data on the internet has catapulted, and it is growing at exponential velocity with every passing day. From the late 1990s, we have been seeing smart AI products like robots for autonomous control in industry. Software agents are also used in search engines for collecting relevant information by crawling the internet. Businesses can now employ ML algorithms to analyse this big historical data and make wise decisions for the future. Nowadays, every major technology company is investing heavily in artificial intelligence products and services to build a smarter cognitive environment that blurs the gap between man and machine.

In 1997, IBM's Deep Blue computer defeated the then reigning world chess champion Garry Kasparov. In 2005, Honda's ASIMO was the best humanoid robot. In 2011, IBM's Watson, a question answering chatbot, defeated the champions of the general knowledge game show Jeopardy! on American national TV. From 2000 onwards, several social media networking sites like Facebook, Twitter and LinkedIn have popped up for the online engagement of users. Born in 2004, Facebook is the most popular free social networking site for staying connected.
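As a rough illustration of the TensorFlow platform mentioned above, the sketch below builds and trains a tiny Keras MLP. The XOR toy task, layer sizes, optimizer and epoch count are all hypothetical choices made for illustration; the point is only that a few lines of Python stand in for the hand-written matrix arithmetic of the previous sketch, and that TensorFlow dispatches those matrix operations to a GPU or TPU when one is available.

import tensorflow as tf  # TensorFlow 2.x, open source since 2015

# Hypothetical toy task: learn y = x1 XOR x2 with a tiny MLP.
X = tf.constant([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = tf.constant([[0.], [1.], [1.], [0.]])

model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation="tanh", input_shape=(2,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # output in (0, 1)
])

# Keras compiles the layers into a data-flow graph; on a machine with a
# GPU or TPU the underlying matrix operations run there automatically.
model.compile(optimizer="adam", loss="binary_crossentropy")
model.fit(X, y, epochs=500, verbose=0)
print(model.predict(X).round())  # expected to approximate [[0],[1],[1],[0]]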
