The Cognitive Computing Era is Here: Are You Ready?
By Peter Fingar. Based on excerpts from the new book Cognitive Computing: A Brief Guide for Game Changers www.mkpress.com/cc
Artificial Intelligence is likely to change our civilization as much as, or more than, any technology that's come before, even writing. --Miles Brundage and Joanna Bryson, Future Tense
The smart machine era will be the most disruptive in the history of IT. --Gartner, The Disruptive Era of Smart Machines Is Upon Us
Without question, Cognitive Computing is a game-changer for businesses across every industry. --Accenture, Turning Cognitive Computing into Business Value, Today!
The Cognitive Computing Era will change what it means to be a business as much as, or more than, the introduction of modern Management by Taylor, Sloan and Drucker in the early 20th century. --Peter Fingar, Cognitive Computing: A Brief Guide for Game Changers
The era of cognitive systems is dawning, building on today's computer programming era. All machines, for now, require programming, and by definition programming does not allow for alternate scenarios that have not been programmed. Allowing for alternate outcomes would require going up a level and creating a self-learning Artificial Intelligence (AI) system. Via biomimicry and neuroscience, Cognitive Computing does exactly this, taking computing concepts to a whole new level. Once-futuristic capabilities are becoming mainstream. Let's take a peek at the three eras of computing.
Fast forward to 2011, when IBM's Watson won Jeopardy! Google recently acquired DeepMind for a reported $500 million. Facebook recently hired NYU professor Yann LeCun, a respected pioneer in AI. Microsoft has more than 65 PhD-level researchers working on deep learning. China's search company Baidu hired Stanford University's AI professor Andrew Ng. All this has a lot of people talking about deep learning. While artificial intelligence has been around for years (John McCarthy coined the term in 1955), deep learning is now considered cutting-edge AI, an evolution beyond primitive neural networks.

Taking a step back to set the foundation for this discussion, let me review a few of these terms. As human beings, we have complex neural networks in our brains that allow most of us to master rudimentary language and motor skills within the first 24 months of our lives, with only minimal guidance from our caregivers. Our senses provide the data to our brains that allows this learning to take place. As we become adults, our learning capacity grows while the speed at which we learn decreases. We have adapted to this limitation by creating assistive machines. For over 100 years, machines have been programmed with instructions for tabulating and calculating, assisting us with better speed and accuracy. Today, machines can be taught to learn much faster than humans, as in the field of machine learning, where systems learn from data (much as we humans do). This learning takes place in artificial neural networks, which are designed based on studies of the human neurological and sensory systems. Artificial neural nets make computations based on input data, then adapt and learn. In machine learning research, when a system learns high-level abstractions of data through layers of non-linear processing, it is said to be engaged in deep learning, the prime directive of current advances in AI.
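The adapt-and-learn loop described above can be made concrete with a single artificial neuron that learns a simple function from examples. This is a minimal illustrative sketch of my own, not code from any of the systems named here; the function names are invented for the example:

```python
# A single artificial neuron that "learns from data": it repeatedly
# compares its output with the desired output and nudges its weights
# to shrink the error -- the compute, adapt, learn cycle in miniature.
from math import exp

def sigmoid(x):
    return 1.0 / (1.0 + exp(-x))

def train_neuron(samples, epochs=5000, lr=0.5):
    """samples: list of ((x1, x2), target) pairs with targets 0 or 1."""
    w1, w2, bias = 0.0, 0.0, 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = sigmoid(w1 * x1 + w2 * x2 + bias)
            grad = (target - out) * out * (1.0 - out)  # error times sigmoid slope
            w1 += lr * grad * x1
            w2 += lr * grad * x2
            bias += lr * grad
    return w1, w2, bias

# Teach the neuron logical AND from just four examples.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w1, w2, b = train_neuron(data)

def predict(x1, x2):
    return round(sigmoid(w1 * x1 + w2 * x2 + b))
```

Nothing in the code spells out the AND rule; the neuron infers it from the data, which is the essential difference from conventional programming that the passage above describes.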
Cognitive computing, or self-learning AI, combines the best of human and machine learning and essentially augments us. When we associate names with current computer technology, no doubt Steve Jobs or Bill Gates come to mind. But the new name will likely be a guy from the University of Toronto, the hotbed of deep learning scientists. Meet Geoffrey Everest Hinton, great-great-grandson of George Boole, the man who gave us the mathematics that underpin computers. Hinton is a British-born computer scientist and psychologist, most noted for his work on artificial neural networks. He now works for Google part time, joining AI pioneer and futurist Ray Kurzweil, and Andrew Ng, the Stanford University professor who set up Google's neural network team in 2011. Hinton is a co-inventor of the backpropagation, Boltzmann machine, and contrastive divergence training algorithms, and is an important figure in the deep learning movement. His research has implications for areas such as speech recognition, computer vision and language understanding. Unlike past neural networks, newer ones can have many layers and are called deep neural networks.
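That closing distinction, that a deep neural network is simply many layers stacked one after another, can be sketched as a forward pass through stacked layers. This is an illustrative sketch with invented names and random weights; real deep networks train these same layers with backpropagation:

```python
# A "deep" network is just many layers applied in sequence, with a
# non-linear function (here ReLU) between them. Depth lets later
# layers build higher-level abstractions from earlier ones.
import random

def relu(values):
    return [max(0.0, v) for v in values]

def layer(inputs, weights, biases):
    """One fully connected layer: each output is a weighted sum of all inputs."""
    return [sum(w * x for w, x in zip(row, inputs)) + b
            for row, b in zip(weights, biases)]

def deep_forward(x, layers):
    for weights, biases in layers:
        x = relu(layer(x, weights, biases))
    return x

# Build a small untrained 3-layer network: 4 inputs -> 5 -> 5 -> 2 outputs.
random.seed(0)
sizes = [4, 5, 5, 2]
net = [([[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_out)],
        [0.0] * n_out)
       for n_in, n_out in zip(sizes, sizes[1:])]

out = deep_forward([1.0, 0.5, -0.5, 0.2], net)
```

The shallow networks of earlier decades were essentially one such layer; the "deep" in deep learning refers to chaining many of them, which is what Hinton's training algorithms made practical.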