Experts Say We Need to Start Over to Build True Artificial Intelligence

Laurent T | Shutterstock.com

We are achingly close to the development of true artificial intelligence, but some contend that a fresh start is needed to rectify missteps along the way.

Geoffrey Hinton is the godfather of AI. He helped chart the path for neural networks as early as 1986, but now he says researchers need to go back to square one. The issue lies with a prevalent technique in AI development called "backpropagation".

What is this and why does it make Hinton question ~30 years of AI research?

Geoffrey Hinton | Johnny Guatto | University of Toronto

AI Godfather Questions Developments in Memory Storage

Geoffrey Hinton has been called the “Godfather of Deep Learning”.

He was among the first people to receive a Ph.D. in artificial intelligence. No, really: he earned one from the University of Edinburgh in 1977. Since then, Hinton has pursued research with the end goal of true AI. He currently splits his time between Google and teaching at the University of Toronto.

How memory works and stores information is his main interest in the development of AI. That's why the concept of "backpropagation" irks him so: it relates directly to how AIs learn and store information. Since machine learning and deep learning are the keys to unlocking true AI, Hinton's claims are ground-shaking.

Neural Networks and Deep Learning | Michael Nielsen

The first inklings of backpropagation stirred as early as the 1960s. But the notion took root in 1986, when Hinton and his colleagues Ronald Williams and David Rumelhart published a landmark paper on the subject. The equations might look like gibberish, but the results were mind-blowing.

They became the basis of most modern machine learning.

Crux of AI Programming Actually a Crutch


Since their conception, backpropagation algorithms have become the "workhorses" of the majority of AI projects. Their appeal isn't raw speed so much as the way repeated fine adjustments let a network pick up finer details for better comprehension overall.

The way it works, however, is surprisingly mechanical: it relies on labeled examples and numerical "weights". In neural layers loosely modeled on the human brain, the weights applied to photos or voice samples get adjusted repeatedly to shrink the error.

The layer-by-layer process repeats until the neural network can efficiently and effectively perform the appropriate functions with few to no errors.
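The layer-by-layer adjustment loop described above can be sketched in a few lines of NumPy. This is a minimal illustration, not code from Hinton's work: a tiny two-layer network learns XOR by repeatedly pushing inputs forward, propagating the error backward, and nudging the weights. The layer sizes, learning rate, and iteration count are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # labeled XOR targets

W1 = rng.normal(size=(2, 8))  # input -> hidden weights
W2 = rng.normal(size=(8, 1))  # hidden -> output weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

errors = []
for step in range(5000):
    # Forward pass: push the inputs through each layer in turn.
    hidden = sigmoid(X @ W1)
    output = sigmoid(hidden @ W2)

    # Backward pass: propagate the error from the output layer
    # back through the hidden layer -- the "backpropagation" itself.
    err = output - y
    grad_out = err * output * (1 - output)
    grad_hid = (grad_out @ W2.T) * hidden * (1 - hidden)

    # Repeatedly adjust the weights to reduce the error.
    W2 -= 0.5 * hidden.T @ grad_out
    W1 -= 0.5 * X.T @ grad_hid
    errors.append(float(np.mean(err ** 2)))

print(f"error before training: {errors[0]:.3f}, after: {errors[-1]:.3f}")
```

The point Hinton makes is visible in the sketch: every weight adjustment depends on the labeled targets in `y`. The network never learns anything it wasn't explicitly told to match.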

As Hinton contends, this process is antithetical to the creation of true AI.

Neural networks need to learn on their own in an “unsupervised” way, as Hinton puts it. If they have to catalog potentially billions of images, sounds, and inputs in order to think, then that’s not really artificial intelligence.

Re-examining Memory, Processing, & More

Like Hinton, other technical minds are moving away from "traditional" schools of thought on AI and higher-level processing. Designs modeled on the complex human brain may soon replace the almighty chip as the preferred processor.

Xuedong Huang | Scott Eklund | Red Box Pictures via CRM Daily

In the interim, researchers have turned to GPUs (yes, graphics processing units), particularly Nvidia's, for speech recognition.

The catch with this approach is sheer computing power. In fall 2016, Xuedong Huang and his team at Microsoft created an AI that recognized words better than humans did.

And while AIs are meant to learn quickly and autonomously, this approach demands far more trial and error. As with backpropagation, hundreds of algorithms crunch the inputs, and that requires an immense amount of computing power.

But that hasn't stopped companies from pursuing this line of research. Google, Qualcomm, and Nvidia are all working on (or have already developed) proprietary AI chips.

Are we as close to the advent of true artificial intelligence as we thought? If back propagation isn’t the answer to AI development, what is?

