MIT researchers have reportedly designed a neuromorphic chip that mimics the connections between real human brain cells.
While experts have made significant progress in artificial intelligence in recent years, replicating human brain activity remains a major hurdle to true AI. That might change now that researchers from the Massachusetts Institute of Technology have designed a neuromorphic chip with artificial synapses.
Our brain is said to be packed with around a hundred billion neurons. Each neuron communicates instructions to other neurons via synapses. According to research, a human brain has over a hundred trillion of these synapses connecting its neurons. This dense network enables the brain to recognize patterns, remember facts, and accomplish other tasks in a flash.
Getting a computer to simulate brain activity is easier if the hardware itself is designed to act more like a brain. That is what MIT researchers in the emerging field of neuromorphic computing have done with their new computer chip.
MIT’s Neuromorphic Chip
Most processing chips compute with binary, on/off signaling. This new neuromorphic chip, however, is designed to work in an analog fashion.
“The elements of a ‘brain on a chip’ would work in an analog fashion, exchanging a gradient of signals, or ‘weights,’ much like neurons that activate in various ways depending on the type and number of ions that flow across a synapse,” MIT News reported.
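The distinction can be sketched in a few lines of code. This is an illustrative model, not MIT's actual circuit: the binary neuron only sees inputs as fully on or off, while the analog neuron scales each input by a continuously variable weight, the way a synapse passes a graded current.

```python
# Illustrative contrast (not MIT's actual design): binary, on/off
# signaling versus analog, graded synaptic "weights."

def binary_neuron(inputs):
    # Digital logic: every input is strictly on (1) or off (0),
    # and the neuron fires only past a hard threshold.
    return 1 if sum(inputs) >= 2 else 0

def analog_neuron(inputs, weights):
    # Analog synapses: each connection carries a graded signal,
    # scaled by a continuously adjustable weight (conductance).
    total = sum(x * w for x, w in zip(inputs, weights))
    return max(0.0, total)  # output is proportional to total current

print(binary_neuron([1, 0, 1]))                          # -> 1
print(analog_neuron([1.0, 0.5, 1.0], [0.8, 0.3, 0.1]))   # -> 1.05
```

Learning in such a system amounts to nudging those weights up or down, which is exactly what the chip's tunable synapses are meant to do in hardware.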
Researchers could combine several such brain chips to efficiently process millions of streams of parallel computations, something currently possible only with large banks of supercomputers.
Furthermore, the MIT chips were designed with artificial synapses that let researchers control the strength of the electric current flowing across them.
Previous neuromorphic chip designs used two conductive layers separated by an amorphous medium. One challenge with this approach is that, without defined structures to travel along, ions can take an infinite number of paths through the medium.
“Once you apply some voltage to represent some data with your artificial neuron, you have to erase and be able to write it again in the exact same way,” Jeehwan Kim, lead researcher of the study, said.
“But in an amorphous solid, when you write again, the ions go in different directions because there are lots of defects. This stream is changing, and it’s hard to control. That’s the biggest problem: nonuniformity of the artificial synapse.”
Replicating A Neural Synapse
To resolve this issue, Kim and his colleagues layered silicon germanium, a material also used in transistors, on top of a silicon wafer. Together the two materials form a funnel-shaped dislocation, creating a single well-defined path through which ions can flow.
The neuromorphic chip was equipped with artificial synapses made from this silicon germanium, each measuring about 25 nanometers across. In one experiment, the researchers applied a voltage to each synapse and found that all of them showed nearly the same flow of ions.
“This is the most uniform device we could achieve, which is the key to demonstrating artificial neural networks,” Kim said.
As a final test, the MIT researchers ran “a computer simulation of an artificial neural network consisting of three sheets of neural layers connected via two layers of artificial synapses” to see whether the device could carry out actual learning tasks.
Tens of thousands of samples from a handwriting dataset commonly used by neuromorphic designers were fed into the simulation. The artificial neural network hardware recognized the handwritten samples with 95 percent accuracy.
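The simulated architecture, three neural layers joined by two layers of artificial synapses, maps directly onto two weight matrices. The sketch below assumes illustrative layer sizes (784 inputs for a 28x28 image, 128 hidden units, 10 digit classes); the source does not give the actual dimensions, and the weights here are random rather than trained.

```python
import numpy as np

# Sketch of the simulated architecture: three neural layers connected
# by two layers of artificial synapses (the weight matrices W1, W2).
# Layer sizes are assumptions for illustration only.

rng = np.random.default_rng(0)
W1 = rng.normal(0, 0.1, (784, 128))  # first synapse layer: input -> hidden
W2 = rng.normal(0, 0.1, (128, 10))   # second synapse layer: hidden -> output

def forward(x):
    h = np.maximum(0.0, x @ W1)      # hidden layer (ReLU activation)
    logits = h @ W2                  # output layer scores
    return int(np.argmax(logits))    # predicted digit class, 0-9

x = rng.random(784)                  # stand-in for one handwritten image
print(forward(x))                    # prediction (untrained, so arbitrary)
```

In the hardware version, each entry of W1 and W2 would be stored as the conductance of one silicon germanium synapse, so the matrix multiplications happen physically, as currents summing on wires.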
Right now, the team is fabricating a neuromorphic chip that would perform handwriting-recognition tasks in reality, not just in simulation. The researchers believe their design will enable smaller, more portable neural networks capable of complex computations that are currently possible only with supercomputers.
“Ultimately we want a chip as big as a fingernail to replace one big supercomputer,” Kim said. “This opens a stepping stone to produce real artificial hardware.”