Researchers have reportedly developed a new simulation technology that could mimic the entire human brain on future exascale systems.

In another breakthrough in the field of neuromorphic computing, an international group of researchers has created a simulation technology that can be used to mimic brain-scale networks on future supercomputers of the exascale class.

In the study, published in the journal Frontiers in Neuroinformatics, the researchers reported how their new algorithm could represent a larger portion of the human brain while using the same amount of computer memory.


At the same time, the algorithm can also speed up the brain simulation on present-day supercomputers.

“Since 2014, our software can simulate about one percent of the neurons in the human brain with all their connections,” Markus Diesmann, Director at the Jülich Institute of Neuroscience and Medicine (INM-6) in Germany, said.

To accomplish this feat, the researchers used the entire memory of petascale supercomputers such as the K computer in Japan and JUQUEEN in Germany.

“We run benchmarks on three HPC systems that are commonly employed for (neuro)scientific research: the JUQUEEN BlueGene/Q and JURECA systems at the Jülich Research Centre, Germany, and the K computer at the Advanced Institute for Computational Science in Kobe, Japan,” the researchers wrote in their paper.

The K computer in Kobe, Japan, one of the systems used to develop the new simulation technology | Toshihiro Matsui | Wikipedia.org

The New Brain Simulation Technology

Even with the most advanced technology available today, building a simulation that copies the full functionality of the brain remains out of reach.

With over 100 billion interconnected nerve cells in the human brain, it is easy to see why simulating it is so incredibly complicated.

Even with Diesmann and his team’s achievement, their algorithm could only simulate about one percent of the neurons in a human brain.

For over 20 years, Diesmann has been working on the simulation software NEST, a free, open-source simulation code widely used within the neuroscientific community.

It is also considered the core simulator of the European Human Brain Project, where Diesmann leads several projects in theoretical neuroscience and on the High-Performance Analytics and Computing Platform.

“NEST is an open-source software tool that is designed for the simulation of large-scale networks of single-compartment spiking neuron models…The collaborative development of NEST follows an iterative, incremental strategy derived from the requirements and constraints given by the community,” the researchers explain.


Through NEST, the researchers were able to represent each neuron in the network using a number of mathematical equations.
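To make that concrete, here is a minimal sketch of what such a model looks like through NEST's Python interface, PyNEST. The neuron model, population size, and parameter values are arbitrary placeholders rather than the networks benchmarked in the paper, and the object names follow NEST 3.x (older releases call the spike_recorder a spike_detector):

```python
import nest

nest.ResetKernel()

# A population of leaky integrate-and-fire neurons; NEST integrates a
# small set of differential equations for each neuron's membrane state.
neurons = nest.Create("iaf_psc_alpha", 1000)

# Sparse random connectivity: each neuron receives input from 100
# randomly chosen neurons in the population.
nest.Connect(neurons, neurons,
             conn_spec={"rule": "fixed_indegree", "indegree": 100},
             syn_spec={"weight": 20.0, "delay": 1.5})

# Poisson background input keeps the network active.
noise = nest.Create("poisson_generator", params={"rate": 8000.0})
nest.Connect(noise, neurons, syn_spec={"weight": 10.0, "delay": 1.5})

# Record the spikes the network emits.
recorder = nest.Create("spike_recorder")
nest.Connect(neurons, recorder)

# Advance the network by one second of biological time.
nest.Simulate(1000.0)
```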

Exascale computers are expected to surpass the capabilities of today's high-end supercomputers by 10- to 100-fold.

According to the team, that leap in performance, paired with the new algorithm, would give researchers the computational power they need to simulate neuronal networks on the scale of a full human brain.

To date, one of the most significant limitations researchers face in developing brain-scale simulations is memory capacity.

The issue stems from the vast number of cellular structures involved and the sheer scale of neuronal connectivity.

Replicating an entire human brain would require far more memory than even today's supercomputers can offer.

Right now, NEST stores all of the data representing neurons and their connections on every compute node, and that is where the memory limitation is encountered.

“Ultimately, the problem is that while the memory needed for the simulation increases linearly with the size of the neural network, memory capacity per node is growing more slowly. More to the point, the number of processor cores per node will increase considerably in exascale supercomputers, but memory per core is essentially going to stay the same,” Michael Feldman of Top500 explained.
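A rough back-of-envelope calculation makes the bottleneck concrete. The entry size and per-node memory below are illustrative assumptions, not figures from the paper:

```python
# In the old scheme, every node keeps a bookkeeping entry for every
# neuron in the network, so per-node memory grows linearly with the
# total network size N while memory per node stays roughly fixed.
BYTES_PER_ENTRY = 8        # assumed size of one per-neuron entry
NODE_MEMORY = 16 * 2**30   # assumed 16 GiB of memory per node

for n_neurons in (10**8, 10**9, 10**10, 10**11):
    per_node = n_neurons * BYTES_PER_ENTRY
    status = "fits" if per_node <= NODE_MEMORY else "exceeds node memory"
    print(f"N = {n_neurons:.0e}: {per_node / 2**30:8.1f} GiB per node ({status})")
```

Under these assumptions, a node can hold the bookkeeping for a network of a billion neurons, but a brain-scale network of 100 billion neurons overwhelms it, no matter how many nodes are added.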

To address this problem, the NEST developers took a different approach in the new simulation technology.

Now, per-node memory no longer needs to grow in step with the size of the neural network.

“Instead, the neural network is pre-processed to determine which neurons are most likely to be interacting with each other, and that information is used to set up the data structures accordingly. As a result, they were able to limit the amount of memory needed on a given node,” Feldman continued.
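The following is a conceptual sketch of that data-structure change, not NEST's actual implementation; the function names and the (source, target) pair representation of connectivity are invented for illustration:

```python
# Old scheme: every node allocates a lookup entry for all neurons in
# the network, so per-node memory is O(N) even if few neurons are local.
def build_tables_old(all_neuron_ids, local_neuron_ids):
    return {nid: [] for nid in all_neuron_ids}

# New scheme: connectivity is pre-processed so a node only allocates
# entries for source neurons that actually target one of its local
# neurons; memory now scales with local connectivity, not network size.
def build_tables_new(connections, local_neuron_ids):
    local = set(local_neuron_ids)
    tables = {}
    for source, target in connections:  # (source id, target id) pairs
        if target in local:
            tables.setdefault(source, []).append(target)
    return tables
```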

The algorithm also reportedly improves the simulation's parallelism, speeding it up dramatically.

In one benchmark, JUQUEEN simulated one second of biological time in a network of 520 million neurons and 5.8 trillion synapses, with the computation time cut from 28.5 minutes to just 5.2 minutes.

The researchers believe that if the new algorithm scales linearly on an exascale supercomputer of similar design, the 5.2-minute runtime could be reduced even further, to around 15 seconds, a major step toward the team's goals.
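For a sense of scale, the arithmetic implied by those figures is straightforward (the timings are the ones reported above; the factors are simple ratios):

```python
# Reported JUQUEEN timings and the projected exascale run
old_run = 28.5 * 60   # seconds, previous algorithm
new_run = 5.2 * 60    # seconds, new algorithm on the same machine
projected = 15.0      # seconds, projected on exascale hardware

print(f"measured speedup on JUQUEEN: {old_run / new_run:.1f}x")    # ~5.5x
print(f"projected additional gain:   {new_run / projected:.1f}x")  # ~20.8x
```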

“The combination of exascale hardware and appropriate software brings investigations of fundamental aspects of brain function, like plasticity and learning unfolding over minutes of biological time, within our reach,” Diesmann was quoted as saying.

Do you believe this new brain simulation technology could pave the way for a new generation of supercomputers?
