
Neuromorphic Computing: The Next-Level Artificial Intelligence

May 27, 2020

Can AI function like a human brain? Researchers have been asking this question for decades, and their long pursuit has drawn doubt, ridicule, and scorn along the way.

But now, armed with Neuromorphic Computing, they are ready to show the world that their dream can change it for the better. As the benefits come to light, the success of our machine learning and AI ambitions seems to depend, to a great extent, on the success of Neuromorphic Computing.

How Neuromorphic Computing Can Help Contemporary AI

The technologies of the future, like autonomous vehicles and robots, will need to access and use enormous amounts of data and information in real time. Today, to a limited extent, this is done by machine learning and AI systems that depend on supercomputer power. But these needs keep growing, and speed, power, and size are emerging as the prime impediments.

Neuromorphic Computing chips can process many inputs in parallel and learn tasks and patterns at high speed. These chips are expected to consume far less power (up to 1,000 times less) while working with the efficiency of supercomputers.

Neuromorphic Computing chips, a crucial upgrade over traditional systems, are compact, portable, and energy-efficient. They are the perfect sidekick that ML and AI models need.

Researchers know this, and they are leaving no stone unturned. While a few are getting as literal as copying the physical form of the human brain, others are trying to replicate its function. It is the latter approach we are optimistic about, as it is expected to succeed Gordon Moore’s groundbreaking idea of “packing transistors onto substrates”.

Neuromorphic computing involves building and using neural networks that function like a human brain: making decisions, memorizing information, and analyzing facts. It “demonstrates an unprecedented low-power computation substrate that can be used in many applications.” - IBM neuromorphic patent application

Recent Developments

Though many innovators are driven to the cause, a few are leading the race. Here’s a look at important developments:

Intel’s Loihi - The Future of GPUs

A 14-nanometer chip with over 2 billion transistors and three managing Lakemont cores. “It contains a programmable microcode engine for on-chip training of asynchronous spiking neural networks (SNNs). In total, it packs 128 cores. Each core has a built-in learning module, and the chip holds around 131,000 computational ‘neurons’ that communicate with one another, allowing the chip to understand stimuli.”
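The spiking neurons mentioned above differ from the units in conventional deep learning networks: they accumulate input over time and fire discrete spikes only when a threshold is crossed, which is why such chips can stay idle (and save power) when nothing interesting is happening. The following is a minimal sketch of a leaky integrate-and-fire (LIF) neuron, the most common model in SNN work. It is illustrative only, not Intel's actual neuron model, and every parameter value is an arbitrary choice for the demo.

```python
def simulate_lif(input_current, v_rest=0.0, v_thresh=1.0, leak=0.9, dt=1.0):
    """Simulate one leaky integrate-and-fire neuron.

    input_current: sequence of input samples, one per time step.
    Returns the list of time steps at which the neuron spiked.
    All parameter values are illustrative, not taken from any real chip.
    """
    v = v_rest          # membrane potential
    spikes = []
    for t, i_in in enumerate(input_current):
        v = leak * v + i_in * dt   # leaky integration: old charge decays, input adds
        if v >= v_thresh:          # threshold crossed: emit a spike...
            spikes.append(t)
            v = v_rest             # ...and reset the membrane potential
    return spikes

# A constant weak input drives the neuron to fire at a regular rate;
# no input at all would produce no spikes and (on hardware) no activity.
print(simulate_lif([0.2] * 50))
```

The energy argument follows from this event-driven behavior: computation and communication happen only at spikes, so sparse activity translates directly into low power draw.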

Loihi can identify ten hazardous chemicals by their smell, faster than sniffer dogs. It can also detect toxic fumes and signs of disease around it, and it can re-wire itself to facilitate different forms of learning.

In the future, it is expected to learn from experience and make decisions on its own. The icing on the cake: it uses a fraction of the energy and is expected to replace GPUs.

IBM’s TrueNorth - The Hercules of Transistor Count

It has 4,096 cores and is built on Samsung’s 28 nm process with 5.4 billion transistors, making it IBM’s largest chip by transistor count. It uses less than 100 mW of power while simulating complex recurrent neural networks, with a power density of 20 mW/cm².

TrueNorth’s architecture can address the problems of “vision, audition, and multi-sensory fusion, and has the potential to revolutionize the computer industry by integrating brain-like capability into devices where computation is constrained by power and speed.”

IBM says it can efficiently process “high-dimensional, noisy sensory data in real-time”, while consuming far less power than a conventional computer.

MIT’s Brain on a Chip

A chip built from silicon germanium, with artificial synapses modeled on the “more than 100 trillion synapses that mediate neuron signaling in the brain”. In one simulation it recognized samples of human handwriting with 95 percent accuracy. It could be used in humanoid robots and autonomous-driving technology.

Qualcomm’s Zeroth Processors

Working toward three main goals of “biologically inspired learning; enabling devices to see and perceive the world as humans do; and creating and defining the Neural Processing Unit (NPU)”, Qualcomm is developing a new computer architecture that breaks the traditional mold.

The Road Ahead

Neuromorphic computing can greatly impact the future of machine learning and AI. “These new kinds of chips should dramatically increase the use of machine learning, enabling applications to consume less power and at the same time become more responsive.” - Deloitte market analysis

With Neuromorphic Computing at its side, the future of AI sure looks bright.

Trivia:

Did you know?

  • Neuromorphic Computing is the 5th generation of AI.

  • The 1st generation AI defined rules and followed classical logic to arrive at conclusions within a specific, narrowly outlined problem domain.

  • The 2nd generation AI used deep learning networks to analyze inputs and was focused on sensing and perception.

  • The 3rd generation AI interpreted and adapted like the human thought process.

  • The 4th generation AI used a mix of different machine learning algorithms and other forms of Artificial Intelligence to achieve its goal or mission.
