Monday 23 January 2017

Neuromorphic Computing: A Disruptive Technology for High-Performance Computing



Neuromorphic computing is inspired by the functioning of the human brain, which encodes information far more efficiently than present-day computer chips. In the von Neumann architecture-based processors available today, data shuttles between the processor and the storage system for the execution of each instruction, whereas in neuromorphic chips, processing and storage are integrated: each neuron processes a small piece of information and stores it locally. Synapses connect the neurons and carry data between them, enabling neuromorphic chips to function much like a biological brain. This fundamental change in architecture allows massively parallel processing of information, significantly increasing processing speed while keeping power consumption low. 
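The local-state, event-driven behavior described above can be sketched with a toy leaky integrate-and-fire neuron. This is a minimal illustration of the principle, not any vendor's chip API; the class, parameters, and weights here are invented for the example:

```python
# Illustrative sketch (assumed model, not a real chip interface): each
# neuron keeps its membrane potential locally - storage co-located with
# processing - and communicates only via discrete spike events.

class LIFNeuron:
    def __init__(self, threshold=1.0, leak=0.9):
        self.potential = 0.0        # local state, stored "in" the neuron
        self.threshold = threshold  # firing threshold
        self.leak = leak            # per-step decay of the potential

    def step(self, weighted_input):
        """Integrate input, apply leak, and emit a spike on threshold."""
        self.potential = self.potential * self.leak + weighted_input
        if self.potential >= self.threshold:
            self.potential = 0.0    # reset after firing
            return 1                # spike event sent over a "synapse"
        return 0

# Two neurons joined by a synapse with (assumed) weight 0.6: the second
# neuron does work only when a spike event arrives, not on every cycle.
pre, post = LIFNeuron(), LIFNeuron()
spikes = []
for t in range(10):
    s = pre.step(0.5)                  # constant drive to the first neuron
    spikes.append(post.step(0.6 * s))  # event-driven downstream update
```

Even in this toy form, the contrast with the von Neumann pattern is visible: no instruction fetches data from a separate memory, and communication happens only when a spike occurs.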
The end of Moore's law would lead to a surge in neuromorphic computing adoption

In the next few years, Moore's law—the observation that the number of transistors in a dense integrated circuit doubles approximately every two years—is expected to reach its limits: as transistors shrink further, the insulating gaps between components become so small that ICs suffer from problems such as current leakage and overheating. 


These problems would lead to slower performance, higher power consumption, and reduced durability of ICs. The need for an alternative way to increase the computational power of chips has therefore fueled the development of neuromorphic chips, which can be up to 176,000 times more efficient than modern CPUs at running brain-like workloads. Another important factor is that shrinking transistors further would sharply increase manufacturing cost. As a result, researchers are evaluating different approaches to building large-scale computational models inspired by biological principles. 

“The neuromorphic computing market, by offering, is expected to be valued at USD 6,591.1 thousand in 2016 and is likely to reach USD 272,915.7 thousand by 2022, at a CAGR of 86.0% between 2016 and 2022,” says Sachin Garg, who tracks the global semiconductor market at research firm MarketsandMarkets.

Ongoing Developments in Neuromorphic Computing

Various development projects have been undertaken by companies and research organizations to build neuromorphic computing systems. Engineers at Stanford University (U.S.) developed a circuit board, Neurogrid, which simulates one million neurons connected by six billion synapses in structured patterns and consumes 100,000 times less power than a supercomputer running a comparable simulation.  

Freie Universität Berlin, in collaboration with the University of Bielefeld, the Kirchhoff Institute for Physics, and the University of Heidelberg, is designing a neuromorphic chip and software modeled on insects' odor-processing systems and developing it to recognize plant species by their flowers. The BrainScaleS neuromorphic system was developed at the University of Heidelberg (Germany) through a collaboration of 19 research groups from 10 European countries, funded by the European Union. SpiNNaker, a parallel low-power neuromorphic supercomputer, was built at the University of Manchester in the U.K. and was funded by the U.K. government until early 2014.  
IBM Corporation (U.S.), HP Enterprise (U.S.), Samsung Electronics Limited (South Korea), Intel Corp. (U.S.), HRL Laboratories, LLC (U.S.), General Vision Inc. (U.S.), Applied Brain Research, Inc. (U.S.), and BrainChip Holdings Ltd. (U.S.) are some of the major companies in the neuromorphic computing market. In 2014, IBM (U.S.) unveiled TrueNorth, a custom-made, brain-like chip that builds on a simpler experimental system demonstrated in 2011.

TrueNorth is equipped with 4,096 processor cores and replicates one million neurons and 256 million synapses—two of the fundamental biological building blocks of the human brain. At 5.4 billion transistors, it is the largest chip IBM has ever built, with an on-chip network of 4,096 neurosynaptic cores. Samsung Electronics (South Korea) is using IBM's TrueNorth chip in its machine-vision project to develop image processors. HRL Laboratories (U.S.) is testing its neuromorphic chips in drones for surveillance and target-tracking applications. The Institute of Neuroinformatics (Switzerland) has developed neuromorphic vision sensors, a silicon cochlea, and medium-scale neuromorphic processors such as the Reconfigurable On-Line Learning Spiking (ROLLS) and cxQuad chips, which use sub-threshold analog circuits. 
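The chip-level totals quoted above can be checked against the per-core figures widely reported for TrueNorth (256 neurons and a 256×256 synapse crossbar per core — an assumption not stated in this article):

```python
# Back-of-the-envelope check of the TrueNorth totals. The per-core
# figures are assumed from commonly reported specs, not from this text.
cores = 4096
neurons_per_core = 256         # assumed neurons per neurosynaptic core
synapses_per_core = 256 * 256  # assumed 256x256 crossbar per core

total_neurons = cores * neurons_per_core    # 1,048,576 - "one million"
total_synapses = cores * synapses_per_core  # 268,435,456 - "256 million"
                                            # (256 x 2**20, binary millions)
```

Under those assumptions, the arithmetic reproduces both headline numbers: roughly one million neurons and 256 million (binary) synapses across the 4,096 cores.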

In May 2016, General Vision released a new version of its NeuroMem API, compatible with two commercial chips featuring its neural network technology in silicon: (1) General Vision's CM1K chip and (2) Intel's new Curie module.

Adoption of Neuromorphic Computing by Government Agencies and Institutions

The U.S. government is increasing its support for new paradigms, including neuromorphic and quantum computing, to maintain its position as the leader in high-performance computing. In line with this, Lawrence Livermore National Laboratory (LLNL) (California, U.S.) has purchased a first-of-a-kind brain-inspired supercomputing platform for deep learning developed by IBM. Based on IBM's neurosynaptic chip TrueNorth, the platform processes the equivalent of 16 million neurons and 4 billion synapses while consuming a mere 2.5 watts of power. LLNL will use the new system to explore computing capabilities in cybersecurity, help manage the U.S. nuclear weapons stockpile, and support the monitoring of nuclear arms agreements worldwide.

The Defense Advanced Research Projects Agency funded HRL Laboratories' Center for Neural and Emergent Systems to develop a chip with 576 silicon "neurons" for use in a drone aircraft. IBM Watson, a technology platform that uses natural language processing (NLP) and machine learning to reveal insights from large amounts of unstructured data, has initiated partnerships with several major cancer institutes and clinics to derive personalized insights from cancer patients' deoxyribonucleic acid (DNA). The European Union, the University of Heidelberg, and the University of Manchester have invested more than USD 1 billion in the Human Brain Project (HBP), which aims to reconstruct the complex functioning of the human brain and simulate it using parallel architectures.  

Emerging Applications of Neuromorphic Computing

As an immediate application, neuromorphic chips can be integrated with the CPU to enhance its pattern-recognition capabilities. Neurons connected in parallel can perform operations such as image classification, speech recognition, and data mining at greater speed than a conventional CPU, thereby accelerating overall pattern-recognition tasks while relieving load on the CPU. In 2012, IBM Sequoia, a supercomputer built on the von Neumann architecture, simulated a brain using 500 billion neurons and 100 trillion synapses. The supercomputer could simulate brain functioning at only 1/1,500th of actual speed while consuming 12 GW of power; a similar simulation using neuromorphic chips would require just 35 kW. 
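Taking the quoted power figures at face value, the implied saving can be computed directly (a rough ratio only; both numbers come from the article, not an independent measurement):

```python
# Rough power comparison using the figures quoted above.
von_neumann_power_w = 12e9   # 12 GW quoted for the Sequoia-based simulation
neuromorphic_power_w = 35e3  # 35 kW quoted for a neuromorphic equivalent

ratio = von_neumann_power_w / neuromorphic_power_w
# roughly a 340,000x reduction in power, per the quoted figures
```

That ratio of roughly 340,000× is what motivates the claim that neuromorphic hardware could make brain-scale simulation practical where von Neumann machines cannot.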

The new NeuroMem API is available for the Intel Arduino/Genuino 101 and for General Vision's BrainCard. In August 2015, IBM developed "rodent brain" chips designed with 48 million nerve cells, approximately the number of neurons in a rodent's brain. These chips are used to identify images, recognize spoken words, and understand natural language. Neuromorphic chips execute instructions rapidly and consume little power. They are highly efficient at pattern recognition, which is why they can be used for computational applications in verticals such as military and defense and information technology (IT), among many others. These chips can be integrated into drones, smartphones, automobiles, search engines, and climate-prediction equipment, among others. 

Using pattern recognition, a drone can identify the region it is flying over and respond accordingly. In a smartphone, pattern recognition helps in capturing and tagging images based on the objects captured and in recognizing an individual's voice. In the automotive sector, pattern recognition can be used to sense the condition or position of a driver and adjust the driver's seat automatically for optimum comfort. Moreover, pattern-recognition capabilities can be used in search engines to enhance performance, to detect fraud in financial markets, and in climate-prediction equipment to anticipate climatic conditions more accurately. 

Contact:
Mr. Sachin
MarketsandMarkets
Email ID: sachin.garg@marketsandmarkets.com

For more information, visit:
Neuromorphic Computing Market by Application - 2022
