Neuromorphic chips are becoming mainstream

One approach to reducing the excessive power consumption of artificial intelligence chips is to imitate the way biological neurons work more closely than mainstream artificial neural networks do. This approach is called “neuromorphic” computing.

Artificial intelligence has entered our daily lives. Despite its great success, the artificial neural networks that dominate the field still demand enormous amounts of computation and energy. The human brain, whose intelligence still far exceeds that of artificial intelligence, consumes power only on the order of 10 W, far less than the several kilowatts drawn by a server-side AI chipset. This low energy efficiency has become a serious obstacle to deploying artificial intelligence at the edge, where power budgets are much tighter.

One idea for solving the excessive power consumption of AI chips is to imitate biological neurons more faithfully, the approach known as “neuromorphic” computing. A biological neuron continuously receives electrical impulses from the synapses of other neurons and accumulates charge. When the accumulated charge exceeds a threshold, the neuron discharges, firing an impulse that travels through its synapses to the next neurons. Energy is consumed only when a neuron fires (i.e., is active); while it waits, it sits in a very low-power state.
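The accumulate-then-fire behavior described above is commonly modeled as a leaky integrate-and-fire (LIF) neuron. The sketch below is purely illustrative; the threshold, leak, and weight values are hypothetical, chosen only for readability.

```python
# Illustrative leaky integrate-and-fire (LIF) neuron: a simplified model of
# the accumulate-charge-then-spike behavior described in the text.
# All parameter values are hypothetical.

def simulate_lif(input_spikes, threshold=1.0, leak=0.9, weight=0.3):
    """Return the output spike train of a single LIF neuron.

    input_spikes: sequence of 0/1 values, one per time step.
    """
    potential = 0.0          # accumulated "charge" (membrane potential)
    output = []
    for spike in input_spikes:
        potential = potential * leak + weight * spike  # leak, then integrate
        if potential >= threshold:                     # threshold crossed:
            output.append(1)                           # fire a spike...
            potential = 0.0                            # ...and reset
        else:
            output.append(0)                           # stay silent (cheap)
    return output

# A sustained burst of input is needed before the neuron fires once.
train = simulate_lif([1, 1, 1, 0, 1, 1, 1, 1, 0, 0])
```

Note that the neuron does work (fires) only after enough input has accumulated; at all other time steps it remains quiet, which is the property neuromorphic hardware exploits for low power.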


Neuromorphic chips apply this idea, realizing neuromorphic computation in integrated-circuit form. Among earlier neuromorphic chips, the best known are probably IBM’s TrueNorth and Intel’s Loihi. Both focus on large-scale neuromorphic computing, aimed mainly at neuroscience research and related fields. Beyond these research-oriented chips from giants such as Intel and IBM, neuromorphic chips are now finding commercial applications at the edge thanks to their high computational energy efficiency.


Edge neuromorphic chips usually adopt a mixed-signal circuit design: each synapse transmits impulse signals as analog charge, while receiving impulses and updating the internal state of neurons can be handled by digital circuits. As described above, a neuron activates only after it has accumulated enough charge, so most of the time the chip sits in a low-power standby state, greatly reducing power consumption. Put another way, a neuromorphic chip performs processing only when it detects a meaningful event, which is why it can also be called an “event-driven processing” chip.
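The energy argument for event-driven processing can be made concrete with a toy accounting model. The cost constants below are hypothetical, chosen only to illustrate why processing on meaningful changes beats processing every sample at a fixed rate.

```python
# Sketch of the "event-driven processing" idea: instead of running the full
# pipeline at a fixed rate, work is done only when a meaningful event occurs.
# The energy numbers are hypothetical, purely to illustrate the accounting.

ACTIVE_COST = 100.0   # energy units per processed sample (hypothetical)
IDLE_COST = 0.1       # energy units per idle time step (hypothetical)

def fixed_rate_energy(samples):
    """Conventional pipeline: every time step is fully processed."""
    return ACTIVE_COST * len(samples)

def event_driven_energy(samples, threshold=0.5):
    """Event-driven pipeline: process only steps whose input changes enough."""
    energy, last = 0.0, 0.0
    for s in samples:
        if abs(s - last) > threshold:   # a meaningful event occurred
            energy += ACTIVE_COST       # wake up and process it
            last = s
        else:
            energy += IDLE_COST         # stay in low-power standby
    return energy

# A mostly static input: 99 unchanged samples, then one change.
signal = [0.0] * 99 + [1.0]
```

For this mostly static signal the fixed-rate pipeline spends 10,000 units while the event-driven one spends about 110, the same orders-of-magnitude gap the article attributes to neuromorphic hardware.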

In terms of concrete design, the edge neuromorphic chips in use today fall into two categories: neuromorphic computing chips, which use spiking neural networks for signal processing, and neuromorphic vision chips, which pair an event-driven dynamic vision sensor (DVS) with a processing chip that wakes up only after the DVS detects an event. Both kinds of chips have attracted considerable attention at the edge.
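A DVS emits a sparse stream of events rather than full frames; each event is commonly described as a tuple of pixel coordinates, timestamp, and polarity. The toy function below derives such an event stream from two frames to illustrate the format; the frame data and threshold are invented for the example.

```python
# A DVS outputs events of the form (x, y, timestamp, polarity) instead of
# frames. This toy function derives such a stream from two brightness
# frames; real sensors do this per pixel in hardware. Values are invented.

def frames_to_events(prev, curr, t, threshold=10):
    """Emit an event for each pixel whose brightness changed enough."""
    events = []
    for y, (row_p, row_c) in enumerate(zip(prev, curr)):
        for x, (p, c) in enumerate(zip(row_p, row_c)):
            if c - p > threshold:
                events.append((x, y, t, +1))   # brightness increased
            elif p - c > threshold:
                events.append((x, y, t, -1))   # brightness decreased
    return events

prev = [[100, 100], [100, 100]]
curr = [[100, 150], [80, 100]]
# Only the two changed pixels generate events; static pixels produce nothing,
# so the downstream processing chip can stay asleep in a static scene.
events = frames_to_events(prev, curr, t=5)
```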

Application scenarios of neuromorphic chips

The first major application scenario for neuromorphic chips at the edge is low-power machine vision. Their key advantage here is extremely low power consumption: traditional AI chips based on digital circuits typically draw power on the order of milliwatts, while neuromorphic computing chips can operate on the order of microwatts, several orders of magnitude lower. As a result, a neuromorphic computing chip can deliver always-on machine vision powered by a button battery.
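The button-battery claim can be sanity-checked with back-of-the-envelope arithmetic. The sketch assumes a CR2032 coin cell with roughly 225 mAh at 3 V (typical datasheet figures) and a constant power draw, ignoring self-discharge and peak-current limits.

```python
# Back-of-the-envelope check of the "always-on from a coin cell" claim.
# Assumed cell: CR2032, roughly 225 mAh at 3 V (typical datasheet values).
# Ignores self-discharge and peak-current limits; constant draw assumed.

CAPACITY_WH = 0.225 * 3.0            # ~0.675 Wh of stored energy

def runtime_days(power_watts):
    """Continuous runtime in days at a constant power draw."""
    hours = CAPACITY_WH / power_watts
    return hours / 24.0

print(runtime_days(1e-3))            # about 28 days at 1 mW
print(runtime_days(10e-6))           # about 2800 days (7+ years) at 10 uW
```

A milliwatt-class chip drains the cell in about a month, while a microwatt-class neuromorphic chip runs for years on the same cell, which is what makes genuinely always-on vision practical.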

Event-driven ultra-high-speed machine vision is another important application scenario for neuromorphic vision chips. Unlike traditional fixed-frame-rate vision sensors, a neuromorphic vision chip samples at high frequency only when an event occurs, so it can achieve equivalent frame rates of thousands of fps, making it suitable for high-sampling-rate scenarios such as autonomous driving and machine-operation monitoring. Moreover, combining a neuromorphic vision chip with a neuromorphic computing chip yields ultra-low-latency, high-quality processing when an event occurs and extremely low power consumption when there is none, achieving high performance and ultra-low power at the same time. The Chinese neuromorphic chip start-up SynSense recently released Speck, exactly such an SoC: it integrates a neuromorphic vision chip and a neuromorphic computing chip to deliver sub-milliwatt, low-latency machine vision at the edge, with application scenarios covering important IoT fields such as smart homes, smart security, and industrial monitoring.


Beyond machine vision, time-series analysis is another important application scenario for neuromorphic computing chips. Time-series signals such as speech and physiological signals (for example, cardiac ECG) are highly sparse in time, so spiking neural networks can process them with high efficiency, and chips based on them inherit that efficiency. Wearable products (such as TWS wireless earbuds) have strong demand for this kind of low-power time-series processing; as the wearable market booms, entering these markets has become a very promising direction for neuromorphic chips.
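The temporal sparsity that makes spiking networks efficient on such signals can be illustrated with a simple delta-modulation encoder, one common way to turn a slowly varying signal into spikes. The step size and signals below are invented for the example.

```python
# Time series such as speech or ECG change slowly most of the time, so an
# event-driven (spiking) encoding is sparse: spikes are emitted only when
# the signal crosses a new level. Illustrative delta-modulation sketch.

def delta_encode(signal, step=0.5):
    """Emit (index, +1/-1) spikes whenever the signal moves by one step."""
    spikes, ref = [], signal[0]
    for i, v in enumerate(signal[1:], start=1):
        while v - ref >= step:        # signal rose past the next level
            spikes.append((i, +1))
            ref += step
        while ref - v >= step:        # signal fell past the next level
            spikes.append((i, -1))
            ref -= step
    return spikes

flat = [0.0] * 50                      # a quiet stretch of signal
assert delta_encode(flat) == []        # no activity -> no spikes, no work
```

Quiet stretches of the signal generate no spikes at all, so downstream spiking hardware does no work during them; a conventional fixed-rate pipeline would process all 50 samples regardless.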

The competitive landscape of neuromorphic chips

Developing neuromorphic chips requires deep integration of algorithms, systems, and circuits. Because neuromorphic algorithms differ substantially from mainstream ones, considerable accumulated expertise in algorithm development is needed. At the same time, edge neuromorphic chips pursue high energy efficiency and therefore commonly use mixed-signal circuits, whose development also demands strong circuit-design skills. In short, a successful neuromorphic chip product requires years of in-depth accumulation in both neuromorphic circuits and algorithms.

BrainChip in the United States is arguably the oldest neuromorphic chip company: founded in 2004, it has been operating for more than fifteen years. Its current products cover the ultra-low-power chip market from microwatts to milliwatts, in forms ranging from neuromorphic chip IP to SoCs.

Europe is also a stronghold of neuromorphic research. In particular, the Institute of Neuroinformatics (INI), co-founded by ETH Zurich and the University of Zurich, is an authority on neuromorphic chips and algorithms, and companies incubated there hold important positions in the edge neuromorphic chip field. For example, the recently prominent Prophesee was founded by an INI alumnus. Its main product is a neuromorphic vision chip that can provide ultra-high frame rates above 10,000 fps at a power consumption below 10 mW, and the company has received financial backing from industry giants and well-known investors such as Intel, Xiaomi, and Innovation Works.

Compared with their overseas counterparts, China’s edge neuromorphic chip companies are not far behind. For example, SynSense (Shishi Technology), mentioned above, was founded by Dr. Qiao Ning, a returnee scholar from the Swiss Institute of Neuroinformatics; after securing industry and capital support from Baidu, Merck, and Zhongke Chuangxing, it took the lead in launching Speck, a combined neuromorphic vision and computing chip. In neuromorphic vision chips, Xinlun Optoelectronics, founded by Professor Chen Shoushun of Nanyang Technological University in Singapore, launched a world-leading neuromorphic vision chip several years ago and has since become a subsidiary of Weir Shares, one of China’s leading semiconductor companies. Since China is a global leader in edge machine vision (smart homes, smart security, and so on) and wearable electronics, we believe it will also become the most promising market for edge neuromorphic chips, and as edge neuromorphic applications become mainstream, we expect these Chinese companies to shine in this market.

*Disclaimer: This article is original content by its author, and its contents reflect the author’s personal views. Semiconductor Industry Observation reprints it only to convey a different point of view; reprinting does not mean that Semiconductor Industry Observation agrees with or supports these views. If you have any objections, please contact Semiconductor Industry Observation.

Posted by: CoinYuppie. Reprinted with attribution to: https://coinyuppie.com/neuromorphic-chips-are-becoming-mainstream/
