Abstract
With an exponential increase in the amount of data collected per day, the fields of artificial intelligence and machine learning continue to progress rapidly with respect to algorithms, models, applications, and hardware. In particular, deep neural networks have revolutionized these fields by providing unprecedented, human-like performance on many real-world problems such as image and speech recognition. There is also significant research aimed at unraveling the principles of computation in large biological neural networks and, in particular, in biologically plausible spiking neural networks. This article presents an overview of brain-inspired computing models, starting with the development of the perceptron and multi-layer perceptron, followed by convolutional neural networks (CNNs) and recurrent neural networks (RNNs). It then briefly reviews other neural network models, including Hopfield neural networks and Boltzmann machines, as well as spiking neural networks (SNNs) and hyperdimensional computing. Finally, recent advances in these neural networks and in graph neural networks are described.
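As a concrete illustration of the starting point of this overview, the sketch below shows a minimal perceptron trained with the classic perceptron learning rule on a toy linearly separable problem (logical AND). This is an illustrative sketch, not code from the article; the data, learning rate, and epoch count are assumptions chosen for brevity.

```python
import numpy as np

# Minimal perceptron sketch (illustrative only, not from the article).
# Binary classifier: y = step(w . x + b), trained with the classic
# perceptron learning rule on the linearly separable AND function.

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])  # toy inputs (assumed)
y = np.array([0, 0, 0, 1])                      # AND labels

w = np.zeros(2)   # weights
b = 0.0           # bias
lr = 0.1          # learning rate (assumed)

for epoch in range(20):                 # epoch count chosen arbitrarily
    for xi, target in zip(X, y):
        pred = 1 if np.dot(w, xi) + b > 0 else 0
        # Perceptron update: nudge the decision boundary toward
        # misclassified points; no change when the prediction is correct.
        w += lr * (target - pred) * xi
        b += lr * (target - pred)

print(w, b)  # learned weights and bias separating the AND classes
```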
| Original language | English (US) |
| --- | --- |
| Article number | 9238476 |
| Pages (from-to) | 185-204 |
| Number of pages | 20 |
| Journal | IEEE Open Journal of Circuits and Systems |
| Volume | 1 |
| DOIs | |
| State | Published - 2020 |
Bibliographical note
Publisher Copyright: © 2020 IEEE.
Keywords
- Boltzmann machines
- Hopfield neural network
- perceptron
- convolutional neural network
- graph neural networks
- hyperdimensional computing
- multi-layer perceptron
- recurrent neural network
- spiking neural networks