Neural Networks in the Modern Age

  • Ivan Gushchin, V. N. Karazin Kharkiv National University, 4 Svobody Sq., Kharkiv, Ukraine, 61022 http://orcid.org/0000-0002-1917-716X
  • Volodymyr Kuklin, V. N. Karazin Kharkiv National University, 4 Svobody Sq., Kharkiv, Ukraine, 61022 https://orcid.org/0000-0002-0310-1582
  • Alex Mishyn, V. N. Karazin Kharkiv National University, 4 Svobody Sq., Kharkiv, Ukraine, 61022 http://orcid.org/0000-0001-7478-757X
Keywords: neural networks, perceptron, supervised learning, unsupervised learning, deep learning, error backpropagation, matrix convolution, layer-by-layer learning, gradient descent, activation function

Abstract

The idea of using representations of how the human brain works as mathematical models applicable to a variety of practical tasks is presented. It is shown that the mathematical and physiological models have rather little in common, but that the basic idea of treating neurons as independent computational nodes combined into layers has been developed up to the current state of the technology. The historical background of neural network science is given, and the main researchers whose work has most influenced the direction and pace of the technology's development are mentioned. It is shown that neural networks received varying support from investors over the course of their development, and that each peak of mass interest depended on the emergence of the necessary computing power or of a breakthrough network architecture. In their time such networks were perceptrons, networks with feedback, and networks applying the convolution operation for image analysis and classification. It is shown that so-called deep learning has developed on the basis of weight optimization by gradient descent. A review of known solutions involving supervised learning, feedback networks, and language-based learning networks is given. Generative models appear to be the most promising direction for the development of scientific thought and for the creation of interpretive solutions based on neural networks. It is shown that in supervised learning, which is typical for deep neural networks, training is supplemented by regularization procedures, which help to avoid divergence and ensure error minimization during error backpropagation.
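To make the techniques named in the abstract concrete, the following minimal sketch (not taken from the article) shows a two-layer perceptron trained on the XOR task by gradient descent with error backpropagation, a sigmoid activation function, and an L2 regularization term. All variable names, hyperparameters, and the toy data set are illustrative assumptions.

# Minimal sketch: two-layer perceptron, supervised learning on XOR,
# gradient descent with error backpropagation and L2 regularization.
# Hyperparameters and initialization are assumed for illustration only.
import numpy as np

rng = np.random.default_rng(0)

# Toy supervised-learning task: XOR, which a single-layer perceptron
# cannot solve but a network with one hidden layer can.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Layer weights and biases (small random initialization).
W1 = rng.normal(scale=0.5, size=(2, 4))
b1 = np.zeros((1, 4))
W2 = rng.normal(scale=0.5, size=(4, 1))
b2 = np.zeros((1, 1))

def sigmoid(z):
    # Sigmoid activation function; its derivative s*(1-s) keeps the
    # backpropagation formulas below simple.
    return 1.0 / (1.0 + np.exp(-z))

lr, lam = 0.5, 1e-3  # learning rate and L2 regularization strength (assumed)

for epoch in range(5000):
    # Forward pass: layer-by-layer computation of activations.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Mean squared error plus an L2 penalty on the weights,
    # one common regularization procedure in supervised training.
    loss = np.mean((out - y) ** 2) + lam * (np.sum(W1**2) + np.sum(W2**2))

    # Backward pass (error backpropagation): chain rule, layer by layer.
    d_out = 2 * (out - y) / len(X) * out * (1 - out)
    dW2 = h.T @ d_out + 2 * lam * W2
    db2 = d_out.sum(axis=0, keepdims=True)
    d_h = d_out @ W2.T * h * (1 - h)
    dW1 = X.T @ d_h + 2 * lam * W1
    db1 = d_h.sum(axis=0, keepdims=True)

    # Gradient-descent update of all weights and biases.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(np.round(out, 2))  # predictions should approach [0, 1, 1, 0] after training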


Author Biographies

Ivan Gushchin, V. N. Karazin Kharkiv National University, 4 Svobody Sq., Kharkiv, Ukraine, 61022

Senior Lecturer, Department of Artificial Intelligence and Software

Volodymyr Kuklin, V. N. Karazin Kharkiv National University, 4 Svobody Sq., Kharkiv, Ukraine, 61022

Doctor of Physical and Mathematical Sciences, Professor; Head of the Department of Artificial Intelligence and Software


Published
2021-06-29
How to Cite
Gushchin, I., Kuklin, V., & Mishyn, A. (2021). Neural Networks in the Modern Age. Bulletin of V.N. Karazin Kharkiv National University, Series «Mathematical Modeling. Information Technology. Automated Control Systems», 50, 49-58. https://doi.org/10.26565/2304-6201-2021-50-05
Section
Articles