Control Systems and Computers, N4, 2018, Article 1

DOI: https://doi.org/10.15407/usim.2018.04.003

Upr. sist. maš., 2018, Issue 4 (276), pp. 3-20.

UDC 004.8 + 004.032.26

Goltsev Alexander D., Doctor of Technical Sciences, Head of the Department, E-mail: root@adg.kiev.ua,

Gritsenko Volodymyr I., Corresponding Member of the NAS of Ukraine, Director, E-mail: vig@irtc.org.ua

International Research and Training Center for Information Technologies and Systems of the NAS and MES of Ukraine, Glushkov ave., 40, Kyiv, 03187, Ukraine

Neural Network Technologies in the Problem of Handwriting Recognition

Introduction – the Department of Neural Network Information Processing Technologies of the International Research and Training Center for Information Technologies and Systems is the successor to the Department of Biological and Medical Cybernetics, which was organized by Academician Amosov in 1962. Currently, the goal of the department's research is to develop effective neural network information processing technologies based on computer simulation of the neural organization of the human brain and of the mechanisms of its thinking. The developed neural network technologies are intended for use in solving topical practical problems in the field of Artificial Intelligence.

The purpose of the article is to describe some of the department’s work related to the field of image recognition and classification, in particular, the task of character image recognition.

Methods – the basic definitions, recommendations, and conclusions are based on an analysis of the results of the authors' own research.

Results – based on a series of experiments comparing the LiRA classifier with the modular neural network, the article shows that the latest version of the modular neural network has higher efficiency (recognition ability) than the LiRA classifier, although it is slightly inferior in speed.
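As a purely illustrative sketch of the comparison protocol summarized above: neither the LiRA classifier nor the modular neural network is specified in this abstract, so two generic scikit-learn classifiers are assumed as stand-ins, and the MNIST handwritten digit benchmark is assumed as the data set. The sketch only shows how recognition ability (test accuracy) and speed (training and recognition time) of two classifiers can be measured side by side; it is not the authors' implementation.

```python
# Illustrative sketch only: generic classifiers stand in for the LiRA classifier
# and the modular neural network, neither of which is specified in this abstract.
# The point is the evaluation protocol: accuracy vs. training/recognition time.
import time

from sklearn.datasets import fetch_openml
from sklearn.linear_model import SGDClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# MNIST handwritten digits (assumed data set), pixel values scaled to [0, 1].
X, y = fetch_openml("mnist_784", version=1, return_X_y=True, as_frame=False)
X = X / 255.0
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=10000, random_state=0
)

candidates = {
    "linear stand-in": SGDClassifier(max_iter=20, random_state=0),
    "MLP stand-in": MLPClassifier(hidden_layer_sizes=(256,), max_iter=20,
                                  random_state=0),
}

for name, clf in candidates.items():
    t0 = time.time()
    clf.fit(X_train, y_train)                           # training time
    t_train = time.time() - t0
    t0 = time.time()
    acc = accuracy_score(y_test, clf.predict(X_test))   # recognition ability
    t_test = time.time() - t0
    print(f"{name}: accuracy={acc:.4f}, "
          f"train={t_train:.1f}s, recognize={t_test:.1f}s")
```

Here test accuracy plays the role of the article's "recognition ability", and the two timings play the role of its speed comparison.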

Conclusion – the neurobiological relevance of the LiRA classifier and the modular neural network opens up the possibility of creating, on their basis, intelligent information technologies that function similarly to the human brain.

Download full text! (In Russian).

Keywords: neurons, neural layers, trained connections, LiRA-features, inhibitory connections.

Received 22.10.18