Each Role of Short-term and Long-term Memory in Neural Networks
American Journal of Neural Networks and Applications
Volume 6, Issue 1, June 2020, Pages: 10-15
Received: Feb. 20, 2020; Accepted: Mar. 10, 2020; Published: Mar. 24, 2020
Author
Seisuke Yanagawa, OptID, Machida, Tokyo, Japan
Abstract
Based on known findings in neuroscience, a neural network that performs serial-parallel conversion and its inverse transformation is presented. By connecting these networks hierarchically, an upper-level neural network is constructed that can process general time-series data. The activity of the upper networks changes in response to the contextual structure inherent in the time-series data, and the resulting system can both accept and generate general time series. Eating behavior in animals at early stages of evolution is also a form of time-series processing, and by learning the contextual structure inherent in the data, an animal can predict behavior, although only over the short term. This function corresponds to so-called short-term memory, and the transition of the activated portion of the network during this kind of operation is illustrated. While the state of an animal's nervous system changes according to what its sensory organs recognize and how its muscles manipulate nearby objects, more highly evolved animals possess an additional nervous system, so-called long-term or episodic memory, which involves experience and prediction. The long-term memory system operates freely while remaining consistent with changes in the environment. Through the workings of long-term memory, large amounts of information are exchanged between individuals, and in human society large amounts of time-series data are preserved in writing. In this paper, a model of the transfer of data between these different nervous systems is presented using concepts from category theory.
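The serial-parallel conversion and its inverse transformation described above can be sketched minimally as buffering a serial time series into fixed-width parallel frames and flattening them back. This is an illustrative sketch only; the function names, frame width, and list-based representation are assumptions for exposition, not the network mechanism presented in the paper.

```python
def serial_to_parallel(stream, width):
    """Buffer a serial stream into fixed-width parallel frames."""
    frames, buf = [], []
    for x in stream:
        buf.append(x)
        if len(buf) == width:
            frames.append(tuple(buf))
            buf = []
    return frames

def parallel_to_serial(frames):
    """Inverse transformation: flatten parallel frames back to a serial stream."""
    return [x for frame in frames for x in frame]

seq = [1, 2, 3, 4, 5, 6]
frames = serial_to_parallel(seq, 3)
print(frames)                          # [(1, 2, 3), (4, 5, 6)]
print(parallel_to_serial(frames) == seq)  # True
```

Hierarchical connection, in this toy view, would amount to feeding the parallel frames of one level as the serial input of the next, so that higher levels respond to longer-range contextual structure.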
Keywords
Short-term Memory, Long-term Memory, Serial Parallel Conversion, Parallel Serial Conversion, Mirror Neuron, Prediction, Category Theory
To cite this article
Seisuke Yanagawa, Each Role of Short-term and Long-term Memory in Neural Networks, American Journal of Neural Networks and Applications, Vol. 6, No. 1, 2020, pp. 10-15. doi: 10.11648/j.ajnna.20200601.12
Copyright
Copyright © 2020 Authors retain the copyright of this article.
This article is an open access article distributed under the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/) which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.