An Overview of Hopfield Network and Boltzmann Machine
Saratha Sathasivam and Abdu Masanawa Sagir
Volume 1, Issue 1, Revised on 30 March 2020, pp. 25-34
Authors' Information
Saratha Sathasivam1
Corresponding Author
1School of Mathematical Sciences, Universiti Sains Malaysia, 11800 USM, Penang, Malaysia.
saratha@usm.my
Abdu Masanawa Sagir2
2School of Mathematical Sciences, Universiti Sains Malaysia, 11800 USM, Penang, Malaysia.
Abstract:-
Neural networks are dynamic systems in the learning and training phases of their operation. The two well-known and commonly used types of recurrent neural networks, the Hopfield neural network and the Boltzmann machine, have different structures and characteristics. This study gives an overview of the Hopfield network and the Boltzmann machine in terms of their architectures and learning algorithms, compares the two networks from several different aspects, and discusses their applications.
Index Terms:-
Artificial neural network, Boltzmann machine, Hopfield network, learning algorithms
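The paper itself surveys these models in detail; as a rough, illustrative sketch of the kind of network under discussion, the Python snippet below implements a discrete Hopfield network with Hebbian weight storage and asynchronous recall. The function names and the toy patterns are purely hypothetical and are not taken from the paper.

import numpy as np

# Minimal discrete Hopfield network sketch (illustrative only): bipolar
# patterns are stored with the Hebbian rule and recalled by repeated
# asynchronous threshold updates.

def train_hebbian(patterns):
    """Build a symmetric weight matrix from bipolar (+1/-1) patterns."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)              # no self-connections
    return W / len(patterns)

def recall(W, state, steps=100, seed=0):
    """Asynchronously update randomly chosen units for a fixed number of steps."""
    rng = np.random.default_rng(seed)
    s = state.copy()
    for _ in range(steps):
        i = rng.integers(len(s))        # pick one unit at random
        s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Toy usage: store one pattern, then recover it from a corrupted copy.
stored = np.array([[1, -1, 1, -1, 1, -1]])
W = train_hebbian(stored)
noisy = np.array([1, 1, 1, -1, 1, -1])  # second bit flipped
print(recall(W, noisy))                 # expected: [ 1 -1  1 -1  1 -1]

A Boltzmann machine differs mainly in that its unit updates are stochastic rather than deterministic, governed by a temperature parameter.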