Boltzmann Machine and Hyperbolic Activation Function in Higher Order Neuro Symbolic Integration
Muraly Velavan, Zainor Ridzuan bin Yahya, Mohamad Nazri bin Abdul Halif, Saratha Sathasivam
Volume 1, Issue 2, Revised on 30 June 2020, pp. 63-69
Author's Information
Muraly Velavan1
Corresponding Author
1Institute of Engineering Mathematics, Universiti Malaysia Perlis, 02600 Arau, Perlis, Malaysia
dsmuraly@yahoo.com
Zainor Ridzuan bin Yahya1
1Institute of Engineering Mathematics, Universiti Malaysia Perlis, 02600 Arau, Perlis, Malaysia
Mohamad Nazri bin Abdul Halif2
2School of Microelectronic Engineering, Universiti Malaysia Perlis, 02600 Arau, Perlis, Malaysia
Saratha Sathasivam3
3School of Mathematical Sciences, Universiti Sains Malaysia Penang, 11800 USM, Malaysia
Abstract:-
Higher-order network structure is important in higher-order logic programming because higher-order neural networks converge faster and have greater memory and storage capacity. Higher-order networks also have stronger approximation ability and are more robust than lower-order neural networks. This paper therefore focuses on higher-order clauses for logic programming in Hopfield networks. We limit the study to fifth-order networks due to complexity issues. We employ Boltzmann Machines and the hyperbolic tangent activation function to increase the performance of neuro-symbolic integration, and we use agent-based modelling to model this problem.
Index Terms:-
Boltzmann machine, agent-based modelling, hyperbolic tangent activation function
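As a minimal illustrative sketch (not the authors' implementation), a higher-order Hopfield-style unit update with the hyperbolic tangent activation can be written as below; the weight tensors `W2` and `W3`, the gain parameter `beta`, and the function name `update_unit` are assumptions introduced here for illustration only:

```python
import numpy as np

def update_unit(i, s, W2, W3, beta=1.0):
    """Illustrative update of unit i: tanh of a local field that combines
    a second-order and a third-order synaptic contribution (hypothetical)."""
    # Second-order contribution: sum_j W2[i, j] * s[j]
    h = W2[i] @ s
    # Third-order contribution: sum_{j,k} W3[i, j, k] * s[j] * s[k]
    h += s @ W3[i] @ s
    # Hyperbolic tangent activation squashes the local field into (-1, 1)
    return np.tanh(beta * h)

rng = np.random.default_rng(0)
n = 5
s = np.sign(rng.standard_normal(n))   # bipolar states in {-1, +1}
W2 = rng.standard_normal((n, n))      # second-order weights (hypothetical)
W3 = rng.standard_normal((n, n, n))   # third-order weights (hypothetical)
print(update_unit(0, s, W2, W3))      # a value strictly inside (-1, 1)
```

Because tanh is bounded, the updated state stays in (-1, 1) regardless of the order of the synaptic terms, which is one reason the hyperbolic tangent is a common choice of activation in such networks.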