Comparing Logic Programming in Radial Basis Function Neural Network (RBFNN) and Hopfield Neural Network
Mamman Mamuda and Saratha Sathasivam
Volume 1, Issue 1, Revised on 30 March 2020, pp. 16-24
Author's Information
Mamman Mamuda1
Corresponding Author
1School of Mathematical Sciences, Universiti Sains Malaysia, 11800 USM, Penang, Malaysia.
maanty123@gmail.com
Saratha Sathasivam2
2School of Mathematical Sciences, Universiti Sains Malaysia, 11800 USM, Penang, Malaysia.
Abstract:-
A neural network is a black box that learns the internal relations of an unknown system. Neural-symbolic systems combine logic programming with artificial neural networks. The radial basis function neural network (RBFNN) and the Hopfield neural network are two well-known and commonly used examples of feedforward and feedback networks, respectively. This study gives an overview of how logic programming is carried out on both networks and compares logic programming on the radial basis function neural network with logic programming on the Hopfield neural network.
Index Terms:-
Radial basis function neural network, Hopfield network, Logic programming
REFERENCES
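The two architectures named in the index terms can be contrasted with a short, generic sketch: an RBF network (feedforward) trained to realise the XOR truth table by least squares, and a Hopfield network (feedback) recalling a stored bipolar pattern from a corrupted cue. These are textbook constructions for illustration only, not the specific logic-programming encodings developed in the paper.

```python
import numpy as np

# --- RBF network (feedforward): fit XOR with Gaussian basis functions ---

def rbf_design(X, centers, sigma=1.0):
    """Gaussian basis activations for every input/center pair."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([0, 1, 1, 0], float)            # XOR truth table
Phi = rbf_design(X, X)                        # centers placed at the data points
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # linear output weights
print(np.round(Phi @ w))                      # recovers [0. 1. 1. 0.]

# --- Hopfield network (feedback): Hebbian storage and recall ---

def train_hopfield(patterns):
    """Hebb-rule weight matrix with zero self-connections."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)
    return W / patterns.shape[0]

def recall(W, state, steps=20):
    """Asynchronous bipolar updates until the state stops changing."""
    state = state.copy()
    for _ in range(steps):
        prev = state.copy()
        for i in range(len(state)):
            state[i] = 1 if W[i] @ state >= 0 else -1
        if np.array_equal(state, prev):
            break
    return state

pattern = np.array([1, -1, 1, -1, 1, -1])
W = train_hopfield(pattern[None, :])
noisy = pattern.copy()
noisy[0] = -noisy[0]                          # corrupt one neuron
print(recall(W, noisy))                       # settles back to the stored pattern
```

The RBF half fixes the hidden layer (centers at the training points) and solves only the output weights, while the Hopfield half iterates a recurrent update until it reaches a stable state; this difference in training versus relaxation dynamics is what the feedforward/feedback distinction in the abstract refers to.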
[1] S. Sathasivam, N. Hamadneh, O. H. Choon, Comparing neural networks: Hopfield network and RBF network, Applied Mathematical Sciences 5 (2011) 3439–3452.
[2] W. W. Abdullah, The connectionist paradigm, Proceedings 1st National Computer Science Conference (1989) 95–111.
[3] W. W. Abdullah, Computations with neural networks and neural-network-like structures, Computational Techniques and Applications: CTAC 87 (1988).
[4] J. Moody, C. J. Darken, Fast learning in networks of locally-tuned processing units, Neural computation 1 (1989) 281–294.
[5] N. Hamadneh, S. Sathasivam, O. H. Choon, Higher order logic programming in radial basis function neural network, Applied Mathematical Sciences 6 (2012) 115–127.
[6] S. Sathasivam, Learning in the recurrent hopfield network, in: Computer Graphics, Imaging and Visualisation, 2008. CGIV’08. Fifth International Conference on, IEEE, pp. 323–328.
[7] S. Noman, S. M. Shamsuddin, A. E. Hassanien, Hybrid learning enhancement of RBF network with particle swarm optimization, in: Foundations of Computational Intelligence, Volume 1, Springer, 2009, pp. 381–397.
[8] M. T. Vakil-Baghmisheh, N. Pavešić, Training RBF networks with selective backpropagation, Neurocomputing 62 (2004) 39–64.
[9] R. Rojas, Neural networks: a systematic introduction, Springer, 1996.
[10] D. S. Broomhead, D. Lowe, Radial basis functions, multi-variable functional interpolation and adaptive networks, Technical Report, DTIC Document, 1988.
[11] A. Idri, A. Zakrani, A. Zahi, Design of radial basis function neural networks for software effort estimation, IJCSI International Journal of Computer Science Issues 7 (2010).
[12] J. J. Hopfield, Neural networks and physical systems with emergent collective computational abilities, Proceedings of the national academy of sciences 79 (1982) 2554–2558.
[13] S. Sathasivam, W. A. T. W. Abdullah, Logic learning in hopfield networks, arXiv preprint arXiv:0804.4075 (2008).
[14] S. Sathasivam, Neuro-symbolic performance comparison, in: Computer Engineering and Applications (ICCEA), 2010 Second International Conference on, volume 1, IEEE, pp. 3–5.
[15] S. Sathasivam, W. A. T. W. Abdullah, Logic mining in neural network: reverse analysis method, Computing 91 (2011) 119–133.
[16] A. J. Storkey, R. Valabregue, The basins of attraction of a new hopfield learning rule, Neural Networks 12 (1999) 869–876.