Optimizing Back-Propagation Gradient for Classification by an Artificial Neural Network
American Journal of Physics and Applications
Volume 2, Issue 4, July 2014, Pages: 88-94
Received: Jul. 13, 2014; Accepted: Jul. 28, 2014; Published: Aug. 10, 2014
Authors
Said El Yamani, Optronic and Information Treatment Team, Atomic, Mechanical, Photonic and Energy Laboratory, Faculty of Science, Moulay Ismail University, B. P. 11201 Zitoune, Meknès, Morocco
Samir Zeriouh, Optronic and Information Treatment Team, Atomic, Mechanical, Photonic and Energy Laboratory, Faculty of Science, Moulay Ismail University, B. P. 11201 Zitoune, Meknès, Morocco
Mustapha Boutahri, Optronic and Information Treatment Team, Atomic, Mechanical, Photonic and Energy Laboratory, Faculty of Science, Moulay Ismail University, B. P. 11201 Zitoune, Meknès, Morocco
Ahmed Roukhe, Optronic and Information Treatment Team, Atomic, Mechanical, Photonic and Energy Laboratory, Faculty of Science, Moulay Ismail University, B. P. 11201 Zitoune, Meknès, Morocco
Abstract
In a complex and changing remote-sensing environment that requires quick, informed decisions, connectionist methods have made a substantial contribution, in particular to the reduction and classification of spectral data. In this context, this paper studies the parameters that optimize the results of an artificial neural network (ANN) based on the multilayer perceptron for the classification of chemical agents in multi-spectral images. The mean squared error cost function remains one of the key factors governing network convergence during the learning phase; our approach addresses it by replacing plain gradient descent with the conjugate gradient method, which proves fast and efficient.
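The idea outlined above can be illustrated with a minimal sketch (not the authors' implementation): a small multilayer perceptron trained on a mean-squared-error cost, comparing plain gradient descent with a nonlinear conjugate gradient solver (SciPy's method="CG"). The toy data, network dimensions, and hyperparameters below are illustrative assumptions only.

# Minimal sketch: MLP with MSE cost, gradient descent vs. conjugate gradient.
# All data and hyperparameters are illustrative assumptions, not the paper's setup.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))                               # 200 samples, 4 "spectral" features
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)    # binary class labels

n_in, n_hid, n_out = 4, 8, 1                                # one hidden layer

def unpack(w):
    # Slice the flat parameter vector into layer weights and biases.
    i = 0
    W1 = w[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = w[i:i + n_hid]; i += n_hid
    W2 = w[i:i + n_hid * n_out].reshape(n_hid, n_out); i += n_hid * n_out
    b2 = w[i:i + n_out]
    return W1, b1, W2, b2

def forward(w, X):
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)                                # hidden layer
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))              # sigmoid output
    return h, out

def mse(w):
    # Mean squared error cost used as the convergence criterion.
    _, out = forward(w, X)
    return 0.5 * np.mean((out - y) ** 2)

def grad(w):
    # Back-propagation of the MSE cost through the two layers.
    W1, b1, W2, b2 = unpack(w)
    h, out = forward(w, X)
    n = X.shape[0]
    d_out = (out - y) / n * out * (1 - out)                 # sigmoid derivative
    dW2 = h.T @ d_out
    db2 = d_out.sum(axis=0)
    d_h = (d_out @ W2.T) * (1 - h ** 2)                     # tanh derivative
    dW1 = X.T @ d_h
    db1 = d_h.sum(axis=0)
    return np.concatenate([dW1.ravel(), db1, dW2.ravel(), db2])

w0 = rng.normal(scale=0.1, size=n_in * n_hid + n_hid + n_hid * n_out + n_out)

# Baseline: plain gradient descent with a fixed learning rate.
w = w0.copy()
for _ in range(500):
    w -= 0.5 * grad(w)
print("gradient descent MSE:", mse(w))

# Conjugate gradient on the same cost and the same back-propagated gradient.
res = minimize(mse, w0, jac=grad, method="CG", options={"maxiter": 500})
print("conjugate gradient MSE:", res.fun)

On this toy problem the conjugate gradient run typically reaches a lower MSE in the same iteration budget, which mirrors the fast-and-efficient behavior the abstract attributes to the method; the exact numbers depend on the assumed data and initialization.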
Keywords
Optimization, Artificial Neural Networks, Classification, Identification, Conjugate Gradient, Multi-Layer Perceptron, Back-Propagation of the Gradient
To cite this article
Said El Yamani, Samir Zeriouh, Mustapha Boutahri, Ahmed Roukhe, Optimizing Back-Propagation Gradient for Classification by an Artificial Neural Network, American Journal of Physics and Applications. Vol. 2, No. 4, 2014, pp. 88-94. doi: 10.11648/j.ajpa.20140204.11