Dynamic topology generation of an artificial neural network of the multilayer perceptron type
DOI: https://doi.org/10.17533/udea.redin.343285

Keywords: Artificial neural networks, multi-layer perceptron, topology, architecture

Abstract
This paper presents an approximate constructive method for finding architectures of artificial neural networks (ANN) of the multi-layer perceptron (MLP) type that solve a particular problem. The method is supplemented with a forced-search technique for finding better local minima. The network is trained with a basic descending-gradient (BDG) algorithm. Techniques such as training repetition and early stopping (cross-validation) are used to improve the results. The evaluation criterion is based not only on the learning ability but also on the generalization of the architectures generated for a specific domain. Experimental results are presented to demonstrate the effectiveness of the proposed method, and these are compared with architectures found by other methods.
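The general idea of a constructive search over MLP topologies, combined with gradient-descent training and early stopping on a validation set, can be sketched as follows. This is only an illustrative sketch of the technique named in the abstract, not the paper's actual algorithm: the growth schedule, learning rate, stopping rule, and all function names here are assumptions.

```python
# Illustrative sketch (NOT the paper's algorithm): grow the hidden layer of a
# 1-hidden-layer MLP one unit at a time, train each candidate with plain batch
# gradient descent, stop training early on a validation set, and keep the
# architecture with the lowest validation error.
import numpy as np

rng = np.random.default_rng(0)

def train_mlp(X, y, Xv, yv, hidden, lr=0.5, epochs=2000, patience=50):
    """Batch gradient descent with early stopping; returns (val MSE, weights)."""
    n_in = X.shape[1]
    W1 = rng.normal(0, 0.5, (n_in, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, (hidden, 1));    b2 = np.zeros(1)
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    best_err, best_w, wait = np.inf, None, 0
    for _ in range(epochs):
        # forward pass
        h = sig(X @ W1 + b1); out = sig(h @ W2 + b2)
        # backward pass for mean-squared error with sigmoid units
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(0)
        W1 -= lr * X.T @ d_h / len(X);   b1 -= lr * d_h.mean(0)
        # early stopping: track the best validation error seen so far
        hv = sig(Xv @ W1 + b1)
        ev = np.mean((sig(hv @ W2 + b2) - yv) ** 2)
        if ev < best_err:
            best_err, best_w, wait = ev, (W1.copy(), b1.copy(), W2.copy(), b2.copy()), 0
        else:
            wait += 1
            if wait > patience:
                break
    return best_err, best_w

def constructive_search(X, y, Xv, yv, max_hidden=8):
    """Try hidden-layer sizes 1..max_hidden; keep the best-generalizing one."""
    best_h, best_err = 1, np.inf
    for h in range(1, max_hidden + 1):
        err, _ = train_mlp(X, y, Xv, yv, hidden=h)
        if err < best_err:
            best_h, best_err = h, err
    return best_h, best_err

# Toy usage: XOR (validation set reuses the training points for brevity).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([[0], [1], [1], [0]], float)
h, err = constructive_search(X, y, X, y)
print("best hidden size:", h, "val MSE:", round(err, 4))
```

In a real run the validation set would be held out from training, as the early-stopping literature the paper builds on (e.g. Prechelt's "Early Stopping - But When?") requires.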
License
Revista Facultad de Ingeniería, Universidad de Antioquia is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 (CC BY-NC-SA 4.0) license. https://creativecommons.org/licenses/by-nc-sa/4.0/deed.en
You are free to:
Share — copy and redistribute the material in any medium or format
Adapt — remix, transform, and build upon the material
Under the following terms:
Attribution — You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.
NonCommercial — You may not use the material for commercial purposes.
ShareAlike — If you remix, transform, or build upon the material, you must distribute your contributions under the same license as the original.
The material published in the journal may be distributed, copied, and exhibited by third parties provided that due credit is given to the journal. No commercial benefit may be obtained, and derivative works must be distributed under the same license terms as the original work.