Revista Mexicana de Física
Print version ISSN 0035-001X
Rev. Mex. Fís. vol. 54, suppl. 1, México, Feb. 2008
Learning limits of an artificial neural network
J.J. Vega (a), R. Reynoso (a), and H. Carrillo Calvet (b)
(a) Departamento del Acelerador, Gerencia de Ciencias Ambientales, Instituto Nacional de Investigaciones Nucleares, Apartado Postal 181027, México D.F. 11801, México.
(b) Laboratorio de Dinámica no Lineal, Facultad de Ciencias, Universidad Nacional Autónoma de México, México D.F. 04510.
Received: 14 May 2007
Accepted: 26 October 2007
Abstract
Technological advances in hardware, as well as new computational paradigms, give us the opportunity to apply digital techniques to Pulse Shape Analysis (PSA), which requires powerful resources. In this paper, we present a PSA application based on Artificial Neural Networks (ANNs). These adaptive systems offer several advantages for such tasks; nevertheless, it is necessary to face the particular problems linked to them, such as: the selection of the learning rule and of the ANN architecture, the sizes of the training and validation data sets, overtraining, the effect of noise on the pattern-identification ability, etc. We present evidence of the effect on the performance of a backpropagation ANN used as a pattern identifier of both the magnitude of the noise present in the Bragg curve spectrometer signal and of overtraining. In fact, these two effects are related.
Keywords: Neural networks; Bragg curve spectroscopy; digital pulse-shape analysis; pattern identification.
Resumen
Technological advances in hardware, as well as new computational paradigms, offer the opportunity to apply digital techniques to Pulse Shape Analysis (PSA), which requires powerful resources. In this work, a PSA application based on Artificial Neural Networks (ANNs) is presented. These adaptive systems offer several advantages for such tasks; nevertheless, it is necessary to face the particular problems associated with them, such as: the selection of the learning rule and of the ANN architecture, the sizes of the training and validation data sets, overtraining, the effect of noise on the pattern-identification ability, etc. Evidence will be presented of the effect on the performance of a backpropagation ANN as a pattern recognizer of both the size of the noise present in the signal of a Bragg curve spectrometer and of overtraining. In fact, these two effects are related.
Descriptores: Neural networks; Bragg curve spectroscopy; digital pulse-shape analysis; pattern identification.
PACS: 07.05.Kf; 07.05.Mh; 29.40.Cs
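The abstract names two practical effects that limit a backpropagation ANN used as a pattern identifier: overtraining and the noise carried by the spectrometer signal. The Python sketch below is not the authors' code or data; it is a minimal, self-contained illustration under stated assumptions: synthetic two-class "Bragg-curve-like" pulses built from a toy Gaussian pulse model, a one-hidden-layer backpropagation network written with NumPy, and early stopping driven by a held-out validation set. All function names, the pulse model, the network size, and the noise levels are illustrative assumptions, not the spectrometer signals or training procedure used in the paper.

import numpy as np

rng = np.random.default_rng(0)

N_SAMPLES = 64  # samples per digitised pulse (illustrative assumption)

def synthetic_pulse(kind, noise):
    """Toy stand-in for a Bragg-curve pulse: class 0 peaks early, class 1 late."""
    t = np.linspace(0.0, 1.0, N_SAMPLES)
    peak = 0.35 if kind == 0 else 0.65
    clean = np.exp(-((t - peak) ** 2) / 0.02)
    return clean + noise * rng.standard_normal(N_SAMPLES)

def make_set(m, noise):
    """Build m labelled pulses at the given relative noise level."""
    y = rng.integers(0, 2, size=m)
    X = np.stack([synthetic_pulse(k, noise) for k in y])
    return X, y.astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X_tr, y_tr, X_va, y_va, hidden=8, lr=0.5, epochs=3000, patience=200):
    """Plain backpropagation for a one-hidden-layer net, with early stopping
    on the validation error to limit overtraining."""
    n_in = X_tr.shape[1]
    W1 = 0.1 * rng.standard_normal((n_in, hidden)); b1 = np.zeros(hidden)
    W2 = 0.1 * rng.standard_normal(hidden);         b2 = 0.0

    def forward(X, W1, b1, W2, b2):
        H = sigmoid(X @ W1 + b1)
        return H, sigmoid(H @ W2 + b2)

    best_err, best, wait = np.inf, (W1, b1, W2, b2), 0
    for _ in range(epochs):
        H, out = forward(X_tr, W1, b1, W2, b2)
        d_out = (out - y_tr) / len(y_tr)        # gradient of mean cross-entropy
        gW2 = H.T @ d_out
        gb2 = d_out.sum()
        d_h = np.outer(d_out, W2) * H * (1.0 - H)
        gW1 = X_tr.T @ d_h
        gb1 = d_h.sum(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2

        # The validation set decides when further training stops generalising.
        _, val_out = forward(X_va, W1, b1, W2, b2)
        val_err = np.mean((val_out > 0.5) != (y_va > 0.5))
        if val_err < best_err:
            best_err, wait = val_err, 0
            best = (W1.copy(), b1.copy(), W2.copy(), b2)
        else:
            wait += 1
            if wait > patience:   # validation error stopped improving:
                break             # stop before overtraining sets in
    return best, best_err

if __name__ == "__main__":
    for noise in (0.05, 0.2, 0.8):   # growing noise degrades identification ability
        X_tr, y_tr = make_set(200, noise)
        X_va, y_va = make_set(100, noise)
        _, err = train(X_tr, y_tr, X_va, y_va)
        print(f"relative noise {noise:.2f} -> best validation error {err:.2f}")

Running the sketch with increasing noise levels shows the best achievable validation error growing, and the early-stopping criterion triggering sooner, which is the qualitative connection between noise and overtraining that the abstract refers to.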