SciELO - Scientific Electronic Library Online


Computación y Sistemas

On-line version ISSN 2007-9737. Print version ISSN 1405-5546

Abstract

LOPEZ-JUAREZ, Ismael et al. Fast Object Recognition for Grasping Tasks using Industrial Robots. Comp. y Sist. [online]. 2012, vol.16, n.4, pp.421-432. ISSN 2007-9737.

Working in unstructured robotic assembly environments, i.e. with unknown part locations, the robot has not only to locate the part accurately, but also to recognize it in readiness for grasping. The aim of this research is to develop a fast and robust approach to accomplish this task. We propose an approach for learning assembly parts on-line, based on artificial neural networks (ANN) and a reduced set of recurrent training patterns, which speeds up the recognition task compared with our previous work. Experimental learning results using a fast camera are presented. Some simple parts (circular, squared, and radiused-square) were used to compare different connectionist models (Backpropagation, Perceptron, and FuzzyARTMAP) and to select the appropriate model. Later, during experiments, complex figures were learned using the chosen FuzzyARTMAP algorithm, showing 93.8% overall efficiency and a 100% recognition rate. Recognition times were below 1 ms, which clearly indicates the suitability of the approach for implementation in real-world operations.
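To illustrate the kind of connectionist comparison the abstract describes, the following is a minimal sketch (not the authors' code) of a binary Perceptron trained on hypothetical invariant shape descriptors. The feature names (compactness, corner response) and all data values are assumptions chosen for illustration, standing in for whatever invariant descriptors the vision front end would actually extract.

```python
# Illustrative sketch only: a minimal Perceptron classifier over
# hypothetical invariant shape descriptors (compactness, corner
# response). The features and data are invented for illustration;
# the paper's actual descriptors and models are not reproduced here.

def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """Train weights and bias for a binary perceptron."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred  # perceptron update rule
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def classify(w, b, x):
    """Return 0 or 1 for a descriptor vector x."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Toy descriptors: circular parts have high compactness and a low
# corner response; squared parts the opposite (assumed values).
circles = [[0.95, 0.10], [0.90, 0.15], [0.92, 0.05]]  # label 0
squares = [[0.78, 0.80], [0.75, 0.90], [0.80, 0.85]]  # label 1
X = circles + squares
y = [0, 0, 0, 1, 1, 1]
w, b = train_perceptron(X, y)
```

A linearly separable two-class case like this is where a plain Perceptron suffices; the abstract's point is that FuzzyARTMAP handled the harder, more complex figures better in their experiments.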

Keywords : Artificial neural networks; invariant object recognition; machine vision; robotics.

        · abstract in Spanish     · text in English


Creative Commons License: All the contents of this journal, except where otherwise noted, are licensed under a Creative Commons License.