Automated cross-modal mapping in robotic eye/hand systems using plastic radial basis function networks

Type: Article
Original language: English
Pages (from-to): 25-52
Number of pages: 28
Journal: Connection Science
Volume: 19
Issue number: 1
DOI
Publication status: Published - 2007

Abstract

Advanced autonomous artificial systems will need incremental learning and adaptive abilities similar to those seen in humans. Knowledge from biology, psychology and neuroscience is now inspiring new approaches for systems that have sensory-motor capabilities and operate in complex environments. Eye/hand coordination is an important cross-modal cognitive function, and is typical of the many other coordination tasks involved in the control and operation of embodied intelligent systems. This paper examines a biologically inspired approach for incrementally constructing compact mapping networks for eye/hand coordination. We present a simplified node-decoupled extended Kalman filter for radial basis function networks, and compare it with other learning algorithms. An experimental system consisting of a robot arm and a pan-and-tilt head with a colour camera is used to test the algorithms and produce the results reported in this paper. We also present three approaches for adapting to structural changes during eye/hand coordination tasks, and the robustness of the algorithms under noise is investigated. The learning and adaptation approaches in this paper have similarities with current ideas about neural growth in the brains of humans and animals during tool use, and of infants during early cognitive development.
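The abstract refers to a simplified node-decoupled extended Kalman filter (NDEKF) for training radial basis function networks. A minimal sketch of that idea is shown below for a one-dimensional mapping with fixed Gaussian centres, where only the linear output weights are adapted and each node keeps its own scalar error covariance. All names, the toy target function, and the parameter values here are illustrative assumptions, not the paper's actual eye/hand configuration or algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D target mapping, a stand-in for one axis of an
# eye-to-hand coordinate map.
def target(x):
    return np.sin(2.0 * x)

# Fixed Gaussian RBF centres and width (assumption: only output
# weights adapt; centre placement and growth are not modelled here).
centres = np.linspace(-np.pi, np.pi, 12)
width = 0.5

def phi(x):
    """Activations of all RBF nodes for a scalar input x."""
    return np.exp(-((x - centres) ** 2) / (2.0 * width ** 2))

w = np.zeros_like(centres)       # one output weight per node
P = np.full_like(centres, 10.0)  # per-node covariance (decoupled)
R = 0.01                         # assumed measurement-noise variance

for _ in range(2000):
    x = rng.uniform(-np.pi, np.pi)
    h = phi(x)                   # Jacobian of output w.r.t. each weight
    e = target(x) - w @ h        # innovation (prediction error)
    S = R + np.sum(h * h * P)    # shared innovation variance
    K = P * h / S                # per-node Kalman gains
    w += K * e                   # weight update
    P -= K * h * P               # decoupled covariance update

# Evaluate the trained network on a grid inside the training range.
test_x = np.linspace(-3.0, 3.0, 50)
pred = np.array([w @ phi(xx) for xx in test_x])
max_err = float(np.max(np.abs(target(test_x) - pred)))
```

The decoupling keeps only a per-node covariance instead of a full joint covariance matrix, which reduces the cost of each update at the price of ignoring cross-node correlations; this is the kind of trade-off the abstract's "simplified" filter suggests, though the paper's exact formulation may differ.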

Keywords

  • Biologically inspired robot learning, Extended Kalman filter, Plasticity in radial basis function networks, Robotics