Enhanced robotic hand-eye coordination inspired from human-like behavioral patterns

Authors and organisations
  • Fei Chao (Author)
    Xiamen University
  • Zuyuan Zhu (Author)
    University of Essex
  • Chih-Min Lin (Author)
    Xiamen University
  • Huosheng Hu (Author)
    Xiamen University
  • Longzhi Yang (Author)
    Northumbria University
  • Changjing Shang (Author)
  • Changle Zhou (Author)
    Xiamen University
Type: Article
Original language: English
Pages (from-to): 384-396
Journal: IEEE Transactions on Cognitive and Developmental Systems
Volume: 10
Issue number: 2
DOI
Publication status: Published - 18 Oct 2016

Abstract

Robotic hand-eye coordination is recognized as an important skill for dealing with complex real-world environments. Conventional robotic hand-eye coordination methods merely transfer stimulus signals from the robot's visual space to its hand actuator space. This paper introduces a reverse method: a second channel is built that transfers stimulus signals from the robot's hand space back to its visual space. Based on this reverse channel, a human-like behavioral pattern, "Stop-to-Fixate", is imparted to the robot, giving it an enhanced reaching ability. A visual processing system inspired by the structure of the human retina compresses visual information so as to reduce the robot's learning complexity, and two constructive neural networks establish the two sensory delivery channels. The experimental results demonstrate that the robotic system gradually acquires a reaching ability. In particular, when the robotic hand touches an unseen object, the reverse channel successfully drives the visual system to notice that object.
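The abstract describes a two-channel architecture: a forward channel mapping visual space to hand actuator space for reaching, and a reverse channel mapping hand space back to visual space to drive fixation. The sketch below illustrates that idea only in outline; the planar-arm kinematics, the 2-D gaze coordinates, and the use of scikit-learn MLPRegressor in place of the paper's constructive neural networks and retina-inspired visual compression are all illustrative assumptions, not the authors' implementation.

```python
"""Minimal sketch of forward (eye -> hand) and reverse (hand -> eye) mapping
channels, loosely following the two-channel idea in the abstract.

Assumptions (not from the paper): a 3-joint planar arm with unit links,
2-D visual coordinates, and fixed-size MLPs as stand-ins for the
constructive neural networks used by the authors.
"""

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Toy kinematics: map joint angles (N, 3) to fingertip positions (N, 2).
def forward_kinematics(joints):
    cumulative = np.cumsum(joints, axis=1)
    x = np.cos(cumulative).sum(axis=1)
    y = np.sin(cumulative).sum(axis=1)
    return np.stack([x, y], axis=1)

# Self-generated training data (motor babbling): random postures and the
# fingertip positions they produce; assume the eye fixates the hand.
joints = rng.uniform(-np.pi / 2, np.pi / 2, size=(5000, 3))
gaze = forward_kinematics(joints)

# Forward channel: visual target -> joint command (reaching).
forward_net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
forward_net.fit(gaze, joints)

# Reverse channel: hand/joint state -> visual fixation point ("Stop-to-Fixate").
reverse_net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
reverse_net.fit(joints, gaze)

# Using the reverse channel: the hand touches an unseen object, and its
# joint posture at contact is mapped back to a predicted fixation point.
touch_joints = rng.uniform(-np.pi / 2, np.pi / 2, size=(1, 3))
predicted_fixation = reverse_net.predict(touch_joints)
actual_contact = forward_kinematics(touch_joints)
print("predicted fixation:  ", predicted_fixation.round(2))
print("actual contact point:", actual_contact.round(2))
```

In this toy setting the reverse network, trained only on self-generated motor babbling, recovers an approximate fixation point from the joint posture at the moment of contact, which is the role the "Stop-to-Fixate" pattern plays in the paper's experiments.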

Keywords

  • constructive neural network, robotic hand-eye coordination, sensory-motor reverse mapping, human-like behavioral pattern