Sense of Touch in Robots with Self-Organizing Maps
Author
Summary, in English
We review a number of self-organizing robot systems that extract features from haptic sensory information. All of them are based on self-organizing maps (SOMs). First, we describe systems based on a three-fingered robot hand, the Lund University Cognitive Science (LUCS) Haptic-Hand II, that successfully extract the shapes of objects. These systems explore each object with a sequence of grasps, superimposing the information from the individual grasps after cross-coding the proprioceptive information for different parts of the hand with the registrations of the tactile sensors. The cross-coding employs either the tensor-product operation or a novel self-organizing neural network, the tensor multiple-peak SOM (T-MPSOM). Second, we present a proprioception-based system that uses an anthropomorphic robot hand, the LUCS Haptic-Hand III, and that distinguishes objects by both shape and size. Third, we present systems that extract and combine the texture and hardness properties of explored materials.
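The abstract's core mechanism can be illustrated with a minimal sketch: two modality vectors (proprioceptive and tactile) are cross-coded with a tensor (outer) product, and the fused vectors train a standard SOM. This is not the paper's implementation; the grid size, learning schedule, and the feature dimensions (`proprio`, `tactile`) are illustrative assumptions, and random data stands in for real sensor readings.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_som(data, grid=(8, 8), epochs=20, lr0=0.5, sigma0=3.0):
    """Minimal SOM: each grid node holds a weight vector; the best-matching
    unit (BMU) and its grid neighbourhood are pulled toward each input."""
    h, w = grid
    dim = data.shape[1]
    weights = rng.random((h, w, dim))
    # grid coordinates, used for neighbourhood distances on the map
    ys, xs = np.mgrid[0:h, 0:w]
    coords = np.stack([ys, xs], axis=-1).astype(float)
    n_steps = epochs * len(data)
    step = 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            t = step / n_steps
            lr = lr0 * (1.0 - t)            # linearly decaying learning rate
            sigma = sigma0 * (1.0 - t) + 1e-3  # shrinking neighbourhood width
            # BMU: the node whose weight vector is closest to the input
            d = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(d), (h, w))
            # Gaussian neighbourhood around the BMU on the grid
            g = np.exp(-np.sum((coords - bmu) ** 2, axis=-1) / (2 * sigma ** 2))
            weights += lr * g[..., None] * (x - weights)
            step += 1
    return weights

# Cross-coding via the tensor (outer) product of the two modalities.
proprio = rng.random((50, 4))   # hypothetical joint-angle features per grasp
tactile = rng.random((50, 6))   # hypothetical tactile-sensor readings per grasp
fused = np.einsum('ni,nj->nij', proprio, tactile).reshape(50, -1)

som = train_som(fused)          # trained map over the fused feature space
```

After training, each grasp maps to its BMU on the grid, so similar objects cluster in neighbouring map regions; the T-MPSOM variant described in the paper learns the cross-coding itself rather than using a fixed outer product.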
Department/s
Publishing year
2011
Language
English
Pages
498-507
Publication/Series
IEEE Transactions on Robotics
Volume
27
Issue
3
Document type
Journal article
Publisher
IEEE - Institute of Electrical and Electronics Engineers Inc.
Topic
- Computer Vision and Robotics (Autonomous Systems)
Keywords
- Cognitive robotics
- manipulators
- self-organizing feature maps
- tactile sensors
- unsupervised learning
Status
Published
Project
- Ikaros: An infrastructure for system level modelling of the brain
- Thinking in Time: Cognition, Communication and Learning
Research group
- Lund University Cognitive Science (LUCS)
ISBN/ISSN/Other
- ISSN: 1941-0468