Robot Gesture Generation from Environmental Sounds Using Inter-modality Mapping

Deposited 2006-07-23. Repository URL: http://cogprints.org/id/eprint/4990

We propose a motion generation model in
which robots infer the sound source of an
environmental sound and imitate its motion.
Sharing environmental sounds between humans
and robots enables them to share environmental
information. However, it is difficult to transmit
environmental sounds directly in human-robot
communication. We approached this problem
by focusing on iconic gestures. Concretely,
robots infer the motion of the
sound-source object and map it onto the robot
motion. This method enabled robots to imitate
the motion of the sound source using
their bodies.

Yuya Hattori, Hideki Kozima, Kazunori Komatani, Tetsuya Ogata, Hiroshi G. Okuno
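The abstract does not specify how the inferred sound-source motion is mapped onto the robot. As a purely illustrative sketch, one simple choice is a linear rescaling of an estimated 2-D source trajectory into a hypothetical reachable workspace; the function and workspace bounds below are assumptions, not the authors' method.

```python
def map_trajectory(source_traj, workspace_min=-1.0, workspace_max=1.0):
    """Linearly rescale an estimated sound-source trajectory (list of
    (x, y) points) into the robot's workspace so the robot can imitate
    the motion with its own body.  Illustrative assumption only."""
    xs = [p[0] for p in source_traj]
    ys = [p[1] for p in source_traj]

    def rescale(vals):
        lo, hi = min(vals), max(vals)
        span = (hi - lo) or 1.0  # avoid division by zero for static sources
        return [workspace_min + (v - lo) * (workspace_max - workspace_min) / span
                for v in vals]

    return list(zip(rescale(xs), rescale(ys)))

# Example: a trajectory of a falling object, as might be inferred from its sound
traj = [(0.0, 2.0), (0.1, 1.0), (0.2, 0.0)]
robot_motion = map_trajectory(traj)
```

The rescaling preserves the shape of the source motion while fitting it to the robot's range of movement, which is the essence of the inter-modality mapping idea described above.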