
2006 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2006), Beijing, China, October 9-15, 2006


Video ID: Nv060
Title: Person Following Robot ApriAttenda(TM)

Authors: Takashi Yoshimi, Manabu Nishiyama, Takafumi Sonoura, Hideichi Nakamoto, Seiji Tokura, Hirokazu Sato, Fumio Ozaki, Nobuto Matsuhira, Hiroshi Mizoguchi

Abstract
We have developed the person following robot "ApriAttenda(TM)". This robot can accompany a person using vision-based target detection and avoid obstacles with ultrasonic sensors. The robot first identifies an individual with its image processing system by detecting a person's region and recognizing the registered color and texture of his/her clothes. Our newly developed algorithm allows the robot to extract a particular individual from a cluttered background, and to find and reconnect with the person if it loses visual contact. Tracking people with vision was realized by systematizing visual and motion control with a robust algorithm that utilizes various characteristics of the image data. ApriAttenda(TM) has been exhibited at Aichi EXPO 2005, and its robust functions and smooth person following capability were successfully demonstrated.
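The registered-appearance matching described above can be sketched as a clothing color-histogram comparison. This is an illustrative stand-in under simple assumptions (hue-only histograms, a Bhattacharyya similarity, an assumed threshold), not the actual ApriAttenda(TM) algorithm, which also uses texture:

```python
import numpy as np

def color_histogram(pixels, bins=8):
    """Normalized hue histogram of an (N, 3) array of HSV pixels.

    Only the hue channel (assumed in [0, 1)) is used, which gives
    some robustness to brightness changes.
    """
    hist, _ = np.histogram(pixels[:, 0], bins=bins, range=(0.0, 1.0))
    return hist / max(hist.sum(), 1)

def bhattacharyya(h1, h2):
    """Similarity in [0, 1]; 1.0 means identical distributions."""
    return float(np.sum(np.sqrt(h1 * h2)))

def matches_target(region_pixels, registered_hist, threshold=0.8):
    """Decide whether a candidate person region matches the registered
    clothing appearance. Below the (assumed) threshold, a robot like
    this would switch to its search-and-reconnect behaviour."""
    return bhattacharyya(color_histogram(region_pixels), registered_hist) >= threshold
```

A hue histogram registered once at startup lets the robot keep re-identifying the same person against a cluttered background, since the comparison is cheap enough to run on every frame.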

Video ID: O v013
Title: Grounded Situation Models: Where Words and Percepts Meet

Authors: Nikolaos Mavridis, Deb Roy

Abstract
This video illustrates some of the behavioral capabilities achieved using an implementation of the proposed GSM architecture on the conversational robot Ripley, as introduced in the video and further described in the accompanying paper. The situation model acts as a "theatrical stage" in the robot's mind, filled in with present, past, or imagined situations. A special triple-layered design enables bidirectional translation between the sensory data and linguistic descriptions.
In the video, you will see the robot:
1) Answering questions and serving motor commands, and verbalising uncertainty: What color are the objects?
2) "Imagining" objects when informed about their existence through language, talking about them without having seen them. Later, you will see it matching its sensory expectations with existing objects, and integrating sensory-derived information about the objects with language-derived information. Imagine a blue object on the left!
3) Remembering past events, resolving temporal referents, and answering questions about the past. How big was the blue object when your head started moving?
More videos of the system are available at the authors' site.
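The situation-model behaviour in items 1) and 2) can be sketched as an object slot whose properties may be stipulated by language ("imagined") and later overwritten by sensory evidence, with a confidence value that drives hedged verbalisation. The class, field names, and confidence threshold below are illustrative assumptions, not taken from the Ripley implementation:

```python
from dataclasses import dataclass, field

@dataclass
class Property:
    value: str
    confidence: float  # 0.0 .. 1.0
    source: str        # "language" or "vision"

@dataclass
class SituatedObject:
    """One hypothetical object slot on the situation model's 'stage'."""
    properties: dict = field(default_factory=dict)

    def inform(self, name, value):
        # Language-derived: the robot is certain of what was said,
        # so it can talk about the object before ever seeing it.
        self.properties[name] = Property(value, 1.0, "language")

    def observe(self, name, value, confidence):
        # Sensory-derived evidence replaces the imagined value once
        # the expected object is matched against a real percept.
        self.properties[name] = Property(value, confidence, "vision")

    def describe(self, name):
        # Verbalise, hedging when confidence is low (threshold assumed).
        p = self.properties.get(name)
        if p is None:
            return "I don't know."
        hedge = "" if p.confidence > 0.9 else "probably "
        return f"It is {hedge}{p.value}."
```

Under this sketch, "Imagine a blue object on the left!" becomes an `inform` call, and the later visual match becomes an `observe` call that integrates the two sources in one slot.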

Video ID: Oiv017
Title: A Study on the Development of Ubiquitous CellPhone Robot

Authors: Seungwoo Kim, Dongik Oh, Dongwook Kim, Yongrae Jung, Jaeil Choe, Jungwook Choi

Abstract
In this video clip, the development of the ubiquitous CPR (Cellular Phone Robot), which adds personal robotic features such as mobility and emotion to a CP (Cellular Phone), is presented. We describe a prototype implementation of the CPR and demonstrate its successful navigational ability. CPR movements are controlled by data acquired from velocity sensors and by coordinates read from RFID tags on the floor. Emotion modules of the CPR interaction system stimulate the human touch sense through vibrations, the olfactory sense through perfumes, and the visual sense through motion patterns. Human biometric signals can be effectively measured with ECG and EDR; experimental results show that this easy-to-mount equipment is sufficient for generating reliable signal measurements. People will rely more and more on robotic assistants in the near future. Personal robots will lead the way to this new era, and the CPR provides a stepping stone.
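The navigation scheme above combines two sources: velocity sensors for dead reckoning between tags, and floor RFID tags for absolute position fixes. A minimal sketch of that fusion, with assumed tag coordinates and a simple replace-on-fix rule rather than the authors' actual controller, might look like:

```python
# Known floor positions of RFID tags, in meters (illustrative values).
TAG_POSITIONS = {"tag_A": (0.0, 0.0), "tag_B": (1.0, 0.0)}

def update_pose(pose, vx, vy, dt, tag_id=None):
    """Advance an (x, y) estimate by integrating velocity-sensor
    readings over dt; whenever a known RFID tag is read, snap the
    drifting dead-reckoning estimate to the tag's coordinate."""
    if tag_id in TAG_POSITIONS:
        return TAG_POSITIONS[tag_id]
    x, y = pose
    return (x + vx * dt, y + vy * dt)
```

The design point is that dead reckoning alone accumulates error, while the sparse tag grid bounds that error each time the robot drives over a tag.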

IROS2006 Video Digest