Video ID: Nv060
Title: Person Following Robot ApriAttenda(TM)
Authors: Takashi Yoshimi, Manabu Nishiyama, Takafumi Sonoura, Hideichi Nakamoto, Seiji Tokura, Hirokazu Sato, Fumio Ozaki, Nobuto Matsuhira, Hiroshi Mizoguchi
Abstract
We have developed the person following robot "ApriAttenda(TM)". This robot can accompany a person using vision-based target detection and avoid obstacles with ultrasonic sensors. The robot first identifies an individual with its image processing system by detecting a person's region and recognizing the registered color and texture of his/her clothes. Our newly developed algorithm allows the robot to extract a particular individual from a cluttered background, and to find and reconnect with the person if it loses visual contact. Tracking people with vision was realized by systematizing visual and motion control with a robust algorithm that utilizes various characteristics of the image data. ApriAttenda(TM) has been exhibited at Aichi EXPO 2005, and its robust functions and smooth person following capability were successfully demonstrated.
Video ID: O v013
Title: Grounded Situation Models: Where Words and Percepts Meet
Authors: Nikolaos Mavridis, Deb Roy
Abstract
This video illustrates some of the behavioral capabilities achieved using an implementation of the proposed GSM architecture on the conversational robot Ripley, as introduced in the video and as further described in the accompanying paper. The situation model acts as a "theatrical stage" in the robot's mind, filled in with present, past, or imagined situations. A special triple-layered design enables bidirectional translation between the sensory data and linguistic descriptions. In the video, you will see the robot:
1) Answering questions, serving motor commands, and verbalizing uncertainty: "What color are the objects?"
2) "Imagining" objects when informed about their existence through language, and talking about them without having seen them. Later, you will see the robot matching its sensory expectations with existing objects, and integrating sensory-derived information about the objects with language-derived information: "Imagine a blue object on the left!"
3) Remembering past events, resolving temporal referents, and answering questions about the past: "How big was the blue object when your head started moving?"
More videos of the system are available at the author's site.
Video ID: Oiv017
Title: A Study on the Development of a Ubiquitous Cell Phone Robot
Authors: Seungwoo Kim, Dongik Oh, Dongwook Kim, Yongrae Jung, Jaeil Choe, Jungwook Choi
Abstract
In this video clip, the development of a ubiquitous CPR (Cellular Phone Robot), which adds personal robotic features such as mobility and emotion to a CP (Cellular Phone), is presented. We described a prototype implementation of the CPR and demonstrated its successful navigational ability. CPR movements are controlled by the data acquired from velocity sensors and the coordinates from RFID tags on the floor. Emotion modules of the CPR interaction system stimulate the human touch sense through vibrations, the olfactory sense through perfumes, and the visual sense through motion patterns. Biometric signals of humans can be effectively measured with ECG and EDR. Experimental results show that this equipment, which is easy to mount on the CPR, is sufficient for generating reliable signal measurements. People will rely more and more on robotic assistants in the near future. Personal robots will lead the way to the new era; the CPR provides a stepping stone.
IROS 2006 Video Digest