Embodiment of Guidance Robot Encourages Conversation among Visitors
https://ipsj.ixsq.nii.ac.jp/records/178924
Name / File | License | Access
---|---|---
 | Copyright (c) 2017 by the Information Processing Society of Japan | Open Access
Item type | Journal(1)
---|---
Publication date | 2017-05-15
Title | Embodiment of Guidance Robot Encourages Conversation among Visitors
Language | eng
Keywords (subject scheme: Other) | [Special Issue: Understanding of Interaction and Its Fundamental/Applied Technologies] guidance robot, content distribution, embodiment, exhibition, conversation
Resource type identifier | http://purl.org/coar/resource_type/c_6501
Resource type | journal article
Author affiliations | Ritsumeikan University; Future University Hakodate; Future University Hakodate
Authors | Kohei Matsumura, Yasuyuki Sumi, Takumi Gompei
Abstract (description type: Other) | In this paper, we propose a novel way to encourage visitors to share their experiences and interests in exhibition spaces. Visitors gain experiences in an exhibition and may become aware of the meanings of the exhibits and/or the relationships among them. We believe that sharing visitors' experiences will enhance the exhibition experience for subsequent visitors, because shared experiences may include fascinating topics. To acquire the visitors' experiences, we used "PhotoChat," in-house photo-communication software that lets users communicate with others by taking photographs and adding annotations to each photograph. It also records locations to coordinate the photographs with the statistical information contained in the annotations. Since PhotoChat is designed for real-time communication, in this study we introduce a robot that inhabits the exhibition space. The robot is always on PhotoChat and acquires all of its data; it thus knows what a visitor communicates to others on PhotoChat and can share it with subsequent visitors. The robot can also use bodily actions to give instructions to the visitors. We developed a system that integrates PhotoChat into a robot, and we implemented robot behavior (i.e., bodily actions and motions) that includes recommending photographs taken by others. That is, the robot communicates with humans using both PhotoChat and its body. We held workshops to collect data and manually classified the data into three content categories. We then performed experiments using the developed system to distribute the classified content. The results showed that the robot's physical behaviors encouraged conversations between visitors based on the provided topics.
------------------------------
This is a preprint of an article intended for publication in the Journal of Information Processing (JIP). This preprint should not be cited. This article should be cited as: Journal of Information Processing Vol.25 (2017) (online), DOI: http://dx.doi.org/10.2197/ipsjjip.25.352
------------------------------
Bibliographic record ID | NCID: AN00116647
Bibliographic information | IPSJ Journal (情報処理学会論文誌), Vol. 58, No. 5, issue date 2017-05-15
ISSN | 1882-7764