Item type |
SIG Technical Reports (1) |
Publication date |
2021-05-25 |
Title |
|
|
Title |
A Study of a Gaze Input Interface to the Surrounding Environment Using Only a Smartphone |
Title |
|
|
Language |
en |
|
Title |
A Gaze Input Interface to Surrounding Environments Using a Single Smartphone |
Language |
|
|
Language |
jpn |
Keyword |
|
|
Subject Scheme |
Other |
|
Subject |
Input and Devices |
Resource type |
|
|
Resource type identifier |
http://purl.org/coar/resource_type/c_18gh |
|
Resource type |
technical report |
Author affiliation |
|
|
|
Research Institute of Electrical Communication, Tohoku University |
Author affiliation |
|
|
|
Research Institute of Electrical Communication, Tohoku University |
Author affiliation |
|
|
|
Research Institute of Electrical Communication, Tohoku University |
Author affiliation |
|
|
|
Research Institute of Electrical Communication, Tohoku University |
Author affiliation (English) |
|
|
|
en |
|
|
Research Institute of Electrical Communication, Tohoku University |
Author affiliation (English) |
|
|
|
en |
|
|
Research Institute of Electrical Communication, Tohoku University |
Author affiliation (English) |
|
|
|
en |
|
|
Research Institute of Electrical Communication, Tohoku University |
Author affiliation (English) |
|
|
|
en |
|
|
Research Institute of Electrical Communication, Tohoku University |
Author name |
Nagai, Takahiro
Fujita, Kazuyuki
Takashima, Kazuki
Kitamura, Yoshifumi
|
Author name (English) |
Takahiro, Nagai
Kazuyuki, Fujita
Kazuki, Takashima
Yoshifumi, Kitamura
|
Abstract |
|
|
Description type |
Other |
|
Description |
Gaze is a useful input modality for estimating a user’s region of interest and pointing to a distant target, but it usually requires installing or wearing additional specialized devices. In this work, we propose a novel user interface that enables gaze input to the user’s surrounding environment using only a widely used smartphone. By simultaneously using the smartphone’s front and rear cameras together with its depth sensor, the system tracks the user’s head orientation while recognizing its own 3D position in a known 3D map. This allows the system to estimate the user’s 3D head-gaze direction toward the surrounding environment. We conducted an early performance test to evaluate the accuracy of head-gaze estimation using our interface. Based on these results, we estimated that the target size required to avoid erroneous input is 1.64 m × 0.94 m. Finally, we discussed the interactions for which the proposed interface is effective. |
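The pipeline the abstract describes (the rear camera and depth sensor localize the phone in a known 3D map; the front camera tracks the user’s head orientation in the phone’s frame) reduces to composing two transforms and casting a ray. Below is a minimal geometric sketch of that composition, not the authors’ implementation: it assumes the phone pose is available as a world-space position plus rotation matrix, and all function names are hypothetical.

```python
import numpy as np

def gaze_ray_in_world(phone_pos_w, R_phone_to_world, head_dir_phone):
    """Express a head-gaze direction measured in the phone's frame
    (front camera) as a ray in world/3D-map coordinates, using the
    phone pose recovered from the rear camera and depth sensor."""
    d = R_phone_to_world @ np.asarray(head_dir_phone, dtype=float)
    return np.asarray(phone_pos_w, dtype=float), d / np.linalg.norm(d)

def intersect_plane(origin, direction, plane_point, plane_normal):
    """Intersect the gaze ray with a planar target in the map
    (e.g. a wall); returns None if the ray is parallel or points away."""
    denom = direction @ plane_normal
    if abs(denom) < 1e-9:
        return None  # ray is parallel to the plane
    t = ((plane_point - origin) @ plane_normal) / denom
    return origin + t * direction if t > 0 else None
```

For example, a phone at the origin with an identity pose and a head-gaze along +z hits a wall 2 m away at the point directly ahead; the 1.64 m × 0.94 m target size reported above would then bound the selectable region on that wall.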
Bibliographic record ID |
|
|
Source identifier type |
NCID |
|
Source identifier |
AA1221543X |
Bibliographic information |
IPSJ SIG Technical Report: Human-Computer Interaction (HCI)
Vol. 2021-HCI-193,
No. 10,
pp. 1-4,
Issue date 2021-05-25
|
ISSN |
|
|
Source identifier type |
ISSN |
|
Source identifier |
2188-8760 |
Notice |
|
|
|
SIG Technical Reports are non-refereed and may therefore later be published in journals, at conferences, symposia, etc. |
Publisher |
|
|
Language |
ja |
|
Publisher |
Information Processing Society of Japan |