Item type |
SIG Technical Reports(1)
Publication date |
2021-07-13
Title |
Analysis of Optimised Transformer Models in Image Captioning Tasks
Language |
eng
Keywords |
Subject Scheme |
Other
Subject |
Deep learning
Resource type |
technical report
Resource type identifier |
http://purl.org/coar/resource_type/c_18gh
Author affiliation |
Hochschule für Technik und Wirtschaft Berlin - University of Applied Sciences / Presently with Fujitsu Limited Japan
Author affiliation |
Fujitsu Limited Japan
Author affiliation |
Fujitsu Limited Japan
Author affiliation |
Fujitsu Limited Japan
Author names |
Maximilian, Zimmermann
Thang, Dang
Tsuguchika, Tabaru
Atsushi, Ike
Abstract |
Description type |
Other
Description |
This work applies Transformer models, first introduced in the paper "Attention is All You Need", to a multimodal task, specifically image captioning. By treating image captioning as an NLP translation task, different Transformer models are evaluated and optimised. Through analysis of the data, model, and distributed communication pipeline, bottlenecks are identified, and performance improvements in both accuracy and speed are demonstrated across multiple accelerators.
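As a rough illustration of the approach the abstract describes, the sketch below frames image captioning as a sequence-to-sequence "translation" task with a standard encoder-decoder Transformer: pre-extracted image features act as the source sequence and caption tokens as the target. This is not the authors' implementation; PyTorch is assumed only for convenience, and all module names, feature dimensions, and vocabulary sizes are illustrative assumptions.

```python
import torch
import torch.nn as nn

class CaptioningTransformer(nn.Module):
    """Illustrative encoder-decoder Transformer for image captioning (hypothetical sizes)."""
    def __init__(self, feat_dim=2048, vocab_size=10000, d_model=512, nhead=8,
                 num_layers=6, max_len=64):
        super().__init__()
        self.feat_proj = nn.Linear(feat_dim, d_model)        # map image features to model width
        self.tok_embed = nn.Embedding(vocab_size, d_model)   # caption token embeddings
        self.pos_embed = nn.Embedding(max_len, d_model)      # learned positional embeddings
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=num_layers, num_decoder_layers=num_layers,
            batch_first=True)
        self.out = nn.Linear(d_model, vocab_size)            # next-token prediction head

    def forward(self, img_feats, captions):
        # img_feats: (batch, num_regions, feat_dim); captions: (batch, seq_len) token ids
        src = self.feat_proj(img_feats)
        pos = torch.arange(captions.size(1), device=captions.device)
        tgt = self.tok_embed(captions) + self.pos_embed(pos)
        # causal mask so each caption position attends only to earlier positions
        tgt_mask = self.transformer.generate_square_subsequent_mask(
            captions.size(1)).to(captions.device)
        hidden = self.transformer(src, tgt, tgt_mask=tgt_mask)
        return self.out(hidden)

# Toy forward pass: 2 images with 49 region features each, captions of length 16.
model = CaptioningTransformer()
logits = model(torch.randn(2, 49, 2048), torch.randint(0, 10000, (2, 16)))
print(logits.shape)  # torch.Size([2, 16, 10000])
```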
Bibliographic record ID |
Source identifier type |
NCID
Source identifier |
AN10463942
Bibliographic information |
IPSJ SIG Technical Report on High Performance Computing (HPC)
Vol. 2021-HPC-180,
No. 11,
pp. 1-6,
Issue date 2021-07-13
ISSN |
Source identifier type |
ISSN
Source identifier |
2188-8841
Notice |
SIG Technical Reports are nonrefereed and hence may later appear in any journals, conferences, symposia, etc.
Publisher |
Information Processing Society of Japan (情報処理学会)