
Distributed K-FAC Over Unstable Networks (unreferred)

https://ipsj.ixsq.nii.ac.jp/records/237574
Name / File License
IPSJ-HPC24195013.pdf (1.7 MB)
Available for download from August 1, 2026.
Copyright (c) 2024 by the Information Processing Society of Japan
Non-member: ¥660, IPSJ member: ¥330, HPC member: ¥0, DLIB member: ¥0
Item type SIG Technical Reports (1)
Release date 2024-08-01
Title
Title Distributed K-FAC Over Unstable Networks (unreferred)
Language en
Language
Language eng
Keywords
Subject scheme Other
Subject 深層学習 (deep learning)
Resource type
Resource type identifier http://purl.org/coar/resource_type/c_18gh
Resource type technical report
Author affiliation
Graduate School of Science and Technology, University of Tsukuba
Author affiliation
Center for Computational Sciences, University of Tsukuba
Author affiliation (English)
en
Graduate School of Science and Technology, University of Tsukuba
Author affiliation (English)
en
Center for Computational Sciences, University of Tsukuba
Author name Mingzhe, Yu
Author name Osamu, Tatebe
Author name (English) Mingzhe, Yu
Author name (English) Osamu, Tatebe
Abstract
Description type Other
Description This study introduces a novel fault-tolerance approach for distributed training of deep learning models using the K-FAC optimizer under unstable network conditions. Inspired by the Federated Averaging algorithm, our method departs from traditional Distributed Data Parallel frameworks by periodically averaging weights among online nodes. We demonstrate that this approach significantly mitigates the impact of network instability on model training, effectively maintaining model accuracy even under high offline probabilities. The experimental results reveal that offline probability greatly affects test accuracy, while the maximum duration of offline iterations and the number of concurrently offline nodes exert lesser impacts. Our findings suggest that the block-diagonal approximation of the Fisher Information Matrix (FIM) used in K-FAC remains effective for guiding gradient descent, even with heterogeneous and outdated information. The study lays the groundwork for applying these insights to asynchronous training and federated learning in similar conditions.
Abstract (English)
Description type Other
Description This study introduces a novel fault-tolerance approach for distributed training of deep learning models using the K-FAC optimizer under unstable network conditions. Inspired by the Federated Averaging algorithm, our method departs from traditional Distributed Data Parallel frameworks by periodically averaging weights among online nodes. We demonstrate that this approach significantly mitigates the impact of network instability on model training, effectively maintaining model accuracy even under high offline probabilities. The experimental results reveal that offline probability greatly affects test accuracy, while the maximum duration of offline iterations and the number of concurrently offline nodes exert lesser impacts. Our findings suggest that the block-diagonal approximation of the Fisher Information Matrix (FIM) used in K-FAC remains effective for guiding gradient descent, even with heterogeneous and outdated information. The study lays the groundwork for applying these insights to asynchronous training and federated learning in similar conditions.
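
The abstract describes the approach only at a high level. As a rough, hypothetical illustration of the FedAvg-style rule it mentions (periodically averaging weights among the nodes that are currently online, rather than all-reducing gradients at every step as in Distributed Data Parallel), here is a minimal Python sketch. It is not the authors' code: the constants (NUM_WORKERS, OFFLINE_PROB, AVG_INTERVAL) and the toy quadratic objective are assumptions, and plain SGD stands in for the K-FAC optimizer used in the paper.

# Minimal sketch (not the authors' code) of FedAvg-style periodic weight
# averaging among online workers under an unstable network.  All constants
# and the toy objective are assumptions; plain SGD replaces K-FAC here.
import numpy as np

NUM_WORKERS = 8      # number of training nodes (assumed)
OFFLINE_PROB = 0.3   # per-iteration probability a node drops offline (assumed)
AVG_INTERVAL = 10    # average weights among online nodes every k steps (assumed)
STEPS = 200
LR = 0.1
DIM = 4

rng = np.random.default_rng(0)
target = rng.normal(size=DIM)                            # optimum of the toy loss
weights = [np.zeros(DIM) for _ in range(NUM_WORKERS)]    # per-worker model replicas

def local_step(w):
    """One SGD step on the toy quadratic loss 0.5 * ||w - target||^2."""
    grad = w - target
    return w - LR * grad

for t in range(1, STEPS + 1):
    # Each worker independently drops offline with probability OFFLINE_PROB.
    online = [i for i in range(NUM_WORKERS) if rng.random() > OFFLINE_PROB]

    # Online workers train locally; offline workers keep stale weights.
    for i in online:
        weights[i] = local_step(weights[i])

    # Periodically average weights among the currently online workers only.
    if t % AVG_INTERVAL == 0 and online:
        avg = np.mean([weights[i] for i in online], axis=0)
        for i in online:
            weights[i] = avg.copy()

final = np.mean(weights, axis=0)
print("distance to optimum:", np.linalg.norm(final - target))

In this toy setting, a worker that drops offline simply keeps stale weights and rejoins at the next averaging round it is online for, which is where the offline probability, maximum offline duration, and number of concurrently offline nodes studied in the paper come into play.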
Bibliographic record ID
Source identifier type NCID
Source identifier AN10463942
Bibliographic information 研究報告ハイパフォーマンスコンピューティング(HPC) (IPSJ SIG Technical Reports: High Performance Computing)
Volume 2024-HPC-195, Issue 13, pp. 1-7, Issue date 2024-08-01
ISSN
Source identifier type ISSN
Source identifier 2188-8841
Notice
SIG Technical Reports are nonrefereed and hence may later appear in any journals, conferences, symposia, etc.
Publisher
Language ja
Publisher 情報処理学会 (Information Processing Society of Japan)