Item type | Symposium(1)
Publication date | 2022-01-28
Title | Weight Exchange in Decentralized Distributed Machine Learning for Resource-Constrained IoT Edges
Title language | en
Language | eng
Resource type identifier | http://purl.org/coar/resource_type/c_5794
Resource type | conference paper
Author affiliation | Tokyo Institute of Technology
Author affiliation | Tokyo Institute of Technology
Author affiliation (English) | Tokyo Institute of Technology
Author affiliation (English) | Tokyo Institute of Technology
Author name | Naoya, Yokota; Yuko, Hara-Azumi
Author name (English) | Naoya, Yokota; Yuko, Hara-Azumi
Abstract |
Description type | Other
Description | Although the Gossip Stochastic Gradient Descent (SGD) algorithm is known to be suitable for decentralized distributed machine learning, it suffers from a non-convergence problem when datasets are heterogeneous across devices. In this paper, we propose Gossip Swap SGD, which addresses this problem by employing a weight-swapping method between devices. Our evaluation demonstrated that the proposed method achieves higher accuracy than the original Gossip SGD without increasing the computation load.
Abstract (English) |
Description type | Other
Description | Although the Gossip Stochastic Gradient Descent (SGD) algorithm is known to be suitable for decentralized distributed machine learning, it suffers from a non-convergence problem when datasets are heterogeneous across devices. In this paper, we propose Gossip Swap SGD, which addresses this problem by employing a weight-swapping method between devices. Our evaluation demonstrated that the proposed method achieves higher accuracy than the original Gossip SGD without increasing the computation load.
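Note | The abstract names the key idea (exchanging, rather than averaging, model weights between devices) but the record itself gives no algorithmic detail. The sketch below is a rough, hypothetical illustration only, not the paper's implementation: it contrasts a standard gossip-style averaging exchange with a plain weight swap between two devices, with all function names, the toy training loop, and the random "gradients" being assumptions.

```python
# Hypothetical sketch (not the paper's code): gossip averaging vs. weight swap.
import numpy as np

def local_sgd_step(weights, grad, lr=0.01):
    # One local SGD update on a device's own (possibly non-IID) data.
    return weights - lr * grad

def gossip_average(w_a, w_b):
    # Gossip-SGD-style exchange: both devices keep the average of the two models.
    avg = 0.5 * (w_a + w_b)
    return avg, avg.copy()

def gossip_swap(w_a, w_b):
    # Weight swap: the devices exchange models outright, so each model is
    # subsequently trained on the other device's data distribution.
    return w_b.copy(), w_a.copy()

# Toy usage: random vectors stand in for real gradients and model weights.
rng = np.random.default_rng(0)
w_a, w_b = rng.normal(size=4), rng.normal(size=4)
for _ in range(3):
    w_a = local_sgd_step(w_a, rng.normal(size=4))
    w_b = local_sgd_step(w_b, rng.normal(size=4))
    w_a, w_b = gossip_swap(w_a, w_b)   # or gossip_average(w_a, w_b)
```

As the abstract states, the exchange itself adds no extra training computation on either device; the swap only changes which device continues training which model, so over time each model is exposed to every device's local data distribution.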
Bibliographic information | Proceedings of Asia Pacific Conference on Robot IoT System Development and Platform, Vol. 2021, pp. 94-95, issued 2022-01-28
Publisher | 情報処理学会 (Information Processing Society of Japan)
Publisher language | ja