
Item

  1. Symposium
  2. Symposium Series
  3. Asia Pacific Conference on Robot IoT System Development and Platform (APRIS)
  4. 2021

Weight Exchange in Decentralized Distributed Machine Learning for Resource-Constrained IoT Edges

https://ipsj.ixsq.nii.ac.jp/records/216195
61c545e7-9b5d-4bfb-9209-c84d4bcfdbee
Name / File | License | Action
IPSJ-APRIS2021019.pdf (1.2 MB)
Copyright (c) 2022 by the Information Processing Society of Japan
Open Access
Item type: Symposium(1)
Publication date: 2022-01-28
Title: Weight Exchange in Decentralized Distributed Machine Learning for Resource-Constrained IoT Edges
Title (en): Weight Exchange in Decentralized Distributed Machine Learning for Resource-Constrained IoT Edges
Language: eng
Resource type identifier: http://purl.org/coar/resource_type/c_5794
Resource type: conference paper
Author affiliation: Tokyo Institute of Technology
Author affiliation: Tokyo Institute of Technology
Author affiliation (en): Tokyo Institute of Technology
Author affiliation (en): Tokyo Institute of Technology
Author name: Naoya, Yokota
Author name: Yuko, Hara-Azumi
Author name (en): Naoya, Yokota
Author name (en): Yuko, Hara-Azumi
Abstract
Description type: Other
Description: Although the Gossip Stochastic Gradient Descent (SGD) algorithm is known to be suitable for decentralized distributed machine learning, it suffers from non-convergence when datasets are heterogeneous across devices. In this paper, we propose Gossip Swap SGD, which addresses this problem by employing a weight-swapping method between devices. Our evaluation demonstrated that the proposed method achieves higher accuracy than the original Gossip SGD without increasing the computation load.
Abstract (en)
Description type: Other
Description: Although the Gossip Stochastic Gradient Descent (SGD) algorithm is known to be suitable for decentralized distributed machine learning, it suffers from non-convergence when datasets are heterogeneous across devices. In this paper, we propose Gossip Swap SGD, which addresses this problem by employing a weight-swapping method between devices. Our evaluation demonstrated that the proposed method achieves higher accuracy than the original Gossip SGD without increasing the computation load.
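The weight-exchange idea described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the function names, the flat-list weight representation, and the two-device setting are all hypothetical, chosen only to contrast averaging (standard Gossip SGD) with swapping (Gossip Swap SGD).

```python
def local_sgd_step(weights, grads, lr=0.1):
    """One local SGD update on a device's own weight vector."""
    return [w - lr * g for w, g in zip(weights, grads)]

def gossip_average(w_a, w_b):
    """Standard Gossip SGD mixing: both devices move to the average
    of their weights. With heterogeneous local datasets this mixing
    can fail to converge, which is the problem the paper targets."""
    avg = [(a + b) / 2.0 for a, b in zip(w_a, w_b)]
    return list(avg), list(avg)

def gossip_swap(w_a, w_b):
    """Weight swapping: the two devices exchange whole weight vectors,
    so each model is next trained on the other device's data. This
    adds communication but no extra computation on either device."""
    return list(w_b), list(w_a)
```

After a swap, each device continues calling `local_sgd_step` on the weights it received, so over many rounds every model is exposed to every device's local data distribution.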
Bibliographic information: Proceedings of Asia Pacific Conference on Robot IoT System Development and Platform

Volume 2021, p. 94-95, Issue date 2022-01-28
Publisher
Language: ja
Publisher: Information Processing Society of Japan

Versions

Ver.1 2025-01-19 15:53:46.720831


Powered by WEKO3

