
Difficulty of detecting overstated dataset size in Federated Learning

https://ipsj.ixsq.nii.ac.jp/records/214328
Name / File: IPSJ-DPS21189010.pdf (1.1 MB)
License: Copyright (c) 2021 by the Information Processing Society of Japan
Access: Open Access
Item type: SIG Technical Reports(1)
Release date: 2021-12-13
Title (en): Difficulty of detecting overstated dataset size in Federated Learning
Language: eng
Keywords: Machine learning and prediction models (subject scheme: Other)
Resource type: technical report (http://purl.org/coar/resource_type/c_18gh)
Author affiliations: The University of Tokyo; Nara Institute of Science and Technology; Nara Institute of Science and Technology
Authors: Hideaki Takahashi; Kohei Ichikawa; Keichi Takahashi
Abstract: Federated learning is a distributed learning method in which multiple clients cooperate to train a model. Each client sends the gradient of its locally trained model to the server, and the server aggregates the received gradients to build a global model. Since federated learning requires many clients to train a high-performance model, researchers have designed incentive mechanisms that distribute rewards to clients to motivate their participation. While most incentive mechanisms distribute rewards according to each client's contribution, often defined by the size of its dataset, little research has examined the risk that clients claim more rewards by overstating that size. This paper proposes three possible methods for exaggerating the size of a local dataset: simple exaggeration of the reported number, modification of the batch size during training, and inflation of the dataset through data augmentation. Using a variety of models and datasets, we show that current anomaly detection methods are inadequate for identifying such exaggerations.
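
For illustration, here is a minimal sketch (not taken from the paper) of the incentive setting the abstract describes: a FedAvg-style server that weights client gradients and splits a reward pool in proportion to each client's self-reported dataset size. All function and variable names are hypothetical.

import numpy as np

def aggregate_and_reward(gradients, reported_sizes, reward_pool=100.0):
    """Weight gradients and split rewards by the dataset sizes clients claim."""
    sizes = np.asarray(reported_sizes, dtype=float)
    weights = sizes / sizes.sum()          # the server trusts the reports as-is
    global_grad = sum(w * g for w, g in zip(weights, gradients))
    rewards = reward_pool * weights        # reward proportional to claimed size
    return global_grad, rewards

# Two honest clients with 100 samples each, and one client that simply
# exaggerates its reported count to 1000 (the first attack in the abstract).
grads = [np.ones(4), np.ones(4), np.ones(4)]
_, rewards = aggregate_and_reward(grads, reported_sizes=[100, 100, 1000])
print(np.round(rewards, 2))  # [ 8.33  8.33 83.33]

Because the exaggerating client's gradient can still look statistically ordinary (for example, when the batch size is modified or the data is augmented to match the claimed size), the server has little signal for flagging the inflated report, which is the detection difficulty the paper studies.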
Bibliographic record ID (NCID): AN10116224
Bibliographic information: IPSJ SIG Technical Reports: Multimedia Communication and Distributed Processing (DPS) (研究報告マルチメディア通信と分散処理(DPS)), Vol. 2021-DPS-189, No. 10, pp. 1-6, issued 2021-12-13
ISSN: 2188-8906
Notice: SIG Technical Reports are non-refereed and may therefore later appear in journals, conference proceedings, symposia, etc.
Publisher: Information Processing Society of Japan (情報処理学会)