A survey of sparse structures in the multi-layer perceptron of large language models

https://ipsj.ixsq.nii.ac.jp/records/241718
Name / File: IPSJ-HPC24197025.pdf (1.0 MB)
Available for download from December 9, 2026.
Copyright (c) 2024 by the Information Processing Society of Japan
Price: Non-members ¥660; IPSJ members ¥330; HPC SIG members ¥0; DLIB subscribers ¥0
Item type: SIG Technical Reports(1)
Release date: 2024-12-09
Title (en): A survey of sparse structures in the multi-layer perceptron of large language models
Language: eng
Keyword (subject scheme: Other): Power saving (省電力)
Resource type: technical report (http://purl.org/coar/resource_type/c_18gh)
Author affiliation: Fujitsu Ltd.
Author affiliation: University of Kyushu
Author affiliation: Fujitsu Ltd.
Author affiliation: Fujitsu Ltd.
Author: Sameer, Deshmukh
Author: Mingchuan, Lyu
Author: Hiroki, Tokura
Author: Takumi, Honda
Abstract (description type: Other): Large language models using the transformer architecture require massive computational resources for training to acceptable levels of accuracy. Recent advances have shown that the MLP layers within such models can be pruned to up to 90% sparsity to reduce the computational requirement of training and inference. However, achieving high performance for the sparse matrix multiplication remains a challenge on GPUs. Several approaches have been suggested for improving the performance of sparse matrix multiplication using structured sparsity. In this paper, we first survey and benchmark some of the sparsity structures that can be applied to dense matrices, and then examine the training loss curves of a 162M Mistral model using various structures of sparsity. Our results show promising future directions for research in improving the training time of transformers using sparsity.
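
The structured-sparsity approaches referenced in the abstract can be illustrated with a small example. The sketch below applies magnitude-based 2:4 (N:M) pruning to a dense MLP weight matrix in PyTorch, keeping only the two largest-magnitude entries in every group of four consecutive weights; the 2:4 pattern, the function name prune_2_4, and the use of PyTorch are illustrative assumptions and are not taken from the report itself.

import torch

def prune_2_4(weight: torch.Tensor) -> torch.Tensor:
    # Keep the 2 largest-magnitude values in every group of 4 along the last
    # dimension; an illustrative magnitude-based 2:4 structured-sparsity pattern.
    rows, cols = weight.shape
    assert cols % 4 == 0, "in_features must be divisible by 4 for a 2:4 pattern"
    groups = weight.reshape(rows, cols // 4, 4)
    keep = groups.abs().topk(2, dim=-1).indices   # positions that survive in each group
    mask = torch.zeros_like(groups, dtype=torch.bool)
    mask.scatter_(-1, keep, True)                 # True where weights are kept
    return (groups * mask).reshape(rows, cols)

# Example: prune a dense MLP weight to 50% structured sparsity
w = torch.randn(8, 16)
w_sparse = prune_2_4(w)
print((w_sparse == 0).float().mean().item())      # roughly 0.5

A 2:4 layout is one of the simpler structured patterns; it maps directly onto the sparse tensor cores of recent NVIDIA GPUs, which is one reason structured sparsity is attractive for accelerating the sparse matrix multiplications discussed above.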
Bibliographic record ID (NCID): AN10463942
Bibliographic information: 研究報告ハイパフォーマンスコンピューティング(HPC) (IPSJ SIG Technical Reports, HPC)
Vol. 2024-HPC-197, No. 25, pp. 1-6, issued 2024-12-09
ISSN: 2188-8841
Notice: SIG Technical Reports are nonrefereed and hence may later appear in any journals, conferences, symposia, etc.
Publisher: 情報処理学会 (Information Processing Society of Japan)