Theoretical analysis of learning speed in gradient descent algorithm replacing derivative with constant
https://ipsj.ixsq.nii.ac.jp/records/87221
File: Open Access
License: Copyright (c) 2012 by the Information Processing Society of Japan
Item type: SIG Technical Reports (1)
Release date: 2012-11-29
Title (en): Theoretical analysis of learning speed in gradient descent algorithm replacing derivative with constant
Language: eng
Resource type: technical report
Resource type identifier: http://purl.org/coar/resource_type/c_18gh
Author affiliation: College of Industrial Technology, Nihon University
Author affiliation: Center for Evolutionary Cognitive Sciences, The University of Tokyo / Brain Science Institute, RIKEN
Author: Kazuyuki Hara
Abstract: In on-line gradient descent learning, the local property of the derivative term of the output can slow convergence. Improving the derivative term, such as by using the natural gradient, has been proposed for speeding up convergence. Apart from such sophisticated methods, in this paper we propose an algorithm that replaces the derivative term with a constant, and we show that this greatly increases the convergence speed under some conditions. The proposed algorithm is inspired by linear perceptron learning, and it avoids the locality of the derivative term. We derive closed deterministic differential equations using a statistical mechanics method and show the validity of the analytical solutions by comparing them with computer simulations.
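For concreteness, here is a minimal sketch of the scheme the abstract describes, not the authors' implementation: a teacher-student perceptron trained by on-line gradient descent, with the standard derivative-scaled update shown in a comment and the constant-scaled update actually applied. The tanh output function, the learning rate `eta`, the constant `c`, and all scalings are illustrative assumptions.

```python
import numpy as np

# Sketch (assumed setup, not the paper's code): teacher-student perceptron
# with output g(u) = tanh(u), trained on-line on fresh random examples.
N = 1000                                   # input dimension (assumed)
eta = 0.1                                  # learning rate (assumed)
c = 1.0                                    # constant replacing g'(u) (assumed)
rng = np.random.default_rng(0)

B = rng.standard_normal(N)                 # teacher weights
B *= np.sqrt(N) / np.linalg.norm(B)        # normalize so B.x/sqrt(N) is O(1)
J = 0.1 * rng.standard_normal(N)           # student weights, small random start

g = np.tanh                                # output function (assumed sigmoid)

def g_prime(u):
    # Derivative of tanh; nearly zero for large |u| (the "local property").
    return 1.0 - np.tanh(u) ** 2

for step in range(20000):                  # on-line: one fresh example per step
    x = rng.standard_normal(N)
    u_t = B @ x / np.sqrt(N)               # teacher local field
    u_s = J @ x / np.sqrt(N)               # student local field
    err = g(u_t) - g(u_s)

    # Standard gradient descent on (1/2)*err**2 carries g'(u_s), which is
    # nearly zero once |u_s| is large, so learning stalls:
    #   J += (eta / np.sqrt(N)) * err * g_prime(u_s) * x
    # Proposed rule: replace the derivative with the constant c, as in
    # linear perceptron learning:
    J += (eta / np.sqrt(N)) * err * c * x

# Direction cosine between student and teacher; approaches 1 at convergence.
R = (J @ B) / (np.linalg.norm(J) * np.linalg.norm(B))
print(f"overlap R = {R:.3f}")
```

Because g'(u) vanishes when the student's local field saturates, the commented standard update can stall, while the constant c keeps the step size bounded away from zero; this is the speed-up mechanism the abstract refers to.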
Bibliographic record ID (NCID): AA12055912
Bibliographic information: IPSJ SIG Technical Report on Bioinformatics (BIO), Vol. 2012-BIO-32, No. 28, pp. 1-6, issue date 2012-11-29
Notice: SIG Technical Reports are nonrefereed and hence may later appear in any journals, conferences, symposia, etc.
Publisher: 情報処理学会 (Information Processing Society of Japan)