| Item type | SIG Technical Reports(1) |
| Publication date | 2022-06-20 |
| Title | Transformer based model for Point Process with past sequence-representative vector |
| Title language | en |
| Language | eng |
| Resource type | technical report |
| Resource type identifier | http://purl.org/coar/resource_type/c_18gh |
| Author affiliation | Graduate School of System Engineering, Wakayama University |
| Author affiliation | Graduate School of System Engineering, Wakayama University |
| Author affiliation | Graduate School of System Engineering, Wakayama University |
| Authors | Fumiya Nishizawa; Sujun Hong; Hirotaka Hachiya |
| Abstract | Recently, a Transformer-based partially trainable point process has been proposed, in which a feature vector is extracted from the past event sequence to predict the next event. However, the feature's strong dependence on the last event and the limitations of the hand-designed hazard function can degrade performance. To overcome these problems, we propose a Transformer-based fully trainable point process, in which multiple trainable vectors are embedded into the past event sequence and transformed through an attention mechanism to realize adaptive and general approximation and prediction. We show the effectiveness of the proposed method through experiments on two datasets. |
| Bibliographic record ID | NCID AA12055912 |
| Bibliographic information | IPSJ SIG Technical Report: Bioinformatics (BIO), Vol. 2022-BIO-70, No. 2, pp. 1-5, issued 2022-06-20 |
| ISSN | 2188-8590 |
| Notice | SIG Technical Reports are nonrefereed and hence may later appear in any journals, conferences, symposia, etc. |
| Publisher | Information Processing Society of Japan (情報処理学会) |
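
The abstract describes the proposed architecture only at a high level. Below is a minimal PyTorch sketch of the core idea as stated there: trainable vectors are embedded into (here, prepended to) the past event sequence, transformed together with the event embeddings by self-attention, and pooled into a history representation from which an intensity is predicted. This is not the authors' implementation; every name and hyperparameter (`RepresentativePointProcessEncoder`, `d_model`, `n_rep`, the Softplus intensity head, encoding events by inter-arrival time alone) is an illustrative assumption.

```python
# A minimal sketch (not the authors' implementation) of the idea in the
# abstract: prepend trainable "sequence-representative" vectors to the
# embedded past event sequence and let self-attention mix them, so the
# history summary does not depend on the last event's embedding alone.
import torch
import torch.nn as nn

class RepresentativePointProcessEncoder(nn.Module):
    def __init__(self, d_model=64, n_heads=4, n_layers=2, n_rep=4):
        super().__init__()
        # Hypothetical choice: embed each event's inter-arrival time with a
        # linear layer; event marks/types would need their own embedding.
        self.time_embed = nn.Linear(1, d_model)
        # Trainable representative vectors embedded into the sequence.
        self.rep_tokens = nn.Parameter(torch.randn(n_rep, d_model))
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        # Hypothetical head mapping the pooled history to a positive intensity.
        self.intensity_head = nn.Sequential(nn.Linear(d_model, 1), nn.Softplus())

    def forward(self, inter_arrival_times):
        # inter_arrival_times: (batch, seq_len) gaps between past events.
        x = self.time_embed(inter_arrival_times.unsqueeze(-1))   # (B, L, D)
        rep = self.rep_tokens.expand(x.size(0), -1, -1)          # (B, R, D)
        h = self.encoder(torch.cat([rep, x], dim=1))             # (B, R+L, D)
        # Pool the transformed representative tokens into one history vector.
        history = h[:, : rep.size(1)].mean(dim=1)                # (B, D)
        return self.intensity_head(history)                      # (B, 1)

# Usage: predict a positive intensity from a batch of 30-event histories.
model = RepresentativePointProcessEncoder()
lam = model(torch.rand(8, 30))
print(lam.shape)  # torch.Size([8, 1])
```

Under these assumptions, attention aggregates the whole history into the pooled representative tokens rather than relying on the final event's embedding, which is the dependence the abstract identifies as a weakness of the partially trainable model; the Softplus head stands in for the hand-designed hazard function the abstract says the fully trainable model replaces.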