Abstract/Keywords
Presentation title
2022-06-27 14:00
Transformer-Based Fully Trainable Model for Point Process with Past Sequence-Representative Vector ○Fumiya Nishizawa・Sujun Hong・Hirotaka Hachiya(Graduate School of System Engineering, Wakayama University) NC2022-1 IBISML2022-1 |
Abstract
(Japanese)
(not yet registered)
(English)
Recently, a Transformer-based partially trainable point process has been proposed, in which a feature vector is extracted from the past event sequence to predict future events. However, the feature's strong dependence on the last event and the limitations of a hand-designed hazard function can degrade performance. To overcome these problems, we propose a Transformer-based fully trainable point process, in which multiple trainable vectors are embedded into the past event sequence and transformed through an attention mechanism to realize adaptive and general approximation and prediction. We show the effectiveness of the proposed method through experiments on two datasets.
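The mechanism described in the abstract can be illustrated with a minimal sketch (not the authors' code): a few trainable "representative" vectors are appended to the embedded past event sequence and attend over it to summarize the history. All dimensions, variable names, and the single-head attention are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 8          # embedding dimension (assumption)
n_events = 5   # length of the past event sequence
n_repr = 2     # number of trainable representative vectors (assumption)

# Embedded past events (in practice: learned embeddings of times/marks).
events = rng.normal(size=(n_events, d))
# Trainable past-sequence-representative vectors, embedded into the sequence.
repr_vecs = rng.normal(size=(n_repr, d))
seq = np.concatenate([events, repr_vecs], axis=0)  # (n_events + n_repr, d)

def attention(q, k, v):
    """Scaled dot-product attention with a numerically stable softmax."""
    scores = q @ k.T / np.sqrt(q.shape[-1])
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ v

# The representative vectors attend over the whole sequence; the result is a
# history summary that a fully trainable model could feed into a learned
# (rather than hand-designed) hazard function.
summary = attention(repr_vecs, seq, seq)
print(summary.shape)  # (2, 8)
```

In a trainable model, `repr_vecs` would be learned parameters and `attention` a full multi-head Transformer layer; this sketch only shows the data flow.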
Keywords
(Japanese)
point process / Transformer / seismic data / Hawkes process
(English)
point process / Transformer / seismic event / Hawkes process
Bibliographic information
IEICE Technical Report, vol. 122, no. 90, IBISML2022-1, pp. 1-5, June 2022.
Report number
IBISML2022-1 |
Publication date
2022-06-20 (NC, IBISML) |
ISSN |
Online edition: ISSN 2432-6380 |
Copyright
The copyright of papers published in the Technical Reports belongs to IEICE. (License numbers: 10GA0019/12GB0052/13GB0056/17GB0034/18GB0034)
PDF download
NC2022-1 IBISML2022-1 |
|