Paper Abstract and Keywords
Presentation: 2022-06-27 14:00
Title: Transformer-Based Fully Trainable Model for Point Process with Past Sequence-Representative Vector
Authors: Fumiya Nishizawa, Sujun Hong, Hirotaka Hachiya (Graduate School of System Engineering, Wakayama University)
Report No.: NC2022-1, IBISML2022-1
Abstract |
Recently, a Transformer-based partially trainable point process has been proposed, in which a feature vector is extracted from the past event sequence to predict future events. However, the feature's strong dependence on the last event and the limitations of a hand-designed hazard function can degrade prediction performance. To overcome these problems, we propose a Transformer-based fully trainable point process, in which multiple trainable vectors are embedded into the past event sequence and transformed through an attention mechanism to realize adaptive and general approximation and prediction. We show the effectiveness of the proposed method through experiments on two datasets.
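The abstract describes appending multiple trainable vectors to the embedded past event sequence and mixing them with the events via self-attention, so that the transformed vectors summarize the whole history. A minimal NumPy sketch of that idea is below; all dimensions, the single-head attention, and the names (`E`, `R`, `n_rep`) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d = Q.shape[-1]
    A = softmax(Q @ K.T / np.sqrt(d))   # each row attends over all positions
    return A @ V

d_model = 16
n_events = 10   # length of the observed past event sequence
n_rep = 4       # hypothetical number of trainable representative vectors

# Embedded past events (in practice: embeddings of event times and marks)
E = rng.standard_normal((n_events, d_model))
# Representative vectors (random here; learned parameters in training)
R = rng.standard_normal((n_rep, d_model))

# Append the representative vectors to the event sequence ...
X = np.concatenate([E, R], axis=0)      # shape (n_events + n_rep, d_model)

# ... and transform everything jointly through attention
Wq, Wk, Wv = (rng.standard_normal((d_model, d_model)) * d_model**-0.5
              for _ in range(3))
H = self_attention(X, Wq, Wk, Wv)

# The transformed representative rows now summarize the whole past sequence
summary = H[n_events:]                  # shape (n_rep, d_model)
```

In the paper's fully trainable setting, such a summary would feed a learned intensity (hazard) head rather than a hand-designed Hawkes-style hazard function.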
Keyword |
point process / Transformer / seismic event / Hawkes process
Reference Info. |
IEICE Tech. Rep., vol. 122, no. 90, IBISML2022-1, pp. 1-5, June 2022. |
Paper # |
IBISML2022-1 |
Date of Issue |
2022-06-20 (NC, IBISML) |
ISSN |
Online edition: ISSN 2432-6380 |
Copyright and reproduction |
All rights are reserved and no part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopy, recording, or any information storage and retrieval system, without permission in writing from the publisher. Notwithstanding, instructors are permitted to photocopy isolated articles for noncommercial classroom use without fee. (License No.: 10GA0019/12GB0052/13GB0056/17GB0034/18GB0034) |