
Modelling event sequence data by type-wise neural point process

Published in: Data Mining and Knowledge Discovery

Abstract

Event sequence data are ubiquitous in real life, where each event is typically represented as a tuple of event type and occurrence time. Recently, the neural point process (NPP), a probabilistic model that learns the distribution of the next event given the event history, has attracted considerable attention for event sequence modelling. Existing NPP models use a single vector to encode the whole event history. However, each event type has its own historical events of concern, which calls for a type-specific encoding of the history. To this end, we propose the Type-wise Neural Point Process (TNPP), in which each event type has its own history vector encoding the historical events of its interest. Type-wise encoding further enables type-wise decoding, and together they make a more effective neural point process. Experimental results on six datasets show that TNPP outperforms existing models on the event type prediction task under both the extrapolation and interpolation settings. Moreover, results on scalability and interpretability show that TNPP scales well to datasets with many event types and provides high-quality event dependencies for interpretation. The code and data can be found at https://github.com/lbq8942/TNPP.
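
To make the type-wise idea concrete, the following is a minimal PyTorch-style sketch of type-wise encoding and decoding. It is an illustration only, not the authors' implementation (which is in the linked repository): the shared GRU cell, the embedding size, and the softplus intensity read-out are all assumptions made for the sketch.

import torch
import torch.nn as nn

class TypeWiseNPP(nn.Module):
    """Toy type-wise NPP: every event type keeps its own history vector."""

    def __init__(self, num_types, dim=32):
        super().__init__()
        self.num_types = num_types
        self.embed = nn.Embedding(num_types, dim)   # event-type embeddings
        # One recurrent cell shared across types; each type carries its own
        # hidden state, so the history is encoded separately per type.
        self.cell = nn.GRUCell(dim + 1, dim)
        self.decode = nn.Linear(dim, 1)             # type-wise intensity read-out

    def forward(self, types, times):
        # types: (seq_len,) int64 event types; times: (seq_len,) float times.
        # Returns per-type intensities after each event: (seq_len, num_types).
        h = torch.zeros(self.num_types, self.embed.embedding_dim)
        prev_t, out = 0.0, []
        for k, t in zip(types.tolist(), times.tolist()):
            dt = torch.full((self.num_types, 1), t - prev_t)
            ev = self.embed.weight[k].expand(self.num_types, -1)
            h = self.cell(torch.cat([ev, dt], dim=-1), h)  # type-wise encoding
            out.append(nn.functional.softplus(self.decode(h)).squeeze(-1))
            prev_t = t
        return torch.stack(out)

model = TypeWiseNPP(num_types=3)
intensities = model(torch.tensor([0, 2, 1]), torch.tensor([0.5, 1.2, 2.0]))

A real model would also need a time decoder and likelihood-based training; the point here is only that both encoding and decoding happen per event type rather than through one shared history vector.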


Notes

  1. https://x-datainitiative.github.io/tick/modules/hawkes.html.
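
The footnote points to the Hawkes module of the tick library. As a pointer for readers unfamiliar with it, here is a minimal simulation of a two-type Hawkes process using tick's SimuHawkesExpKernels; the parameter values are illustrative and are not taken from the paper's experiments.

import numpy as np
from tick.hawkes import SimuHawkesExpKernels

# Illustrative parameters only; the process is stable because the
# adjacency matrix has spectral radius < 1.
adjacency = np.array([[0.2, 0.1],
                      [0.0, 0.3]])      # excitation between the two event types
decays = 2.0 * np.ones((2, 2))          # exponential kernel decay rates
baseline = np.array([0.5, 0.4])        # background intensities

sim = SimuHawkesExpKernels(adjacency=adjacency, decays=decays,
                           baseline=baseline, end_time=100.0,
                           verbose=False, seed=42)
sim.simulate()
print([len(ts) for ts in sim.timestamps])  # event count per type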


Acknowledgements

The authors would like to thank Prof. Songmao Zhang from the Academy of Mathematics and Systems Science (CAS) for helpful discussions on the ideas and experiments at the initial stage of this work. We also thank the editors and reviewers for their valuable feedback, which improved this paper.

Author information


Corresponding author

Correspondence to Bingqing Liu.

Additional information

Responsible editor: Eyke Hüllermeier.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Liu, B. Modelling event sequence data by type-wise neural point process. Data Min Knowl Disc 38, 3449–3472 (2024). https://doi.org/10.1007/s10618-024-01047-6

