Full metadata record
dc.identifier.citation: Cryptographic Hardware and Embedded Systems - CHES, Sep 2003
dc.description.abstract: Many examples exist of multivariate time series in which the dependencies between variables change over time. If these changing dependencies are not taken into account, any model learnt from the data will average over the different dependency structures. Paradigms that try to explain underlying processes and observed events in multivariate time series must model these changes explicitly so that non-experts can analyse and understand such data. In this paper we develop a method for generating explanations in multivariate time series that takes changing dependency structure into account. We make use of a dynamic Bayesian network model with hidden nodes. We introduce a representation and search technique for learning such models from data and test it on synthetic time series and real-world data from an oil refinery, both of which contain changing underlying structure. We compare our method to an existing EM-based method for learning structure. Results for our method are very promising, and we include sample explanations generated from models learnt from the refinery dataset.
dc.title: Learning dynamic Bayesian networks from multivariate time series with changing dependencies
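The abstract describes a dynamic Bayesian network in which a hidden node governs which dependency structure generates the observations. A minimal generative sketch of that idea, not the authors' implementation, might look like the following; the two-regime setup, the transition matrix, and all parameter values are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical dependency structures over variables (x, y).
# Regime 0: x depends on y's previous value; regime 1: that edge is absent.
A = {
    0: np.array([[0.5, 0.4],    # x_t = 0.5*x_{t-1} + 0.4*y_{t-1}
                 [0.0, 0.9]]),  # y_t = 0.9*y_{t-1}
    1: np.array([[0.9, 0.0],    # x_t = 0.9*x_{t-1} (dependency on y removed)
                 [0.0, 0.9]]),
}

# Transition matrix for the hidden regime node: structure changes are rare.
P = np.array([[0.98, 0.02],
              [0.02, 0.98]])

def sample(T=200):
    """Sample a hidden regime sequence h and observations z = (x, y)."""
    h = np.zeros(T, dtype=int)
    z = np.zeros((T, 2))
    for t in range(1, T):
        h[t] = rng.choice(2, p=P[h[t - 1]])              # hidden node switches structure
        z[t] = A[h[t]] @ z[t - 1] + rng.normal(scale=0.1, size=2)
    return h, z

h, z = sample()
```

A learner in the spirit of the paper would be given only `z` and would have to recover both the regime sequence `h` and the per-regime dependency structure; fitting a single stationary model to `z` would instead average over the two structures.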
Appears in Collections: Computer Science; Dept of Computer Science Research Papers
Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.