Please use this identifier to cite or link to this item: http://buratest.brunel.ac.uk/handle/2438/10911
Title: A spatio-temporal Bayesian network approach for revealing functional ecological networks in fisheries
Authors: Duplisea, D
Kenny, A
Tucker, A
Keywords: Ecological networks;Bayesian networks;Spatial node
Issue Date: 2014
Publisher: Springer International Publishing
Citation: Advances in Intelligent Data Analysis XIII, Lecture Notes in Computer Science, 8819: 298-308, (2014)
Abstract: Ecosystems consist of complex dynamic interactions among species and the environment, the understanding of which has implications for predicting the environmental response to changes in climate and biodiversity. Machine learning techniques can allow such complex, spatially varying interactions to be recovered from collected field data. In this study, we apply structure learning techniques to identify functional relationships between trophic groups of species that vary across space and time. Specifically, Bayesian networks are created on a window of data for each of the 20 geographically different and temporally varied sub-regions within an oceanic area. In addition, we explore the spatial and temporal variation of pre-defined functional relationships (such as predation and competition) that can be generalised from expert knowledge. We were able to discover meaningful ecological networks that were spatially rather than temporally specific, as previously suggested for this region. To validate the discovered networks, we predict the biomass of the trophic groups using dynamic Bayesian networks, correcting for spatial autocorrelation by including a spatial node in our models.
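Note: The abstract describes learning a separate Bayesian network structure for each spatial sub-region and temporal window of data. The following is a minimal illustrative sketch of that kind of windowed structure learning, not the authors' code. It assumes a pandas DataFrame of discretised trophic-group biomasses with hypothetical column names ("plankton", "benthos", "pelagic_fish", "demersal_fish") plus "region" and "year" identifiers, a window length of 10 years (not taken from the paper), and a pgmpy-style API (HillClimbSearch with a BIC score).

```python
# Illustrative sketch (not the authors' code): score-based structure learning
# of one Bayesian network per (sub-region, time window). Biomass values are
# assumed to be pre-discretised (e.g. low/medium/high bins), as required by
# the discrete BIC score.
import pandas as pd
from pgmpy.estimators import HillClimbSearch, BicScore

TROPHIC_GROUPS = ["plankton", "benthos", "pelagic_fish", "demersal_fish"]  # hypothetical
WINDOW = 10  # years per temporal window (assumed, not taken from the paper)

def learn_windowed_networks(df: pd.DataFrame):
    """Return {(region, first_year_of_window): list of directed edges}."""
    networks = {}
    for region, region_df in df.groupby("region"):
        years = sorted(region_df["year"].unique())
        for start in range(0, len(years) - WINDOW + 1, WINDOW):
            window_years = years[start:start + WINDOW]
            data = region_df.loc[region_df["year"].isin(window_years),
                                 TROPHIC_GROUPS]
            # Greedy hill climbing over DAGs, scored by BIC
            dag = HillClimbSearch(data).estimate(scoring_method=BicScore(data))
            networks[(region, window_years[0])] = list(dag.edges())
    return networks
```

Comparing the recovered edge sets across windows for one region against the edge sets across regions for one window is one way to examine whether structure varies more spatially than temporally, as the abstract reports; the validation step with dynamic Bayesian networks and an explicit spatial node would be built on top of such learned structures.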
URI: http://link.springer.com/chapter/10.1007%2F978-3-319-12571-8_26
http://bura.brunel.ac.uk/handle/2438/10911
DOI: http://dx.doi.org/10.1007/978-3-319-12571-8_26
ISSN: 0302-9743
Appears in Collections:Dept of Life Sciences Research Papers

Files in This Item:
Fulltext.pdf (947.57 kB, Adobe PDF)