
A Kronecker product accelerated efficient sparse Gaussian Process (E-SGP) for flow emulation

File: 1-s2.0-S0021999124007083-main.pdf (Published version, 4.13 MB, Adobe PDF)
Title: A Kronecker product accelerated efficient sparse Gaussian Process (E-SGP) for flow emulation
Authors: Duan, Y
Eaton, M
Bluck, M
Item Type: Journal Article
Abstract: In this paper, we introduce an efficient sparse Gaussian process (E-SGP) for the surrogate modelling of fluid mechanics. This novel Bayesian machine learning algorithm enables efficient model training using databases of different structures. It extends the approximated sparse GP algorithm by combining the concepts of the efficient GP (E-GP) and the variational energy free sparse Gaussian process (VEF-SGP). The developed E-SGP approach leverages the arbitrariness of the inducing points and the monotonically increasing nature of the VEF-SGP objective function with respect to the number of inducing points. By placing the inducing points on an orthogonal grid in the input subspace and exploiting the Kronecker product, E-SGP significantly enhances computational efficiency without imposing constraints on the covariance matrix or increasing the number of parameters to be optimised during training. The E-SGP algorithm outperforms E-GP not only in scalability but also in model quality, as evidenced by the mean standardised logarithmic loss (MSLL). The computational complexity of E-GP grows cubically as the structured training database grows, whereas E-SGP maintains its efficiency as long as the model resolution (i.e., the number of inducing points) remains the same. The examples show that E-SGP produces more accurate predictions than E-GP when the model resolutions are similar. In the application to a partially structured database, E-GP benefits from more training data but incurs higher computational demands, while E-SGP achieves a comparable level of accuracy at lower computational cost, making E-SGP a potentially preferable choice for fluid mechanics. Furthermore, E-SGP produces more reasonable estimates of model uncertainty, whilst E-GP is more likely to produce over-confident predictions. For partially structured databases, the performance of E-SGP is also compared with that of the structured Gaussian process latent variable model (SGPLVM). In this case, both E-SGP and SGPLVM exhibit robust performance, but E-SGP demonstrates a slight advantage over SGPLVM, particularly in terms of MSLL, indicating that E-SGP is better at producing trustworthy and accurate estimates of the uncertainty in its predictions. While SGPLVM also improves computational efficiency by exploiting the Kronecker product and a small number of inducing points, E-SGP's orthogonal-grid specification of the inducing points further enhances its scalability and robustness, making it a superior choice for complex fluid mechanics problems with partially structured data.
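The abstract's central computational claim is that placing the inducing points on an orthogonal grid lets the inducing covariance factorise as a Kronecker product, so the expensive linear algebra only ever touches small per-dimension matrices. The sketch below is not the authors' E-SGP implementation; it is a minimal NumPy illustration of that Kronecker structure for a hypothetical two-dimensional grid of inducing points, using an assumed squared-exponential kernel and a small jitter term.

```python
import numpy as np

# Minimal sketch (not the authors' E-SGP code): with inducing points on an
# orthogonal grid, the inducing covariance K_uu factorises as a Kronecker
# product of small per-dimension kernel matrices.  Eigendecomposing the small
# factors (cost ~ m1^3 + m2^3) replaces dense operations on the full
# (m1*m2) x (m1*m2) matrix (cost ~ (m1*m2)^3).

def rbf(x, y, lengthscale, variance=1.0):
    """Squared-exponential kernel between two sets of 1-D inputs."""
    d = x[:, None] - y[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

m1, m2 = 30, 40                              # hypothetical grid resolution
z1 = np.linspace(0.0, 1.0, m1)               # inducing coordinates, dimension 1
z2 = np.linspace(0.0, 1.0, m2)               # inducing coordinates, dimension 2

K1 = rbf(z1, z1, lengthscale=0.2)            # (m1, m1) factor
K2 = rbf(z2, z2, lengthscale=0.3)            # (m2, m2) factor
jitter = 1e-4                                # nugget keeping the solve well conditioned

# Eigendecomposition of K_uu + jitter*I from the factors alone:
# kron(K1, K2) = kron(Q1, Q2) @ diag(kron(w1, w2)) @ kron(Q1, Q2).T
w1, Q1 = np.linalg.eigh(K1)
w2, Q2 = np.linalg.eigh(K2)
eigvals = np.outer(w1, w2) + jitter          # (m1, m2) grid of eigenvalues

# Solve (K_uu + jitter*I) x = b without ever forming the full matrix.
b = np.random.default_rng(0).normal(size=m1 * m2)
B = b.reshape(m1, m2)                        # row-major reshape matches kron ordering
X = Q1 @ ((Q1.T @ B @ Q2) / eigvals) @ Q2.T  # rotate, scale, rotate back
x = X.ravel()

# Verify against the explicitly assembled matrix (affordable only because m is tiny here).
K_uu = np.kron(K1, K2) + jitter * np.eye(m1 * m2)
print("relative residual:", np.linalg.norm(K_uu @ x - b) / np.linalg.norm(b))
```

Under these illustrative assumptions, every solve is governed by the small factors (here 30x30 and 40x40) rather than the full 1200x1200 inducing covariance, which is the kind of scaling behaviour the abstract attributes to E-SGP.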
Issue Date: 1-Jan-2025
Date of Acceptance: 26-Sep-2024
URI: http://hdl.handle.net/10044/1/115396
DOI: 10.1016/j.jcp.2024.113460
ISSN: 0021-9991
Publisher: Elsevier
Journal / Book Title: Journal of Computational Physics
Volume: 520
Copyright Statement: © 2024 The Authors. Published by Elsevier Inc. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
Publication Status: Published
Article Number: 113460
Online Publication Date: 2024-09-27
Appears in Collections: Mechanical Engineering
Faculty of Engineering



This item is licensed under a Creative Commons License.