
FastVPINNs: Tensor-Driven Acceleration of VPINNs for Complex Geometries

Preprint available (under review in SIAM SISC).

GitHub · arXiv

Project Abstract

Variational Physics-Informed Neural Networks (VPINNs) solve partial differential equations with a variational loss function, mirroring Finite Element Analysis techniques. Traditional hp-VPINNs, while effective for high-frequency problems, are computationally intensive and scale poorly with increasing element counts, limiting their use on complex geometries. This work introduces FastVPINNs, a tensor-based advancement that significantly reduces computational overhead and improves scalability. Using optimized tensor operations, FastVPINNs achieve a 100-fold reduction in median training time per epoch compared to traditional hp-VPINNs. With a proper choice of hyperparameters, FastVPINNs surpass conventional PINNs in both speed and accuracy, especially on problems with high-frequency solutions. Demonstrated effectiveness in solving inverse problems on complex domains underscores FastVPINNs' potential for widespread application to scientific and engineering challenges, opening new avenues for practical implementations in scientific machine learning.

Key Features

Methodology

Project Image 1

Fig: Tensor schematic of the loss computation in FastVPINNs
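The core idea sketched in the figure is that the per-element, per-test-function variational residuals can be computed as one batched tensor contraction instead of nested loops over elements and test functions. The snippet below is a minimal illustrative sketch of that pattern using NumPy's `einsum`; all array names and shapes are hypothetical, not the actual FastVPINNs implementation.

```python
import numpy as np

# Hypothetical sizes, for illustration only
n_elem, n_test, n_quad = 4, 5, 16
rng = np.random.default_rng(0)

# Precomputed test-function gradients at quadrature points, stacked
# into one tensor: (elements, test functions, quadrature points)
grad_v = rng.standard_normal((n_elem, n_test, n_quad))

# Quadrature weights scaled by element Jacobians: (elements, quad points)
w = rng.random((n_elem, n_quad))

# Network-predicted solution gradient at quadrature points,
# evaluated for all elements at once: (elements, quad points)
grad_u = rng.standard_normal((n_elem, n_quad))

# One tensor contraction replaces the loop over elements and test
# functions: residual[e, i] = sum_q grad_v[e, i, q] * w[e, q] * grad_u[e, q]
residual = np.einsum('eiq,eq,eq->ei', grad_v, w, grad_u)

# Variational loss: mean squared residual over all elements and test functions
loss = np.mean(residual ** 2)
print(residual.shape)  # (4, 5)
```

On GPU frameworks the same contraction maps to a single batched matrix operation, which is where the reported reduction in per-epoch training time comes from.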

Results

Project Image 2

Fig: (a) Time taken by PINNs and FastVPINNs to reach an MAE of 5×10⁻² at different solution frequencies. (b) MAE of PINNs and FastVPINNs at different frequencies.

Project Image 3

Fig: (a) Predicted solution by FastVPINNs after 20 minutes of training on a 2 × 2 domain. (b) Pointwise absolute error of FastVPINNs. (c) Predicted solution by hp-VPINNs after 20 minutes of training on an 8 × 8 domain. (d) Pointwise absolute error of hp-VPINNs.
