Relative Residual Resampling: An Adaptive Residual-Based Sampling Method for Training PINNs

Abstract

Physics-Informed Neural Networks (PINNs) offer a promising approach to solving partial differential equations (PDEs). In PINNs, physical laws are incorporated into the loss function, guiding the network toward a solution that satisfies the governing equations. Training PINNs involves three types of points: collocation points within the domain of the PDE, boundary points on the edges of the domain, and initial points at the starting time (t = 0). A common challenge in training PINNs is the imbalance between the number of boundary and initial points and the number of collocation points, which can negatively impact training. Typically, the number of each type of point is set manually, with collocation points usually outnumbering boundary and initial points by roughly a factor of ten, leading to an uneven loss distribution over time. Our work introduces a method called Relative Residual Resampling (R3) to address this issue. The method dynamically adjusts the number of each type of training point to achieve a more balanced distribution, so the loss is spread more evenly across the time domain and the performance of the PINN improves. We tested our method on the 1D Heat Equation and the 1D Diffusion Equation. The results show that our approach reduces the overall loss by at least 6% and balances the loss distribution over time, narrowing the loss range by at least 65% compared with the state-of-the-art residual-based resampling strategy. This improvement makes PINNs more accurate and efficient for solving time-dependent PDEs, offering a practical solution for applications where understanding temporal dynamics is crucial.
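
To illustrate the resampling idea described in the abstract, the sketch below reallocates a fixed budget of training points across the collocation, boundary, and initial sets in proportion to each loss term's share of the total loss. The allocation rule, the function name `rebalance_point_counts`, and the numeric values are illustrative assumptions only; the abstract does not specify the exact update rule used by R3.

```python
import numpy as np

def rebalance_point_counts(loss_pde, loss_bc, loss_ic, total_points):
    """Hypothetical allocation rule: assign training points to each point
    type in proportion to that term's relative share of the total loss."""
    losses = np.array([loss_pde, loss_bc, loss_ic], dtype=float)
    shares = losses / losses.sum()                     # relative residual of each term
    counts = np.maximum((shares * total_points).astype(int), 1)  # keep >= 1 point per type
    return {"collocation": int(counts[0]),
            "boundary": int(counts[1]),
            "initial": int(counts[2])}

# Example: a comparatively large boundary loss shifts the point budget
# toward boundary points at the next resampling step.
print(rebalance_point_counts(loss_pde=0.02, loss_bc=0.10, loss_ic=0.04,
                             total_points=2000))
```

In a training loop, such a rule would be invoked every few epochs: the current loss terms are measured, the point counts are recomputed, and new points of each type are sampled before training resumes.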
