Authors: Anshuman Sinha, Spencer H. Bryngelson (Georgia Tech)
Published in TMLR (2024), ISSN 2835-8856: https://openreview.net/pdf?id=5psgQEHn6t
We demonstrate that neural networks can be FLOP-efficient integrators of one-dimensional oscillatory integrands. We train a feed-forward neural network to compute integrals of highly oscillatory 1D functions. The training set is a parametric combination of functions with varying characters and degrees of oscillatory behavior. Numerical examples show that these networks are FLOP-efficient for sufficiently oscillatory integrands, with an average gain of $10^3$ FLOPs. The network computes oscillatory integrals more accurately than traditional quadrature methods under the same computational budget, i.e., the same number of floating-point operations. We find that feed-forward networks with five hidden layers suffice for a relative accuracy of $10^{-3}$. The inference cost of the neural network is relatively small, even compared to inner-product-pattern quadrature rules. We postulate that our result follows from learning latent patterns in the oscillatory integrands that are otherwise opaque to traditional numerical integrators.
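For concreteness, here is a minimal sketch of the kind of model described above: a feed-forward network with five hidden layers trained to map point samples of an oscillatory integrand to its integral. The sample count, layer widths, and training family (`a*sin(w*x)`) are illustrative assumptions, not the authors' exact setup; see `run.sh` and the repository scripts for the actual configuration.

```python
# Minimal sketch (assumed setup, not the paper's exact architecture):
# a feed-forward network with 5 hidden layers that maps N point samples
# of an oscillatory integrand on [0, 1] to an estimate of its integral.
import numpy as np
import tensorflow as tf

N = 64  # number of integrand samples fed to the network (assumption)
x = np.linspace(0.0, 1.0, N)

def make_example(rng):
    """Random oscillatory integrand a*sin(w*x) and its exact integral."""
    a = rng.uniform(0.5, 2.0)
    w = rng.uniform(10.0, 100.0)          # oscillation frequency
    y = a * np.sin(w * x)                 # sampled integrand (network input)
    integral = a * (1.0 - np.cos(w)) / w  # exact value on [0, 1] (target)
    return y, integral

rng = np.random.default_rng(0)
pairs = [make_example(rng) for _ in range(10_000)]
X = np.stack([p[0] for p in pairs])
Y = np.array([p[1] for p in pairs])

# Five tanh hidden layers, one linear output neuron for the integral.
model = tf.keras.Sequential(
    [tf.keras.layers.Dense(64, activation="tanh") for _ in range(5)]
    + [tf.keras.layers.Dense(1)]
)
model.compile(optimizer="adam", loss="mse")
model.fit(X, Y, epochs=50, batch_size=256, verbose=0)
```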
Requirements:

- DeepXDE (`pip install deepxde`)
- TensorFlow (`pip install tensorflow`) with Python >= 3.8
To reproduce our results:

```bash
git clone https://github.com/comp-physics/deepOscillations.git
cd deepOscillations
bash run.sh
```
- Choose the desired function, for example `func_str='Levin1'` (see the quadrature sketch after this list)
- Set the desired `n_array=(_)`, `b_array=(_)`, and `s_array=(_)` values in the `run.sh` script
- Execute `bash run.sh`
- Collect and plot the results:

```bash
python3 collect_results.py
python3 plot_result.py
```
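For intuition about the regime where the network wins, the sketch below shows how a fixed-node Gauss-Legendre rule loses accuracy as the oscillation frequency grows. The test integrand `sin(w*x)` and the node count are illustrative assumptions; this is not one of the repository scripts.

```python
# Illustrative baseline (assumed, not a repo script): fixed-node
# Gauss-Legendre quadrature degrades as the integrand oscillates faster,
# which is the regime where the trained network is FLOP-efficient.
import numpy as np

def gauss_legendre(f, n):
    """n-node Gauss-Legendre quadrature of f on [0, 1]."""
    nodes, weights = np.polynomial.legendre.leggauss(n)
    t = 0.5 * (nodes + 1.0)            # map nodes from [-1, 1] to [0, 1]
    return 0.5 * np.dot(weights, f(t))

for w in (10.0, 100.0, 1000.0):
    exact = (1.0 - np.cos(w)) / w      # exact integral of sin(w*x) on [0, 1]
    approx = gauss_legendre(lambda x: np.sin(w * x), n=32)
    print(f"w = {w:6.0f}   rel. error = {abs(approx - exact) / abs(exact):.2e}")
```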
The authors appreciate discussion with Dr. Ethan Pickering at an early stage of this work.
License: MIT