mfann
Multivariate Functional Approximation Neural Networks
Overview
Multivariate Functional Approximation Neural Networks (MFANN) are an architecture that combines the principles of multivariate functional approximation (MFA) with iterative optimization techniques common in the neural network (NN) literature. MFA is a data modeling, compression, and visualization tool that uses tensor products of B-spline functions to build continuous, differentiable representations of input data. We extend MFA with stochastic mini-batch optimization methods, iteratively updating the spline-based model instead of numerically solving for the representation. We have demonstrated that MFANN is less prone to common problems in neural network training, such as overfitting and sensitivity to hyperparameter selection, while remaining flexible enough to fit complex analytical functions and real-world scientific data. This work highlights MFANN as a promising paradigm for advancing the theory and practice of data-driven function approximation with a new class of neural networks.
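To illustrate the idea, the sketch below fits the control points of a one-dimensional cubic B-spline to a target function with mini-batch gradient descent, in the spirit of MFANN's iterative spline updates. Everything here is an illustrative assumption rather than the project's actual implementation: the knot layout, learning rate, batch size, and `sin` target are hypothetical, and the real MFANN operates on multivariate tensor-product splines rather than this 1D case.

```python
import numpy as np

def bspline_basis(x, t, k):
    """Evaluate all degree-k B-spline basis functions on knot vector t
    at the points x, via the Cox-de Boor recursion."""
    x = np.atleast_1d(np.asarray(x, dtype=float))
    t = np.asarray(t, dtype=float)
    m = len(t) - 1
    # Degree-0 basis: indicator of each knot span.
    B = np.zeros((len(x), m))
    for i in range(m):
        B[:, i] = (t[i] <= x) & (x < t[i + 1])
    # Make the basis well-defined at the right endpoint of the domain.
    last = np.nonzero(t < t[-1])[0].max()
    B[x == t[-1], :] = 0.0
    B[x == t[-1], last] = 1.0
    # Raise the degree one step at a time.
    for d in range(1, k + 1):
        for i in range(m - d):
            left = right = 0.0
            if t[i + d] != t[i]:
                left = (x - t[i]) / (t[i + d] - t[i]) * B[:, i]
            if t[i + d + 1] != t[i + 1]:
                right = (t[i + d + 1] - x) / (t[i + d + 1] - t[i + 1]) * B[:, i + 1]
            B[:, i] = left + right
    return B[:, : m - k]  # shape (len(x), number of control points)

# Hypothetical 1D setup: 12 control points, cubic splines, clamped knots on [0, 1].
degree, n_ctrl = 3, 12
interior = np.linspace(0.0, 1.0, n_ctrl - degree + 1)
knots = np.concatenate([np.zeros(degree), interior, np.ones(degree)])

target = lambda x: np.sin(2.0 * np.pi * x)  # illustrative target function
c = np.zeros(n_ctrl)                        # control points = trainable parameters
rng = np.random.default_rng(0)
lr, batch = 0.5, 32

for step in range(2000):
    x = rng.uniform(0.0, 1.0, size=batch)   # mini-batch of sample sites
    Phi = bspline_basis(x, knots, degree)   # (batch, n_ctrl) design matrix
    resid = Phi @ c - target(x)
    c -= lr * (Phi.T @ resid) / batch       # SGD step on the mean squared error
```

Because each B-spline basis function has compact support, each mini-batch gradient touches only the few control points whose basis functions overlap the sampled sites, which is part of what makes iterative spline updates cheap and local.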
Links
Citation
Please cite the following paper when referencing this work:
Bibtex