Training (meteorology)
Stochastic optimization
Computer science
Mathematical optimization
Newton's method
Mathematics
Artificial intelligence
Physics
Nonlinear systems
Quantum mechanics
Meteorology
Authors
Uttam Suman,Mariya Mamajiwala,Madhurima Saxena,Ankit Tyagi,Debasish Roy
Source
Journal: Cornell University - arXiv
Date: 2024-10-18
Identifier
DOI: 10.48550/arXiv.2410.14270
Abstract
We propose a new stochastic optimizer for non-convex and possibly non-smooth objective functions, typically defined over high-dimensional design spaces. The aim is to bridge noise-assisted global search with the faster local convergence characteristic of a Newton-like search. Our scheme, acronymed FINDER (Filtering Informed Newton-like and Derivative-free Evolutionary Recursion), exploits the nonlinear stochastic filtering equations to arrive at a derivative-free update that resembles a Newton search employing the inverse Hessian of the objective function. After simplifying the update to scale linearly with dimension, along with a few other enhancements, we apply FINDER to a range of problems: IEEE benchmark objective functions, a couple of archetypal data-driven problems in deep networks, and certain cases of physics-informed deep networks. The performance of the new method vis-à-vis the well-known Adam optimizer and a few others attests to its promise for high-dimensional optimization problems of practical interest.
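The abstract only sketches the idea of a filtering-informed, derivative-free, Newton-like update. As a rough illustration (not the authors' actual FINDER recursion, whose details are in the paper), the minimal sketch below uses an ensemble Kalman-style step: the cross-covariance between particles and their objective values stands in for gradient information, and the covariance-based gain plays the role of an inverse-Hessian-scaled direction. All names and parameters here (ensemble_filtering_step, lr, eps, ensemble size) are illustrative assumptions.

import numpy as np

def ensemble_filtering_step(X, f, lr=0.5, eps=1e-8):
    """One derivative-free, covariance-preconditioned update.

    NOTE: illustrative sketch only, not the FINDER update from the paper.
    X : (N, d) ensemble of candidate solutions
    f : objective, maps a (d,) vector to a scalar
    Returns the updated ensemble.
    """
    N, d = X.shape
    F = np.array([f(x) for x in X])      # objective value at each particle
    Xc = X - X.mean(axis=0)              # centred ensemble
    Fc = F - F.mean()                    # centred objective values
    C_xf = Xc.T @ Fc / N                 # particle/objective cross-covariance, (d,)
    C_ff = Fc @ Fc / N + eps             # objective variance (regularised scalar)
    # Kalman-like gain: acts as an inverse-Hessian-scaled search direction,
    # obtained without evaluating any derivative of f
    gain = C_xf / C_ff
    # move each particle toward lower objective values, in proportion to
    # how far it sits above the current best particle
    return X - lr * np.outer(F - F.min(), gain)

# usage: minimise the Rosenbrock function in 10 dimensions
def rosenbrock(x):
    return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 10))
for _ in range(500):
    X = ensemble_filtering_step(X, rosenbrock)
print("best objective:", min(rosenbrock(x) for x in X))

Because the gain is built from ensemble statistics alone, the update needs no gradients or Hessians, which is the sense in which such filtering-based recursions can mimic a Newton-like search on non-smooth objectives.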