Mathematics
Variance reduction
Robustness (evolution)
Monte Carlo method
Dimension (graph theory)
Applied mathematics
Gaussian distribution
Importance sampling
Mathematical optimization
Dimensionality reduction
Control variate
Reduction (mathematics)
Algorithm
Computer science
Pure mathematics
Hybrid Monte Carlo
Artificial intelligence
Statistics
Physics
Markov chain Monte Carlo
Biochemistry
Chemistry
Geometry
Quantum mechanics
Gene
Authors
Chaojun Zhang, Xiaoqun Wang, Zhijian He
Abstract
We consider integration with respect to a $d$-dimensional spherical Gaussian measure arising from computational finance. Importance sampling (IS) is one of the most important variance reduction techniques in Monte Carlo (MC) methods. In this paper, two kinds of IS are studied in the randomized quasi-MC (RQMC) setting, namely, the optimal drift IS (ODIS) and the Laplace IS (LapIS). Traditionally, the LapIS is obtained by mimicking the behavior of the optimal IS density, with the ODIS as a special case. We prove that the LapIS can also be obtained by an approximate optimization procedure based on the Laplace approximation. We study the promises and limitations of IS in RQMC methods and develop efficient RQMC-based IS procedures. We focus on how to properly combine IS with conditional MC (CMC) and dimension reduction methods in RQMC. In our procedures, the integrands are first smoothed by CMC. Then the LapIS or the ODIS is performed, in which several orthogonal matrices must be chosen to reduce the effective dimension. Intuitively, designing methods to determine all of these optimal matrices seems infeasible. Fortunately, we prove that as long as the last orthogonal matrix is chosen carefully, the choices of the other matrices can be arbitrary. This significantly simplifies the RQMC-based IS procedure. Owing to its robustness and superior efficiency, we use the gradient principal component analysis (GPCA) method for effective dimension reduction in our RQMC-based IS procedures. Moreover, we prove that the integrands obtained by the GPCA method are statistically equivalent. Numerical experiments illustrate the superiority of the proposed RQMC-based IS procedures.
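The ODIS step described in the abstract rests on the standard drift identity for a spherical Gaussian measure, $\mathbb{E}[f(Z)] = \mathbb{E}\big[f(Z+\mu)\,e^{-\mu^\top Z - \|\mu\|^2/2}\big]$ with $Z \sim N(0, I_d)$, where a common choice of the optimal drift maximizes $\log f(z) - \|z\|^2/2$. The sketch below illustrates that identity driven by scrambled Sobol' points; it is not the authors' implementation. The integrand f, the dimension d, the drift optimization, and the use of SciPy's qmc.Sobol generator are illustrative assumptions, and the CMC smoothing and GPCA rotation stages of the paper's procedure are omitted.

```python
# Minimal ODIS sketch with an RQMC (scrambled Sobol') driver -- illustrative only.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm, qmc

d = 8  # nominal dimension of the spherical Gaussian integral (assumed)

def f(z):
    # Hypothetical log-linear integrand standing in for a finance payoff.
    # For this choice the ODIS weight cancels the integrand exactly, so the
    # estimator should return exp(1/2) with near-zero variance -- a sanity
    # check for the drift identity, not a realistic test problem.
    return np.exp(z.sum(axis=-1) / np.sqrt(d))

# Optimal drift: mu = argmax_z { log f(z) - |z|^2 / 2 }.
obj = lambda z: -(np.log(f(z.reshape(1, -1))[0]) - 0.5 * z @ z)
mu = minimize(obj, np.zeros(d)).x

def rqmc_odis_estimate(m, seed):
    """One randomization: 2^m scrambled Sobol' points -> ODIS estimate of E[f(Z)]."""
    u = qmc.Sobol(d=d, scramble=True, seed=seed).random_base2(m)
    z = norm.ppf(u)                           # map uniforms to standard Gaussians
    weight = np.exp(-z @ mu - 0.5 * mu @ mu)  # likelihood ratio phi(z+mu)/phi(z)
    return np.mean(f(z + mu) * weight)        # E[f(Z)] = E[f(Z+mu) * weight]

reps = [rqmc_odis_estimate(m=10, seed=s) for s in range(20)]
print("estimate:", np.mean(reps),
      "std error:", np.std(reps, ddof=1) / np.sqrt(len(reps)))
```

In the paper's full procedure, the integrand would first be smoothed by CMC and an orthogonal matrix (chosen by GPCA) would be applied to reduce the effective dimension before the drifted Gaussian inputs are generated; the sketch shows only the importance-sampling and RQMC components.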