This paper introduces the nested heteroscedastic Gaussian process (NHGP) approach to tackle simulation metamodeling with large-scale heteroscedastic datasets. NHGP achieves scalability by aggregating sub-stochastic kriging (SK) models built on disjoint subsets of a large dataset, making it straightforward for SK users to adopt. We show that the NHGP predictor possesses desirable statistical properties: it is the best linear unbiased predictor among those built by aggregating sub-SK models, and it is consistent. Numerical experiments demonstrate the competitive performance of NHGP.
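To illustrate the divide-and-conquer structure described above, the following Python sketch fits independent SK sub-models on disjoint subsets of a heteroscedastic dataset and combines their predictions. The squared-exponential kernel, its fixed hyperparameters, and the inverse-variance weighting are illustrative assumptions only, not NHGP's exact aggregation rule.

```python
# Minimal sketch (not the paper's exact estimator) of the divide-and-conquer idea:
# fit independent stochastic kriging (SK) sub-models on disjoint subsets of the
# design points, then aggregate their predictions. The kernel choice, fixed
# hyperparameters, and precision weighting below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def sq_exp_kernel(a, b, tau2=1.0, length=0.2):
    """Squared-exponential covariance between 1-d input arrays a and b."""
    d = a[:, None] - b[None, :]
    return tau2 * np.exp(-0.5 * (d / length) ** 2)

def fit_sub_sk(x, y_bar, var_bar, n_rep):
    """Precompute one sub-SK model (zero prior mean) on a subset of design points."""
    K = sq_exp_kernel(x, x)
    Sigma = np.diag(var_bar / n_rep)  # intrinsic simulation noise of the sample means
    L = np.linalg.cholesky(K + Sigma + 1e-10 * np.eye(len(x)))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_bar))
    return {"x": x, "L": L, "alpha": alpha}

def predict_sub_sk(model, x_new):
    """SK predictive mean and variance at x_new for one sub-model."""
    k_star = sq_exp_kernel(x_new, model["x"])
    mean = k_star @ model["alpha"]
    v = np.linalg.solve(model["L"], k_star.T)
    var = np.clip(sq_exp_kernel(x_new, x_new).diagonal() - np.sum(v**2, axis=0), 1e-12, None)
    return mean, var

# Synthetic heteroscedastic simulation output: y(x) = sin(2*pi*x) + noise whose sd grows with x.
n_design, n_rep = 200, 10
x = np.sort(rng.uniform(0, 1, n_design))
reps = np.sin(2 * np.pi * x)[:, None] + (0.1 + 0.5 * x)[:, None] * rng.standard_normal((n_design, n_rep))
y_bar, var_bar = reps.mean(axis=1), reps.var(axis=1, ddof=1)

# Split the design into disjoint subsets and fit one sub-SK model per subset.
n_sub = 4
subsets = np.array_split(rng.permutation(n_design), n_sub)
models = [fit_sub_sk(x[idx], y_bar[idx], var_bar[idx], n_rep) for idx in subsets]

# Aggregate sub-model predictions with inverse-variance weights (an illustrative choice).
x_new = np.linspace(0, 1, 50)
means, variances = zip(*(predict_sub_sk(m, x_new) for m in models))
w = np.stack([1.0 / v for v in variances])
y_hat = np.sum(w * np.stack(means), axis=0) / np.sum(w, axis=0)
print(y_hat[:5])
```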
We provide convergence analyses of the prediction error of stochastic kriging (SK) under two scenarios. First, we examine the case where the kernel smoothness is potentially misspecified and establish the resulting convergence rate of the mean squared error. We demonstrate that the optimal rate is achieved when the kernel is smoother than the true function, whereas the worst-case rate is governed primarily by the order of the noise variance. Second, we analyze the high-dimensional setting and provide a high-probability convergence rate of the error in the L∞ norm. Our analysis shows that the worst-case convergence rate, driven by the noise variance, can be improved through experimental design, and that in the high-dimensional setting the impact of dimensionality on the rate can be reduced.
Jin Zhao and Xi Chen, "Nested Heteroscedastic Gaussian Process for Simulation Metamodeling," Proceedings of the 2024 Winter Simulation Conference, to appear.
Jin Zhao and Xi Chen, "Convergence Analysis of Stochastic Kriging Predictor Under Possible Kernel Misspecification," Manuscript in preparation.