Memory-based Stochastic Optimization

Part of Advances in Neural Information Processing Systems 8 (NIPS 1995)


Authors

Andrew Moore, Jeff Schneider

Abstract

In this paper we introduce new algorithms for optimizing noisy plants in which each experiment is very expensive. The algorithms build a global non-linear model of the expected output at the same time as using Bayesian linear regression analysis of locally weighted polynomial models. The local model answers queries about confidence, noise, gradient and Hessians, and uses them to make automated decisions similar to those made by a practitioner of Response Surface Methodology. The global and local models are combined naturally as a locally weighted regression. We examine the question of whether the global model can really help optimization, and we extend it to the case of time-varying functions. We compare the new algorithms with a highly tuned higher-order stochastic optimization algorithm on randomly-generated functions and a simulated manufacturing task. We note significant improvements in total regret, time to converge, and final solution quality.
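The core building block the abstract describes, a locally weighted polynomial model that can answer value and gradient queries at an arbitrary point, can be sketched roughly as follows. This is a minimal illustration only, not the paper's algorithm: it fits a kernel-weighted linear model around a query point and reads off a prediction and a local gradient estimate, omitting the Bayesian treatment that supplies the confidence and noise estimates. All function and variable names here are hypothetical.

```python
import numpy as np

def locally_weighted_fit(X, y, q, bandwidth=0.1):
    """Fit a weighted linear model around query point q.

    Returns the prediction at q and a local gradient estimate.
    Hypothetical sketch: the paper's Bayesian linear regression
    additionally yields confidence and noise estimates, omitted here.
    """
    # Gaussian kernel weights centred on the query point
    d2 = np.sum((X - q) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))

    # Design matrix with intercept, inputs centred on q;
    # solve the weighted least-squares normal equations
    A = np.hstack([np.ones((X.shape[0], 1)), X - q])
    W = np.diag(w)
    beta = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)

    prediction = beta[0]   # fitted value at q (inputs were centred on q)
    gradient = beta[1:]    # local slope estimate
    return prediction, gradient

# Toy usage: a noisy quadratic "plant" with its optimum at x = 0.5
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 1))
y = -(X[:, 0] - 0.5) ** 2 + 0.01 * rng.standard_normal(200)
pred, grad = locally_weighted_fit(X, y, q=np.array([0.3]))
```

An optimizer in the spirit of Response Surface Methodology could query such a model for the gradient at its current operating point and step uphill, rather than running a fresh physical experiment for each gradient estimate.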