Abstract:
The Gaussian process (GP) model is a paradigmatic machine learning model that combines the advantages of kernel-based learning and Bayesian inference, and has therefore become very popular in machine learning in recent years. As an extension of the GP model, the Mixture of Gaussian Processes (MGP) model fits datasets more flexibly and thus offers better learning and generalization ability. However, the literature on the GP and MGP models consists largely of isolated studies and reports, with no systematic survey of these models. In this paper, we first review the GP model, its basic principles, and its developments in various respects. We then discuss how the GP model is extended to the MGP model, review the current status and developments of MGP models, and finally point out some promising research directions and interesting applications of the MGP model.