ZHOU E-Tong, CHEN Zi-Yi, MA Jin-Wen. From Gaussian Processes to the Mixture of Gaussian Processes:A Survey[J]. JOURNAL OF SIGNAL PROCESSING, 2016, 32(8): 960-972. DOI: 10.16798/j.issn.1003-0530.2016.08.11

From Gaussian Processes to the Mixture of Gaussian Processes:A Survey

  • The Gaussian process (GP) model is a paradigmatic machine learning model that combines the advantages of kernel learning methods and the Bayesian inference mechanism, and it has therefore become very popular in machine learning in recent years. As an extension of the GP model, the Mixture of Gaussian Processes (MGP) fits datasets more effectively and thus has better learning and generalization ability. However, the literature on the GP and MGP models consists only of isolated reports, with no systematic summary. In this paper, we first review the GP model, its basic principles, and its developments in various aspects. We then discuss how to extend the GP model to the MGP model and review the status and developments of MGP models, and finally we point out some prospective research directions and interesting applications of the MGP model.
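To make the abstract's description concrete, the following is a minimal sketch of GP regression with a squared-exponential (RBF) kernel, the standard setting the survey builds on. All function names and hyperparameter values here are illustrative choices, not from the paper; the posterior formulas are the standard GP regression equations.

```python
import numpy as np

def rbf_kernel(X1, X2, length_scale=1.0, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = X1[:, None] - X2[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(X_train, y_train, X_test, noise=1e-4):
    """Standard GP regression posterior mean and covariance at X_test."""
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_test)      # train-test cross-covariance
    K_ss = rbf_kernel(X_test, X_test)      # test-test covariance
    K_inv = np.linalg.inv(K)
    mu = K_s.T @ K_inv @ y_train           # posterior mean
    cov = K_ss - K_s.T @ K_inv @ K_s       # posterior covariance
    return mu, cov

# Interpolate a noiseless sine curve and predict at a held-out point.
X = np.linspace(0, 2 * np.pi, 10)
y = np.sin(X)
X_new = np.array([np.pi / 2])
mu, cov = gp_posterior(X, y, X_new)
```

An MGP, by contrast, would fit several such GP components and combine them through gating probabilities, which is what lets it handle multimodal or heterogeneous data better than a single GP.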
