rc.gpf.models§

Contains extensions to gpflow.models.

Classes§

MOGPR

Gaussian Process Regression.

Module Contents§

class MOGPR(data, kernel, mean_function=None, noise_variance=1.0)§

Bases: gpflow.models.model.GPModel, gpflow.models.training_mixins.InternalDataTrainingLossMixin

Inheritance diagram of rc.gpf.models.MOGPR

Gaussian Process Regression.

This is a vanilla implementation of MOGP regression with a Gaussian likelihood. Multiple columns of Y are treated independently.

The log likelihood of this model is given by

\[\log p(Y \,|\, \mathbf f) = \mathcal N(Y \,|\, \mathbf f, \sigma_n^2 \mathbf{I})\]

To train the model, we maximise the log marginal likelihood w.r.t. the likelihood variance and the kernel hyperparameters theta. The marginal likelihood is obtained by integrating the likelihood over the GP prior, and has the form

\[\log p(Y \,|\, \sigma_n, \theta) = \mathcal N(Y \,|\, 0, \mathbf{K}_{XX} + \sigma_n^2 \mathbf{I})\]
Parameters:
  • data – the training data, a tuple (X, Y); the columns of Y are treated as independent outputs

  • kernel – the kernel (covariance function)

  • mean_function – optional mean function; defaults to zero

  • noise_variance – initial value of the Gaussian likelihood variance \(\sigma_n^2\)
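The marginalisation above can be checked by simulation: drawing f from the GP prior and adding Gaussian noise yields observations whose covariance is \(\mathbf{K}_{XX} + \sigma_n^2 \mathbf{I}\). A minimal NumPy sketch (the inputs, kernel, and noise variance below are illustrative, not part of the MOGPR API):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative squared-exponential kernel matrix K_XX on 3 scalar inputs.
X = np.array([0.0, 0.5, 1.2])
K = np.exp(-0.5 * (X[:, None] - X[None, :]) ** 2)
sigma2 = 0.1  # Gaussian likelihood (noise) variance

# Draw many latent functions f ~ N(0, K_XX) and add i.i.d. Gaussian noise.
n = 200_000
chol = np.linalg.cholesky(K + 1e-12 * np.eye(3))  # jitter for stability
f = rng.standard_normal((n, 3)) @ chol.T
y = f + np.sqrt(sigma2) * rng.standard_normal((n, 3))

# The empirical covariance of y approaches K_XX + sigma_n^2 I.
emp = np.cov(y, rowvar=False)
print(np.max(np.abs(emp - (K + sigma2 * np.eye(3)))))  # small
```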
property M§

The input dimensionality.

property L§

The output dimensionality.

log_marginal_likelihood()§

Computes the log marginal likelihood.

\[\log p(Y \,|\, \theta).\]
Return type:

tensorflow.Tensor
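Because the columns of Y are treated independently, this quantity is a sum of per-column Gaussian log-densities under the formula above. A NumPy/SciPy sketch of that sum (inputs, kernel, and data are made up for illustration; the actual method operates on the model's stored training data):

```python
import numpy as np
from scipy.stats import multivariate_normal

# Illustrative kernel matrix and noise variance.
X = np.array([0.0, 0.5, 1.2, 2.0])
K = np.exp(-0.5 * (X[:, None] - X[None, :]) ** 2)
sigma2 = 0.1
cov = K + sigma2 * np.eye(4)  # K_XX + sigma_n^2 I

rng = np.random.default_rng(1)
Y = rng.standard_normal((4, 2))  # two output columns, treated independently

# log p(Y | sigma_n, theta) = sum over output columns l of
# log N(y_l | 0, K_XX + sigma_n^2 I).
lml = sum(
    multivariate_normal.logpdf(Y[:, l], mean=np.zeros(4), cov=cov)
    for l in range(Y.shape[1])
)
```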

predict_f(Xnew, full_cov=False, full_output_cov=False)§

This method computes predictions of the latent function at the input points Xnew in R^{N x D}:

\[p(F^* \,|\, Y)\]

where F^* are the values of the MOGP at the new input points and Y are the noisy observations at the training inputs. Note that full_cov implies full_output_cov (regardless of the value passed for full_output_cov), to avoid ambiguity.

Parameters:
  • Xnew (gpflow.models.model.InputData)

  • full_cov (bool)

  • full_output_cov (bool)

Return type:

gpflow.models.model.MeanAndVariance
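For a Gaussian likelihood, the posterior p(F* | Y) has a closed form given by the standard GP predictive equations. A minimal NumPy sketch for a single output column (kernel and data are illustrative; MOGPR applies this per output column, and its internal implementation may differ, e.g. via a Cholesky factorisation):

```python
import numpy as np

def rbf(a, b):
    """Illustrative squared-exponential kernel on scalar inputs."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2)

X = np.array([0.0, 0.5, 1.2, 2.0])  # training inputs
Y = np.sin(X)[:, None]              # noisy observations (one output column)
Xnew = np.array([0.25, 1.0])        # new input points
sigma2 = 0.1                        # likelihood (noise) variance

Kxx = rbf(X, X) + sigma2 * np.eye(len(X))  # K_XX + sigma_n^2 I
Ksx = rbf(Xnew, X)                         # K_{*X}
Kss = rbf(Xnew, Xnew)                      # K_{**}

A = np.linalg.solve(Kxx, Ksx.T)  # (K_XX + sigma_n^2 I)^{-1} K_{X*}
mean = A.T @ Y                   # posterior mean of F*
cov = Kss - Ksx @ A              # full posterior covariance (full_cov=True)
var = np.diag(cov)               # marginal variances only (full_cov=False)
```

With full_cov=False the method returns only the marginal variances (the diagonal of cov above) rather than the full N x N covariance.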