6 Multivariate distributions

The namespace

using namespace density;

gives access to a variety of multivariate normal distributions:

  • Multivariate normal distributions specified via a covariance matrix (structured or unstructured).
  • Autoregressive (AR) processes.
  • Gaussian Markov random fields (GMRF) defined on regular grids or defined via a (sparse) precision matrix.
  • Separable covariance functions, i.e. time-space separability.

These seemingly unrelated concepts are all implemented via the notion of a distribution, which explains why they are placed in the same namespace. Distributions can be combined, which lets you build up complex multivariate distributions using extremely compact notation. Due to its flexibility the approach is more abstract than other parts of TMB, but here it will be explained from scratch. Before looking at the different categories of multivariate distributions we note the following point of practical importance:

  • All members in the density namespace return the negative log density, as opposed to the univariate densities in R style distributions (see the sketch below).
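
For instance, when accumulating a joint negative log likelihood, the R-style univariate densities must be negated explicitly, whereas density namespace objects can be added directly. A minimal sketch, assuming y, mu, sigma, Sigma and x are defined elsewhere:

using namespace density;
Type nll = 0;
nll -= dnorm(y, mu, sigma, true);   // R style: returns the log density, so negate it
nll += MVNORM(Sigma)(x);            // density namespace: already the negative log density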

6.1 Multivariate normal distributions

Consider a zero-mean multivariate normal distribution with covariance matrix Sigma (symmetric positive definite), whose density we want to evaluate at a point x:

int n = 10;
vector<Type> x(n);           // Evaluation point           
x.fill(0.0);                 // Point of evaluation: x = (0,0,...,0)

The negative log density is evaluated as follows:

using namespace density;
matrix<Type> Sigma(n,n);     // Covariance matrix
// ..... User must assign values to Sigma here
res = MVNORM(Sigma)(x);      // Evaluate negative log likelihood

In the last line MVNORM(Sigma) should be interpreted as a multivariate density, which via the last parenthesis (x) is evaluated at x. A less compact way of expressing this is

MVNORM_t<Type> your_dmnorm(Sigma);
res = your_dmnorm(x);

in which your_dmnorm is a variable that holds the “density.”

Note that the latter approach (using MVNORM_t) is more efficient when you need to evaluate the density more than once, i.e. for different values of x.
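
For example, the same density object can be re-used across several observation vectors. A sketch, where the array obs and the count n_obs are hypothetical names not used elsewhere in this section:

int n_obs = 5;                        // Hypothetical number of observation vectors
array<Type> obs(n, n_obs);            // One column per observation vector
// ..... User must assign values to obs here
MVNORM_t<Type> your_dmnorm(Sigma);    // Construct the density object once
Type res = 0;
for(int i = 0; i < n_obs; i++){
  vector<Type> xi = obs.col(i);       // i-th observation vector
  res += your_dmnorm(xi);             // Accumulate negative log likelihood
}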

Sigma can be parameterized in different ways. Due to the symmetry of Sigma there are at most n(n+1)/2 free parameters (n variances and n(n-1)/2 correlation parameters). If you want to estimate all of these freely (modulo the positive definite constraint) you can use UNSTRUCTURED_CORR() to specify the correlation matrix, and VECSCALE() to specify the variances. UNSTRUCTURED_CORR() takes as input a vector of dummy parameters that is used internally to build the correlation matrix via its Cholesky factor.

using namespace density;
int n = 10;
vector<Type> unconstrained_params(n*(n-1)/2);  // Dummy parameterization of correlation matrix
vector<Type> sds(n);                           // Standard deviations
res = VECSCALE(UNSTRUCTURED_CORR(unconstrained_params),sds)(x);

If all elements of unconstrained_params are estimated we are in effect estimating a full correlation matrix without any constraints on its elements (except for the mandatory positive definiteness). The actual value of the correlation matrix, but not the full covariance matrix, can easily be accessed using the .cov() operator:

matrix<Type> Sigma(n,n);
Sigma = UNSTRUCTURED_CORR(unconstrained_params).cov();
REPORT(Sigma);                                         // Report back to R session
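
If the full covariance matrix is also needed, it can be assembled from the correlation matrix and the standard deviations. A sketch, where Corr and Sigma_full are hypothetical names:

matrix<Type> Corr = UNSTRUCTURED_CORR(unconstrained_params).cov();  // Correlation matrix
matrix<Type> Sigma_full(n,n);
for(int i = 0; i < n; i++)
  for(int j = 0; j < n; j++)
    Sigma_full(i,j) = sds(i) * Corr(i,j) * sds(j);                  // Scale correlations to covariances
REPORT(Sigma_full);                                                 // Report back to R session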

6.2 Autoregressive processes

Consider a stationary univariate Gaussian AR1 process x(t), t=0,…,n-1. The stationary distribution is chosen so that:

  • x(t) has mean 0 and variance 1 (for all t).

The multivariate density of the vector x can be evaluated as follows

int n = 10;
using namespace density;
 
vector<Type> x(n);           // Evaluation point
x.fill(0.0);                 // Point of evaluation: x = (0,0,...,0)
Type rho = 0.2;              // Correlation parameter
res = AR1(rho)(x);           // Evaluate negative log-density of AR1 process at point x 

Due to the assumed stationarity the correlation parameter must satisfy:

  • Stationarity constraint: -1 < rho < 1

Note that cor[x(t),x(t-1)] = rho.

The SCALE() function can be used to set the standard deviation.

Type sigma = 2.1;            // standard deviation of x
res = SCALE(AR1(rho),sigma)(x);

Now, var[x(t)] = sigma^2. Because all elements of x are scaled by the same constant we use SCALE rather than VECSCALE.
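
If instead each element of x should have its own standard deviation, VECSCALE can be combined with AR1 in the same way. A sketch, where the vector sds must be assigned by the user:

vector<Type> sds(n);                   // One standard deviation per time step
// ..... User must assign values to sds here
res = VECSCALE(AR1(rho), sds)(x);      // Now var[x(t)] = sds(t)^2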

6.2.0.1 Multivariate AR1 processes

This is the first real illustration of how distributions can be used as building blocks to obtain more complex distributions. Consider a p-dimensional AR1 process:

int n = 10;                   // Number of time steps
int p = 3;                    // Dimension of x(t)
array<Type> x(p,n);           // Evaluation point

The columns in x refer to the different time points. We then evaluate the (negative log) joint density of the time series.

matrix<Type> Sigma(p,p);            // Covariance matrix of x(t)
// ..... User must assign values to Sigma here
MVNORM_t<Type> your_dmnorm(Sigma);  // Density of x(t)
Type phi;                           // Correlation parameter; must be assigned a value
res = AR1(phi,your_dmnorm)(x);

Note the following:

  • We have introduced an intermediate variable your_dmnorm, which holds the p-dimensional marginal density of x(t). This is a zero-mean normal density with covariance matrix Sigma (any density object could be used in its place; see the sketch below).
  • All p univariate time series have the same serial correlation phi.
  • The multivariate process x(t) is stationary in the same sense as the univariate AR1 process described above.
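
For instance, an unstructured correlation matrix across the p coordinates could be used instead of MVNORM_t. A sketch, where corr_params is a hypothetical parameter vector:

vector<Type> corr_params(p*(p-1)/2);               // Dummy parameterization of correlation matrix
res = AR1(phi, UNSTRUCTURED_CORR(corr_params))(x);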

6.2.0.2 Higher order AR processes

There also exists a class ARk_t for autoregressive processes of arbitrary order.
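
For instance, a stationary AR(2) model for the vector x from above might be set up along the following lines. A sketch, where phi2 is a hypothetical name and the coefficients must lie in the stationarity region:

using namespace density;
vector<Type> phi2(2);             // AR(2) coefficients
phi2(0) = 0.8; phi2(1) = -0.2;    // Must give a stationary process
ARk_t<Type> ar2_density(phi2);    // Construct the AR(2) density object
res = ar2_density(x);             // Evaluate negative log density at x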

6.3 Gaussian Markov random fields (GMRF)

A GMRF may be defined in two ways:

  1. Via a (sparse) precision matrix Q.
  2. Via a d-dimensional lattice.

For further details please see GMRF_t. Under approach 1, a sparse Q corresponding to a Matérn covariance function can be obtained via the R_inla namespace.
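
As a minimal sketch of approach 1, assuming the user fills in a sparse, symmetric positive definite precision matrix Q:

using namespace density;
int n = 10;
Eigen::SparseMatrix<Type> Q(n,n);   // Sparse precision matrix
// ..... User must assign values to Q here
vector<Type> x(n);
x.setZero();                        // Point of evaluation: x = (0,0,...,0)
res = GMRF(Q)(x);                   // Evaluate negative log density at x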

6.4 Separable construction of covariance (precision) matrices

A typical use of separability is to create space-time models with a sparse precision matrix. Details are given in SEPARABLE_t. Here we give a simple example.

Assume that we study a quantity x that changes both in space and time. For simplicity we consider only a one-dimensional space. We discretize space and time using equidistant grids, and assume that the distance between grid points is 1 in both dimensions. We then define an AR1(rho_s) process in space and an AR1(rho_t) process in time. The separability assumption is that two points x1 and x2, separated in space by a distance ds and in time by a distance dt, have correlation given by

rho_s^ds * rho_t^dt

This is implemented as

using namespace density;
int n_s = 10;                   // Number of grid points in space
int n_t = 10;                   // Number of grid points in time
Type rho_s = 0.2;               // Correlation in space
Type rho_t = 0.4;               // Correlation in time

array<Type> x(n_s,n_t);
x.setZero();                    // x = 0

res = SEPARABLE(AR1(rho_t),AR1(rho_s))(x);

Note that the arguments to SEPARABLE() are given in the opposite order to the dimensions of x: the first argument, AR1(rho_t), acts on the last dimension of x (time), while the second argument, AR1(rho_s), acts on the first dimension (space).
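
Because both AR1 components have unit variance, the separable field also has unit marginal variance. If a different marginal standard deviation is wanted, the whole construction can be wrapped in SCALE() as before. A sketch, under the assumption that SCALE() accepts the separable density:

Type sigma = 2.1;               // Desired marginal standard deviation
res = SCALE(SEPARABLE(AR1(rho_t),AR1(rho_s)), sigma)(x);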