Random Fields

A random field H(x) is a real-valued random variable whose statistics (mean value, standard deviation, etc.) may be different for each value of x (Matthies et al., 1997; Matthies and Bucher, 1999), i.e.,

\[ H(\mathbf{x}) \in \mathbb{R}; \quad \mathbf{x} = [x_1, x_2, \dots, x_n]^T \in D \subset \mathbb{R}^n. \tag{1} \]

The mean value function is defined as

\[ \bar{H}(\mathbf{x}) = E[H(\mathbf{x})], \tag{2} \]

whereby the expectation operator E is to be taken at a fixed location x across the ensemble, i.e., over all possible realizations H(x, ω) of the random field (see Figure 1).

The spatial correlation, i.e., the fact that we observe a specific dependency structure of random field values H(x) and H(y) taken at different locations x and y, is described by the auto-covariance function

\[ C_{HH}(\mathbf{x}, \mathbf{y}) = E\left[ \{H(\mathbf{x}) - \bar{H}(\mathbf{x})\}\{H(\mathbf{y}) - \bar{H}(\mathbf{y})\} \right]. \tag{3} \]
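
For illustration, both statistics can be estimated from a finite ensemble of realizations evaluated on a common grid of locations. The following is a minimal sketch in Python; the sample matrix and its dimensions are hypothetical and not taken from the text:

```python
import numpy as np

# Hypothetical ensemble: rows are realizations H(x, omega_j), columns are
# the field values at fixed locations x_1, ..., x_N.
rng = np.random.default_rng(0)
samples = rng.standard_normal((1000, 50))      # placeholder data

# Equation (2): mean value function, averaged across the ensemble
# at each fixed location x_i.
mean_H = samples.mean(axis=0)

# Equation (3): auto-covariance C_HH(x_i, x_j), estimated from the
# fluctuations H(x) - mean_H(x) across the ensemble.
fluct = samples - mean_H
C_HH = fluct.T @ fluct / (samples.shape[0] - 1)
# Equivalent one-liner: C_HH = np.cov(samples, rowvar=False)
```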

Random fields can be classified with respect to the form of the auto-covariance function. A random field H(x) is called weakly homogeneous if

\[ \bar{H}(\mathbf{x}) = \text{const.} \;\; \forall \mathbf{x} \in D; \qquad C_{HH}(\mathbf{x}, \mathbf{x} + \boldsymbol{\xi}) = C_{HH}(\boldsymbol{\xi}) \;\; \forall \mathbf{x} \in D. \tag{4} \]


Fig. 1. Ensemble of realizations of a one-dimensional random field.

This property is equivalent to the stationarity of a random process. If the covariance function depends on the distance only (not on the direction), i.e.,

\[ C_{HH}(\mathbf{x}, \mathbf{x} + \boldsymbol{\xi}) = C_{HH}(\|\boldsymbol{\xi}\|) \;\; \forall \mathbf{x} \in D, \tag{5} \]

then a homogeneous random field H(x) is called isotropic.
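
A common isotropic model (chosen here purely for illustration; the text does not prescribe a specific form) is the exponential covariance function C_HH(ξ) = σ² exp(−‖ξ‖/ℓ), which depends only on the separation distance ‖ξ‖. A minimal sketch of assembling such a kernel for a set of points:

```python
import numpy as np

def exponential_cov(points, sigma=1.0, ell=2.0):
    """Isotropic exponential covariance C(xi) = sigma^2 * exp(-||xi|| / ell).

    points : (N, n) array of locations in D; sigma and ell (standard
    deviation and correlation length) are assumed example values.
    """
    diff = points[:, None, :] - points[None, :, :]   # pairwise separations xi
    dist = np.linalg.norm(diff, axis=-1)             # ||xi||: direction drops out
    return sigma**2 * np.exp(-dist / ell)

# Example: 50 random points in a two-dimensional domain D = [0, 10] x [0, 10]
rng = np.random.default_rng(0)
pts = rng.uniform(0.0, 10.0, size=(50, 2))
C = exponential_cov(pts)
```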

For numerical computations it is useful to represent a continuous random field H(x) in terms of discrete random variables ck, k = 1, ..., ∞ (Ghanem and Spanos, 1991; Brenner and Bucher, 1995):

\[ H(\mathbf{x}) = \sum_{k=1}^{\infty} c_k \varphi_k(\mathbf{x}), \quad \mathbf{x} \in D \subset \mathbb{R}^n; \quad c_k, \varphi_k \in \mathbb{R}. \tag{6} \]

The functions φk(x) are deterministic spatial shape functions which are usually chosen to represent an orthonormal basis on D. The random coefficients ck can be made uncorrelated, which is an extension of orthogonality into the random variable case.

This representation is usually called the Karhunen-Loève expansion. It is based on the following decomposition of the covariance function:

\[ C_{HH}(\mathbf{x}, \mathbf{y}) = \sum_{k=1}^{\infty} \lambda_k \, \varphi_k(\mathbf{x}) \, \varphi_k(\mathbf{y}), \tag{7} \]

in which λk and φk(x) are the eigenvalues and eigenfunctions, respectively. These are solutions to the integral equation

\[ \int_D C_{HH}(\mathbf{x}, \mathbf{y}) \, \varphi_k(\mathbf{x}) \, \mathrm{d}\mathbf{x} = \lambda_k \varphi_k(\mathbf{y}). \tag{8} \]

Mathematically, Equation (8) is a homogeneous Fredholm integral equation of the second kind.
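
As a sketch of how Equation (8) can be approximated numerically, the integral may be replaced by a quadrature rule (a Nyström-type discretization). The one-dimensional domain, the uniform quadrature weights, and the exponential kernel below are all assumptions made for illustration:

```python
import numpy as np

# Quadrature points on an assumed one-dimensional domain D = [0, 10]
# with uniform weights w_i = |D| / N.
N = 200
x = np.linspace(0.0, 10.0, N)
w = np.full(N, (x[-1] - x[0]) / N)

# Assumed exponential covariance kernel with correlation length 2.
C = np.exp(-np.abs(x[:, None] - x[None, :]) / 2.0)

# Symmetrized Nystrom discretization of Equation (8):
#   W^(1/2) C W^(1/2) psi_k = lambda_k psi_k,   phi_k = W^(-1/2) psi_k
sqrt_w = np.sqrt(w)
lam, psi = np.linalg.eigh(sqrt_w[:, None] * C * sqrt_w[None, :])
lam, psi = lam[::-1], psi[:, ::-1]      # largest eigenvalues first
phi = psi / sqrt_w[:, None]             # eigenfunction values at the points x
```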

In most Finite-Element applications the random field H(x) is discretized right from the start as

\[ H_i = H(\mathbf{x}_i); \quad i = 1, \dots, N. \tag{9} \]

A spectral representation for the discretized random field is then obtained by

\[ H_i = \sum_{k=1}^{N} \varphi_k(\mathbf{x}_i) \, c_k = \sum_{k=1}^{N} \varphi_{ik} \, c_k. \tag{10} \]

Obviously, this is a matrix-vector multiplication

\[ \mathbf{H} = \boldsymbol{\Phi} \mathbf{c}. \tag{11} \]

The orthogonality condition for the columns of Φ becomes

\[ \boldsymbol{\Phi}^T \boldsymbol{\Phi} = \mathbf{I} \tag{12} \]

and the covariance matrix of the components of the coefficient vector c is

\[ \mathbf{C}_{cc} = \operatorname{diag}(\sigma_k^2). \tag{13} \]

Both conditions can be met if the columns φk of the matrix Φ solve the following eigenvalue problem:

\[ \mathbf{C}_{HH} \, \boldsymbol{\varphi}_k = \sigma_k^2 \, \boldsymbol{\varphi}_k; \quad k = 1, \dots, N. \tag{14} \]

Statistically, the Karhunen-Loève expansion is equivalent to a representation of the random field by means of a Principal Component Analysis (PCA).
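
A minimal sketch of the discrete representation in Equations (9)-(14), again assuming an exponential covariance model and Gaussian coefficients (both assumptions for illustration): the eigenvectors of the covariance matrix form the columns of Φ, the orthogonality condition (12) holds by construction, and realizations are generated as H = Φc with uncorrelated coefficients of variance σk².

```python
import numpy as np

rng = np.random.default_rng(1)

# Covariance matrix C_HH of the discretized field H_i = H(x_i); an assumed
# exponential model on N = 100 equally spaced points is used here.
x = np.linspace(0.0, 10.0, 100)
C_HH = np.exp(-np.abs(x[:, None] - x[None, :]) / 2.0)

# Eigenvalue problem (14): C_HH phi_k = sigma_k^2 phi_k
sigma2, Phi = np.linalg.eigh(C_HH)          # columns of Phi are the phi_k
sigma2, Phi = sigma2[::-1], Phi[:, ::-1]    # largest eigenvalues first

# Orthogonality condition (12): Phi^T Phi = I
assert np.allclose(Phi.T @ Phi, np.eye(len(x)))

# Coefficients with C_cc = diag(sigma_k^2), Equation (13); the Gaussian
# distribution of the c_k is an additional assumption.
sigma = np.sqrt(np.clip(sigma2, 0.0, None))
c = sigma[:, None] * rng.standard_normal((len(x), 500))

# Spectral representation (10)/(11): H = Phi c, one realization per column.
# Truncating to the leading k columns reproduces a principal component analysis.
H = Phi @ c
```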

There are engineering applications in which the values of a structural property are known (e.g. from measurements) at certain selected locations. In geotechnical applications this may be a specific soil property which can be determined through boreholes. Between these locations, however, a random variability is assumed. The strategy to deal with this relies on a regression approach. First, we assume that without any measurements the structural property under consideration can be modeled by a zero-mean random field H(x). This field is then modified into the conditional field Ĥ(x) by taking the additional knowledge into account.

Assume that the values of the random field H(x) are known at the locations xk, k = 1, ..., m. We then write a stochastic interpolation for the conditional random field:

\[ \hat{H}(\mathbf{x}) = a(\mathbf{x}) + \sum_{k=1}^{m} b_k(\mathbf{x}) \, H(\mathbf{x}_k) \tag{15} \]

in which a(x) and bk(x) are random interpolating functions whose statistics have yet to be determined. They are chosen to make the mean value of the difference between the random field and the conditional field zero, i.e. E[H(x) − Ĥ(x)] = 0, and to minimize the variance of the difference, i.e. E[(H(x) − Ĥ(x))²] → Min.

Carrying out the analysis, we obtain an expression for the mean value of the conditional random field:

\[ \bar{\hat{H}}(\mathbf{x}) = \left[ C_{HH}(\mathbf{x}, \mathbf{x}_1), \dots, C_{HH}(\mathbf{x}, \mathbf{x}_m) \right] \mathbf{C}_{HH}^{-1} \begin{bmatrix} H(\mathbf{x}_1) \\ \vdots \\ H(\mathbf{x}_m) \end{bmatrix}. \tag{16} \]

In this equation, the matrix C_HH denotes the covariance matrix of the random field H(x) at the locations of the measurements. The covariance function of the conditional random field is given by

\[ C_{\hat{H}\hat{H}}(\mathbf{x}, \mathbf{y}) = C_{HH}(\mathbf{x}, \mathbf{y}) - \left[ C_{HH}(\mathbf{x}, \mathbf{x}_1), \dots, C_{HH}(\mathbf{x}, \mathbf{x}_m) \right] \mathbf{C}_{HH}^{-1} \begin{bmatrix} C_{HH}(\mathbf{y}, \mathbf{x}_1) \\ \vdots \\ C_{HH}(\mathbf{y}, \mathbf{x}_m) \end{bmatrix}. \tag{17} \]
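
A minimal sketch of these conditioning formulas, assuming a zero-mean field with an exponential covariance model and hypothetical measurement locations and values (none of these specifics come from the text):

```python
import numpy as np

def cov(a, b, ell=2.0):
    """Assumed exponential covariance function between two sets of 1-D points."""
    return np.exp(-np.abs(a[:, None] - b[None, :]) / ell)

# Hypothetical measurement locations x_k and measured values H(x_k).
x_meas = np.array([1.0, 4.0, 7.5])
H_meas = np.array([0.3, -0.8, 0.5])

# Locations at which the conditional field is evaluated.
x = np.linspace(0.0, 10.0, 100)

C_mm = cov(x_meas, x_meas)          # covariance matrix at the measurements
C_xm = cov(x, x_meas)               # C_HH(x, x_k)

# Mean of the conditional random field, Equation (16).
mean_cond = C_xm @ np.linalg.solve(C_mm, H_meas)

# Covariance of the conditional random field, Equation (17).
C_cond = cov(x, x) - C_xm @ np.linalg.solve(C_mm, C_xm.T)
```

In this sketch, np.linalg.solve is used instead of forming the explicit inverse of the measurement covariance matrix, which is the usual choice for numerical stability.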