

3.3 Black-Box Metamodels of Simulation Models

DOE treats the simulation model as a black box; i.e., only the inputs and outputs are observed and analyzed. For example, in the simulation of the $ t$ statistic (in Sect. 3.2) the simulation inputs (listed in Step 1) are $ \mu$ (mean), $ \sigma ^{2}$ (variance), $ n$ (sample size), and $ m$ (number of macro-replicates); this $ m$ is a tactical factor that is probably not of interest to the user. Suppose the user is interested in the $ 90\,{\%}$ quantile of the distribution function of the statistic when the inputs are nonnormal. A black-box representation of this example is:

$\displaystyle t_{n - 1; 0.90} = t(\mu , \sigma , n, r_{0} )\,,$ (3.3)

where $ t(.)$ denotes the mathematical function implicitly defined by the simulation program (outlined in Steps 1 through 6 in Sect. 3.2); $ \mu$ and $ \sigma $ now denote the parameters of the nonnormal distribution of the input $ x_{i}$ (for example, $ \mu$ denotes how many exponential distributions with parameter $ \sigma = \lambda$ are summed to form an Erlang distribution); $ r_{0} $ denotes the seed of the pseudorandom numbers.
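
To make the black-box view concrete, the following Python sketch implements one plausible version of $ t(.)$ in (3.3): it estimates $ t_{n - 1; 0.90} $ by drawing Erlang-distributed inputs (the sum of $ \mu$ exponentials with rate $ \sigma = \lambda$) in each of $ m$ macro-replicates. The function name, the default value of $ m$, and the example input values are illustrative assumptions, not part of the original example.

\begin{verbatim}
import numpy as np


def t_black_box(mu, sigma, n, r0, m=1000):
    """Estimate the 90% quantile t_{n-1;0.90} of the t statistic; see (3.3)."""
    rng = np.random.default_rng(r0)            # r0: seed of the pseudorandom numbers
    true_mean = mu / sigma                     # mean of an Erlang(mu, sigma) variate
    t_values = np.empty(m)
    for j in range(m):                         # m macro-replicates (tactical factor)
        # Erlang(mu, sigma) = Gamma with integer shape mu and scale 1/sigma
        x = rng.gamma(shape=mu, scale=1.0 / sigma, size=n)
        x_bar, s = x.mean(), x.std(ddof=1)
        t_values[j] = (x_bar - true_mean) / (s / np.sqrt(n))
    return np.quantile(t_values, 0.90)


# One call of the black box for an arbitrary input combination
print(t_black_box(mu=2, sigma=1.0, n=10, r0=12345))
\end{verbatim}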

One possible metamodel of the black-box model in (3.3) is a Taylor series approximation, truncated after the first-order effects of the three factors $ \mu$, $ \sigma$, and $ n$:

$\displaystyle y = \beta _{0} + \beta _{1} \mu + \beta _{2} \sigma + \beta _{3} n+e\,,$ (3.4)

where $ y$ is the metamodel predictor of the simulation output $ t_{n - 1; 0.90} $ in (3.3); $ \boldsymbol{\beta }^{T}=(\beta _{0} , \beta _{1} , \beta _{2} , \beta _{3} ) $ denotes the parameters of the metamodel in (3.4), and $ e$ is the noise - which includes both lack of fit of the metamodel and intrinsic noise caused by the pseudorandom numbers.
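
As an illustration of how $ \boldsymbol{\beta }$ in (3.4) might be estimated (anticipating the designs and least-squares analysis of the next sections), the following sketch fits the first-order polynomial by ordinary least squares to input/output data generated with the t_black_box sketch above; the factor levels and the common seed are arbitrary assumptions, not a recommended design.

\begin{verbatim}
import numpy as np
from itertools import product

# Assumed (arbitrary) levels for the three factors mu, sigma, and n
mu_levels, sigma_levels, n_levels = [1, 2, 4], [0.5, 1.0, 2.0], [5, 10, 25]

X_rows, y = [], []
for mu, sigma, n in product(mu_levels, sigma_levels, n_levels):
    X_rows.append([1.0, mu, sigma, n])              # regressors: intercept, mu, sigma, n
    y.append(t_black_box(mu, sigma, n, r0=12345))   # simulation output for this combination

X, y = np.array(X_rows), np.array(y)
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)    # ordinary least squares for (3.4)
print(dict(zip(["beta_0", "beta_1", "beta_2", "beta_3"], beta_hat)))
\end{verbatim}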

Besides the metamodel specified in (3.4), there are many alternative metamodels. For example, taking the logarithm of the inputs and outputs in (3.4) makes the first-order polynomial approximate relative changes; i.e., the parameters $ \beta_{1}$, $ \beta_{2}$, and $ \beta_{3}$ become elasticity coefficients.
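
For illustration (this display is not part of the original equation numbering), the log-transformed variant of (3.4) reads

$\displaystyle \log y = \beta _{0} + \beta _{1} \log \mu + \beta _{2} \log \sigma + \beta _{3} \log n + e\,,$

so that - ignoring the noise - $ \beta _{1} = \partial \log y/\partial \log \mu $; i.e., $ \beta _{1}$ approximates the percentage change in $ y$ per one percent change in $ \mu$, which is precisely an elasticity coefficient.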

There are many - more complex - types of metamodels. Examples are Kriging models, neural nets, radial basis functions, splines, support vector regression, and wavelets; see the various chapters in Part III - especially Chaps. III.5 (by Loader), III.7 (Müller), III.8 (Cizek), and III.15 (Laskov and Müller) - and also Clarke, Griebsch, and Simpson (2003) and Antoniadis and Pham (1998). I, however, will focus on two types that have established a track record in simulation: linear regression metamodels, such as the first-order polynomial in (3.4), and Kriging models.

To estimate the parameters of whichever metamodel is chosen, the analysts must experiment with the simulation model; i.e., they must change the inputs (or factors) of the simulation, run the simulation, and analyze the resulting input/output data. This experimentation is the topic of the next sections.

