In many practical situations the attributes of a system randomly
change over time. Examples
include the number of customers in a checkout line, congestion
on a highway, the number of items in a warehouse, and the price
of a financial security, to name a few.
In certain instances, it is possible to describe an underlying
process that explains how the variability occurs.
When aspects of the process are governed by probability
theory, we have a stochastic process.

The first step in modeling a dynamic process
is to define the set of states over which it can range and to
characterize the mechanisms that govern its transitions.
The state is like a snapshot of the system at a point
in time. It is an abstraction of reality that describes
the attributes of the system of interest; in the checkout line example, the state might simply be
the number of customers present. Time is the linear measure through which the system moves
and can be thought of as a parameter.
Because of time there is a past, present, and future. We usually know the trajectory a system
has followed to arrive at its present state. Using this information, our goal is to predict the future behavior
of the system in terms of a basic set of attributes. As we shall see, a variety of analytic
techniques are available for this purpose.

From a modeling point of view, state and time
can be treated as either continuous or discrete.
Both theoretical and computational considerations, however,
argue in favor of the discrete state case, so this will be our
focus. We consider both discrete time and continuous time models.
To obtain computational tractability we assume that the stochastic
process satisfies the Markov condition. That is, the path the
process takes in the future depends only on the current state,
and not on the sequence of states visited prior to the current
state. For the discrete time system this leads to the Markov
Chain model. For the continuous time system the model is called a Markov
Process.
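
To make the Markov condition concrete, the following sketch simulates a small discrete time Markov chain. The three-state transition matrix is invented purely for illustration; the point is that each step samples the next state from a distribution that depends only on the current state, never on the earlier history.

```python
import random

# Hypothetical transition matrix for a 3-state chain (each row sums to 1).
# Entry P[i][j] is the probability of moving from state i to state j.
P = [
    [0.6, 0.3, 0.1],
    [0.2, 0.5, 0.3],
    [0.1, 0.4, 0.5],
]

def simulate_chain(P, start, steps, seed=0):
    """Simulate a discrete time Markov chain for a fixed number of steps.

    The next state is drawn using only the row of P for the current
    state -- the Markov condition: states visited earlier are ignored.
    """
    rng = random.Random(seed)
    state = start
    path = [state]
    for _ in range(steps):
        state = rng.choices(range(len(P)), weights=P[state])[0]
        path.append(state)
    return path

print(simulate_chain(P, start=0, steps=10))
```
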
The model of a stochastic process describes
activities that culminate in events.
The events cause a transition from one state to another. Because activity durations are assumed to be continuous random
variables, events occur in the continuum of time. This section provides the vocabulary used
in conjunction with continuous time stochastic processes along
with an example in which the model is useful. Two other sections
in this Modeling part of the site will focus respectively on
Markov Chains and Markov Processes.
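
As a rough illustration of events occurring in the continuum of time, the sketch below draws activity durations from a continuous distribution (exponential, an assumption made only for this example) and advances a simulation clock from one event to the next; each completed activity is an event that triggers a state transition. The cyclic transition rule is likewise hypothetical and chosen only to keep the fragment short.

```python
import random

def simulate_continuous(rates, start, horizon, seed=0):
    """Sketch of a continuous time process driven by random activity durations.

    rates[i] is the assumed rate of the exponential holding time in state i.
    When the activity completes, an event occurs and the state changes.
    """
    rng = random.Random(seed)
    t, state = 0.0, start
    events = []
    while True:
        duration = rng.expovariate(rates[state])  # continuous random duration
        t += duration
        if t > horizon:
            break
        state = (state + 1) % len(rates)          # event: move to the next state
        events.append((round(t, 3), state))
    return events

print(simulate_continuous(rates=[1.0, 2.0, 0.5], start=0, horizon=5.0))
```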