To illustrate the DP Solver we use an example similar to the one used in the description of the Markov chain model, except that now there are two maintenance options. We are concerned with the maintenance policy for the light bulbs in a large lighted sign. The manager wants the policy that minimizes the discounted net present worth of the cost of operating the sign. The manager can inspect each bulb at the end of every month of its life at a cost of $0.50 per inspection. If the bulb is found to have failed, it is replaced at the beginning of the next month at a cost of $2. Alternatively, the manager may simply replace the bulb, without inspection, based on its age. Working bulbs may be discarded by this process, but the price of a new bulb is reduced by $0.30 when it is replaced under this alternative maintenance procedure. The cost reduction reflects economies of scale from the larger number of bulbs that will be purchased. The $0.50 inspection cost is not incurred with the replace option.

The Situation with Inspection

With the inspection option, the state/transition diagram is shown below. The states are the circles (or nodes) in the figure, and the number in each node is the age of the bulb: a new bulb has age 0, and the remaining ages are 1, 2, 3, and 4 months. The directed line segments (or arcs) are the transitions between states. The black arcs indicate aging of the bulb, and the red arcs indicate replacement of the bulb when it has failed. At the end of each month the bulb is inspected. If the bulb has failed, it is replaced at the beginning of the next month and the system moves to the New state. If the bulb survives, it becomes one month older. The transition probabilities show the effects of aging. The new bulb exhibits the "infant mortality" characteristic: perhaps due to manufacturing defects, the probability that a new bulb fails during its first month is greater than the probability that a one-month-old bulb fails during its second month of life. As the bulb ages further, the probability of failure grows, reaching its highest value in the fourth month. The bulb may survive beyond four months, but the conditional probability of failure after the fourth month remains constant at that value.
The matrices below the network show the transition probabilities and state costs. In each state there are two probabilities: the probability that the bulb fails and must be replaced, and the probability that it does not fail and becomes one month older. The model is limited to four months, so we assume that a not-fail event in month 4 leaves the system in state 4. At each age the bulb is inspected at a cost of $0.50. The cost of a new bulb, $2, is assigned to state New; every time the system enters the New state a bulb is purchased.
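The text describes the failure probabilities only qualitatively, but the structure of the chain is easy to set up. Below is a minimal sketch in Python that builds the transition matrix and state-cost vector for the always-inspect policy. The failure probabilities are hypothetical placeholders chosen only to follow the pattern described above (infant mortality, increasing risk with age, constant beyond month 4); the actual values are those entered on the add-in's worksheets.

```python
import numpy as np

# States: 0 = New, 1-4 = age of the bulb in months.
# Hypothetical failure probabilities by age (placeholders only; the real
# numbers come from the add-in's Matrix page).
p_fail = np.array([0.5, 0.2, 0.3, 0.6, 0.6])

n = len(p_fail)
P = np.zeros((n, n))
for i in range(n):
    P[i, 0] = p_fail[i]                        # failure: replace, go to New
    P[i, min(i + 1, n - 1)] += 1 - p_fail[i]   # survival: age one month (state 4 absorbs further aging)

# State costs for the inspect policy: $0.50 inspection in every state,
# plus the $2.00 bulb assigned to the New state.
cost = np.full(n, 0.50)
cost[0] += 2.00
```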
The situation can be analyzed with the Markov Analysis add-in. The transition matrix is placed directly on the Matrix page of the Markov chain model, as shown below.
The economics of the situation are shown in the Economic Data worksheet. Note that we are using a discount rate of 1% per month.
The steady-state analysis shown below reveals interesting information about the bulb. Rounding to the nearest whole percentage and the nearest cent, we see that a new bulb is purchased in 43% of the months, and the average cost of maintaining the bulb is $1.37 per month. Starting from the New state, the discounted cost (net present worth, or NPW) at an interest rate of 1% per month is $138.90. The NPW is important in this context because it is the measure that will be minimized by the DP model.
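For readers who want to see the calculations behind these figures, the sketch below computes the steady-state probabilities, the average cost per month, and the discounted NPW by starting state for the inspect-only chain. It reuses the placeholder probabilities from the previous sketch, so its output only illustrates the method; the 43%, $1.37, and $138.90 reported above come from the data in the add-in.

```python
import numpy as np

# Placeholder chain from the previous sketch (hypothetical probabilities).
p_fail = np.array([0.5, 0.2, 0.3, 0.6, 0.6])
n = len(p_fail)
P = np.zeros((n, n))
for i in range(n):
    P[i, 0] = p_fail[i]
    P[i, min(i + 1, n - 1)] += 1 - p_fail[i]
cost = np.full(n, 0.50)
cost[0] += 2.00

# Steady-state distribution: solve pi P = pi together with sum(pi) = 1.
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.append(np.zeros(n), 1.0)
pi = np.linalg.lstsq(A, b, rcond=None)[0]

# Discounted NPW by starting state: v = cost + beta * P v,
# so v = (I - beta P)^(-1) cost with beta = 1/1.01 for 1% per month.
beta = 1 / 1.01
npw = np.linalg.solve(np.eye(n) - beta * P, cost)

print("fraction of months a new bulb is purchased:", round(pi[0], 3))
print("average cost per month:", round(pi @ cost, 2))
print("NPW starting from the New state:", round(npw[0], 2))
```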

Replacement, An Alternative to Inspection

Rather than inspecting each month, we could adopt a policy of replacing the bulb without inspection. This is the replace action. For example, if we adopt the replace option for the New state, the bulb is replaced at the end of its first month without inspection. A working bulb may be discarded by the replace option, but the inspection cost is not incurred. We also assume that new bulbs cost $0.30 less with the replace option, resulting in a unit cost of $1.70. The transition network for this system is shown in the figure.
With the replace option, regardless of the starting state, the system moves immediately to the New state and remains there. The first column of the Markov transition matrix is filled with the probability 1, while the other columns are all 0's. This trivial Markov chain has a cost of $1.70 per period. Its expected NPW is the immediate $1.70 plus a perpetuity of $1.70 per month: 1.70 + (1.70/0.01) = $171.70. The replace option is clearly more expensive than the inspect option.
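The arithmetic behind the $171.70 figure is just the present worth of a perpetuity: a $1.70 bulb is bought now and in every future month, discounted at 1% per month. A quick check in Python:

```python
c, i = 1.70, 0.01            # monthly cost and discount rate
beta = 1 / (1 + i)           # monthly discount factor

npw_closed = c + c / i       # immediate bulb plus a perpetuity of c per month
npw_series = c / (1 - beta)  # same value from the recursion v = c + beta * v
print(npw_closed, npw_series)    # both print 171.70 (up to floating-point rounding)
```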

To Inspect or Not to Inspect? That is the question.

We might ask whether a mixed policy is better than either always inspecting or always replacing; it may be cheaper to inspect in some states and replace in others. This is the question addressed by the Markov Decision Process. We use the DP Solver to model and solve the problem on the next page, but show the results below. The optimal solution is to inspect in states 0 through 2 and to replace in states 3 and 4.
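As a rough illustration of what the solver does internally, the sketch below runs value iteration on a two-action version of the bulb problem. The failure probabilities are again hypothetical placeholders, and the cost assignment (charging the inspection and the expected bulb purchase at the moment the action is taken) is a simplification of the worksheet data, so the policy and values it prints will match those reported above only when the add-in's actual data are substituted.

```python
import numpy as np

# Two actions in every state (bulb age): "inspect" or "replace".
# Hypothetical failure probabilities; the real data live in the add-in worksheets.
p_fail = np.array([0.5, 0.2, 0.3, 0.6, 0.6])
n = len(p_fail)
beta = 1 / 1.01                     # discount factor for 1% per month

# Transitions and expected one-month costs for each action (simplified
# cost assignment: charges are made when the action is taken).
P_inspect = np.zeros((n, n))
P_replace = np.zeros((n, n))
for i in range(n):
    P_inspect[i, 0] = p_fail[i]                       # failed bulb -> New
    P_inspect[i, min(i + 1, n - 1)] += 1 - p_fail[i]  # survivor ages one month
    P_replace[i, 0] = 1.0                             # replace: always back to New
c_inspect = 0.50 + 2.00 * p_fail    # inspection plus expected full-price bulb
c_replace = np.full(n, 1.70)        # reduced-price bulb, no inspection

# Value iteration for the discounted-cost MDP:
# v(i) = min over actions of [ c(i,a) + beta * sum_j P(j|i,a) * v(j) ].
v = np.zeros(n)
for _ in range(10000):
    q_inspect = c_inspect + beta * P_inspect @ v
    q_replace = c_replace + beta * P_replace @ v
    v_new = np.minimum(q_inspect, q_replace)
    if np.max(np.abs(v_new - v)) < 1e-10:
        break
    v = v_new

policy = np.where(q_inspect <= q_replace, "inspect", "replace")
print("policy by state:", dict(enumerate(policy)))
print("discounted cost by state:", np.round(v, 2))
```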
The NPW of this solution is $134.47, less than the $138.90 for the policy of inspecting in every state. Under this policy, state 4 is a transient state: once the process leaves state 4, it never returns.