The new problem was solved with the nonlinear programming code available in the Excel Solver. The algorithm embedded in this code, as in most nonlinear solvers, is an iterative procedure that starts from an initial solution and searches over the decision space until it reaches a feasible solution that cannot be improved within its neighborhood. Informally, this means that if we move a small amount in any feasible direction from the current point, the objective value either stays the same or gets worse. Such a solution is called a local maximum because it has the largest objective value in its local region. In contrast, a global maximum is a solution that provides the largest objective value among all feasible solutions. The Excel worksheet below shows one of the several local maxima found for this problem.
[Excel Solver worksheet showing the local maximum found in the first run]
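To make the distinction between local and global maxima concrete, the short sketch below maximizes a simple convex function over a box with a gradient-based local solver in SciPy. The function, bounds, and starting points are invented for illustration and are not the production model from the text; started from different points, the search typically stops at different corners of the box, each a local maximum, and only one of them is the global maximum.

# Local maxima of a convex objective: a gradient-based search started from
# different points stops at different corners of the box. The objective and
# bounds here are hypothetical and are not the production model from the text.
import numpy as np
from scipy.optimize import minimize

def f(x):
    # Convex function; over a box its local maxima lie at the corner points.
    return x[0]**2 + 2*x[1]**2

bounds = [(-1.0, 3.0), (-2.0, 1.0)]
for start in ([0.5, 0.5], [-0.5, -0.5], [2.0, -1.5]):
    # Maximize f by minimizing -f with a local, bound-constrained method.
    res = minimize(lambda x: -f(x), start, method="L-BFGS-B", bounds=bounds)
    print("start", start, "-> local maximum", np.round(res.x, 3),
          "objective", round(float(-res.fun), 3))

Each run stops at a corner from which no small feasible move increases the objective; a solver run in isolation has no way of knowing whether the corner it stops at is the best one.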
For this example, we tried five different starting points for the variables P, Q, and R. The initial values of the raw material amounts were then chosen so that the raw material constraints were satisfied. The results are shown in Table 4 in the rows labeled Final. In the second run, for example, the initial values were P = 50, Q = 50, R = 50; the algorithm converged to P = 26.7, Q = 40, R = 60, giving Z = 3510. The first run, whose final solution is illustrated in the worksheet above, was obtained by starting the Solver with all production quantities equal to 0. In the third run, the algorithm made no progress: it terminated at the same point at which it started, implying that P = 100, Q = 40, R = 0 is a local optimum.
Table 4. Multiple Local Maxima for the Convex Objective Function

Run          1      2      3      4      5
Initial P    0      50     100    90     50
Initial Q    0      50     40     0      20
Initial R    0      50     0      50     60
Final P      0      26.7   100    90     81.8
Final Q      40     40     40     0      16.4
Final R      60     60     0      60     60
Final A      1000   1533   2400   2400   2400
Final B      2080   2400   2320   2040   2400
Final C      1200   1600   1740   2310   2285
Final D      600    867    1600   900    1063
Final Z      3350   3510   3650   3772   3787
The results in Table 4 illustrate the difficulty encountered when maximizing a convex function (or minimizing a concave function), regardless of the shape of the feasible region. To a large extent, the path taken by a solution algorithm and the point to which it converges depend on the initial point chosen. All the solutions identified in the table are local optima; because we do not know whether every local optimum has been found, we cannot be sure that the best of them (Z = 3787 in run 5) is the global optimum. Unfortunately, this predicament is all too common when trying to optimize nonlinear functions, and it becomes even more troublesome when there are nonlinearities in the constraints. As the number of nonlinear terms in a problem increases, the amount of work required to find solutions typically grows at an exponential rate.
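One practical response is to automate the restart experiment summarized in Table 4: run the local solver from many starting points and keep the best local maximum found, while remembering that this carries no guarantee of global optimality. The sketch below does this with SciPy's SLSQP method on a hypothetical three-variable model with linear resource constraints; the profit function, usage coefficients, and limits are invented for illustration and are not the model from the text.

# Multistart with a local NLP solver: restart from several initial points and
# keep the best local maximum found. All coefficients are hypothetical
# stand-ins for the production model discussed in the text.
import numpy as np
from scipy.optimize import minimize

def profit(x):
    P, Q, R = x
    # Convex objective (increasing marginal profit), so several local maxima exist.
    return 10*P + 0.05*P**2 + 12*Q + 0.04*Q**2 + 15*R + 0.03*R**2

use = np.array([[8.0, 6.0, 5.0],   # raw material needed per unit of P, Q, R
                [4.0, 9.0, 7.0],
                [6.0, 5.0, 8.0]])
limits = np.array([1200.0, 1200.0, 1200.0])
cons = [{"type": "ineq", "fun": lambda x: limits - use @ x}]  # usage <= limits
bounds = [(0.0, 100.0)] * 3

best = None
for start in ([0, 0, 0], [50, 50, 50], [100, 40, 0], [90, 0, 50], [50, 20, 60]):
    res = minimize(lambda x: -profit(x), np.array(start, dtype=float),
                   method="SLSQP", bounds=bounds, constraints=cons)
    print("start", start, "-> Z =", round(float(-res.fun), 1), "at", np.round(res.x, 1))
    if res.success and (best is None or -res.fun > best[0]):
        best = (-res.fun, res.x)

print("best local maximum found: Z =", round(float(best[0]), 1), "at", np.round(best[1], 1))

Keeping the best of several runs improves the odds of locating the global maximum but, as noted above, cannot certify it.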
The extension of the original linear model to include nonlinear relationships adds more realism to the analysis, but one factor is still missing: we have not included any requirement that the production and sales quantities be integers. Integer nonlinear programming problems are generally more difficult to solve than either their continuous nonlinear or their integer linear counterparts. Most codes for such problems use ad hoc procedures that combine branch and bound with standard NLP solvers.
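To indicate how such a combination might look, the skeleton below wraps a local NLP solver (SciPy's SLSQP) in a simple branch-and-bound loop that branches on fractional variables. The model data are the same hypothetical coefficients used in the multistart sketch above, and because a local solve of a nonconvex relaxation gives no guaranteed bound, the pruning step is only heuristic, in keeping with the ad hoc character mentioned in the text.

# A minimal branch-and-bound skeleton around a local NLP solver. The model is
# a hypothetical stand-in; for a nonconvex problem the relaxation value from a
# local solver is not a valid bound, so the pruning below is heuristic only.
import math
import numpy as np
from scipy.optimize import minimize

def profit(x):
    P, Q, R = x
    return 10*P + 0.05*P**2 + 12*Q + 0.04*Q**2 + 15*R + 0.03*R**2

use = np.array([[8.0, 6.0, 5.0],
                [4.0, 9.0, 7.0],
                [6.0, 5.0, 8.0]])
limits = np.array([1200.0, 1200.0, 1200.0])
cons = [{"type": "ineq", "fun": lambda x: limits - use @ x}]

def solve_relaxation(bounds):
    # Local solve of the continuous relaxation, started at the lower bounds.
    x0 = np.array([lo for lo, hi in bounds], dtype=float)
    res = minimize(lambda x: -profit(x), x0, method="SLSQP",
                   bounds=bounds, constraints=cons)
    return (-res.fun, res.x) if res.success else (None, None)

best_val, best_x = -math.inf, None
stack = [[(0.0, 100.0)] * 3]        # each node is a list of (lower, upper) bounds
nodes = 0
while stack and nodes < 200:        # node limit keeps the sketch finite and fast
    nodes += 1
    bounds = stack.pop()
    val, x = solve_relaxation(bounds)
    if x is None or val <= best_val:
        continue                    # infeasible, or no better than the incumbent
    frac = [i for i, v in enumerate(x) if abs(v - round(v)) > 1e-6]
    if not frac:
        best_val, best_x = val, x   # all variables integral: new incumbent
        continue
    i = frac[0]                     # branch on the first fractional variable
    lo, hi = bounds[i]
    down, up = list(bounds), list(bounds)
    down[i] = (lo, math.floor(x[i]))
    up[i] = (math.ceil(x[i]), hi)
    stack.extend([down, up])

print("best integer solution found:", best_x, "with Z =", best_val)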