Parametric Analysis
After solving a linear programme, sensitivity analysis is immediately available to show what happens when there are small changes in a single piece of data. However, it is also useful to see how the solution changes as a single piece of data varies outside the range determined by sensitivity analysis. This is known as parametric analysis. Some OR software packages provide specific parametric analysis features, e.g., Storm, whereas others can perform fast parametric analysis using looping and "hotstarts" from previous optimal solutions, e.g., AMPL.
Using Storm
<img src="parametric_storm.jpg" alt="Storm parametric analysis output" />
This analysis shows how the objective and shadow price change as a right-hand side value increases (from 230 to 500, then 500 upwards). It also shows how the objective and shadow price change as the right-hand side value decreases (from 230 to 140, 140 to 120, 120 to 0, then 0 downwards). Similar tables are also available in Storm for changes in the cost coefficients. These tables assume that only one quantity (right-hand side or cost coefficient) is changing.
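To make the shape of such a table concrete, here is a small hand-checkable sketch. It is not produced by Storm; it uses Python with scipy (an assumption of this sketch, not part of the original workflow) on a toy problem: minimise 2x subject to x >= b and x <= 10. The optimal objective is 2b with shadow price 2 for 0 <= b <= 10, flattens out at 0 below b = 0, and the problem is infeasible above b = 10 -- the same three-regime pattern the Storm table records.

```python
# Toy parametric analysis: minimise 2x subject to x >= b, 0 <= x <= 10.
from scipy.optimize import linprog

def solve_toy(b):
    """Return (objective, shadow price of x >= b), or (None, None)
    if the LP is infeasible for this right-hand side b."""
    # linprog wants A_ub x <= b_ub, so x >= b becomes -x <= -b.
    res = linprog(c=[2.0], A_ub=[[-1.0], [1.0]], b_ub=[-b, 10.0],
                  bounds=[(0.0, None)], method="highs")
    if not res.success:
        return None, None
    # The row was negated, so the shadow price of x >= b is minus
    # the marginal scipy reports for the row -x <= -b.
    return res.fun, -res.ineqlin.marginals[0]

# Sweep b through the three regimes: slack, binding, infeasible.
for b in (-3.0, 5.0, 12.0):
    print(b, solve_toy(b))
```

With b = 5 this gives objective 10 and shadow price 2; with b = -3 the constraint is slack (objective 0, shadow price 0); with b = 12 the LP is infeasible, exactly the breakpoint structure a parametric table summarises.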
Using AMPL
AMPL provides parametric analysis by allowing data to change dynamically and by "hotstarting" from previous optimal solutions. Moreover, AMPL allows more than one piece of data to change before a hotstart, enabling parametric analysis of multiple pieces of data simultaneously.
Emulating Storm's Parametric Analysis
To show you how to create parametric analysis similar to Storm's, we will revisit the
Whiskas Cat Food problem. Using whiskas.mod and whiskas.dat with the following script file gives
sensitivity analysis for the problem:
\begin{verbatim}
reset;
model whiskas.mod;
data whiskas.dat;

option presolve 0;                              # Turn AMPL's presolve off
option solver cplex;
option cplex_options 'presolve 0 sensitivity';  # Turn CPLEX's presolve off
                                                # and sensitivity analysis on
solve;

display Amount;
display Amount.down, Amount.current, Amount.up;
display _conname, _con.body, _con.down, _con.current, _con.up, _con.dual;
\end{verbatim}
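The same optimal solution and duals can be checked outside AMPL. The sketch below uses Python with scipy (an alternative route, not part of the original AMPL+CPLEX workflow); the cost and nutrient data are assumed from the well-known Whiskas case study (percent per gram of each ingredient) and may differ from whiskas.dat, so check your own numbers. Note that scipy reports optimal duals but not the down/current/up ranging information CPLEX provides.

```python
# Whiskas LP solved with scipy; data assumed from the public case study.
from scipy.optimize import linprog

# Ingredients: CHICKEN, BEEF, MUTTON, RICE, WHEAT, GEL (assumed data)
COST    = [0.013, 0.008, 0.010, 0.002, 0.005, 0.001]
PROTEIN = [0.100, 0.200, 0.150, 0.000, 0.040, 0.000]
FAT     = [0.080, 0.100, 0.110, 0.010, 0.010, 0.000]
FIBRE   = [0.001, 0.005, 0.003, 0.100, 0.150, 0.000]
SALT    = [0.002, 0.005, 0.007, 0.002, 0.008, 0.000]

def solve_whiskas(fat_min=6.0, protein_min=8.0):
    """Minimise can cost; return (objective, duals of the four
    nutrient requirements) or (None, None) if infeasible."""
    # >= rows are negated to fit linprog's A_ub x <= b_ub form.
    A_ub = [[-p for p in PROTEIN],   # protein >= protein_min
            [-f for f in FAT],       # fat     >= fat_min
            FIBRE,                   # fibre   <= 2
            SALT]                    # salt    <= 0.4
    b_ub = [-protein_min, -fat_min, 2.0, 0.4]
    res = linprog(COST, A_ub=A_ub, b_ub=b_ub,
                  A_eq=[[1.0] * 6], b_eq=[100.0],  # a full 100g can
                  method="highs")
    if not res.success:
        return None, None
    # Negate the first two marginals so the duals are reported
    # with respect to the original >= requirements.
    m = res.ineqlin.marginals
    return res.fun, [-m[0], -m[1], m[2], m[3]]

obj, duals = solve_whiskas()
print(obj, duals)
```

With this assumed data the objective is 0.52 and the FAT dual is 0.07, matching the sensitivity figures discussed below.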
To build a STORM parametric analysis on the {\tt FAT} requirement we perform repeated
sensitivity analysis on this problem. The initial sensitivity analysis
shows that {\tt
MeetRequirements['FAT']} has an Allowable Min of 4, an Allowable Max of 8 and a Shadow Price of 0.07. This gives the following parts of a STORM parametric analysis:
\begin{verbatim}
                          Shadow
         From        To    Price
RHS         6         8     0.07
Obj      0.52

RHS         6         4     0.07
Obj      0.52
\end{verbatim}
Setting {\tt Min['FAT']} to be {\tt 8} and resolving gives an objective function value of 0.66 (when {\tt MeetRequirements['FAT']} is equal to 8). Setting {\tt Min['FAT']} to be just "a little" past 8, i.e., {\tt 8.1}, gives the next part of the parametric analysis:
\begin{verbatim}
                          Shadow
         From        To    Price
RHS         6         8     0.07
Obj      0.52      0.66

RHS         8   8.33333     0.92
Obj      0.66   0.96666

RHS         6         4     0.07
Obj      0.52
\end{verbatim}
Next, setting {\tt Min['FAT']} to be 8.4 gives the last of the upwards parametric analysis:
\begin{verbatim}
                          Shadow
         From        To    Price
RHS         6         8     0.07
Obj      0.52      0.66

RHS         8   8.33333     0.92
Obj      0.66   0.96666

RHS   8.33333  Infinity  ---- Infeasible in this range ----

RHS         6         4     0.07
Obj      0.52
\end{verbatim}
Repeating this process, but going downwards (to 4 and below) gives the remainder of the table:
\begin{verbatim}
                          Shadow
         From        To    Price
RHS         6         8     0.07
Obj      0.52      0.66

RHS         8   8.33333     0.92
Obj      0.66   0.96666

RHS   8.33333  Infinity  ---- Infeasible in this range ----

RHS         6         4     0.07
Obj      0.52      0.38

RHS         4 -Infinity     0.00
Obj      0.38      0.38
\end{verbatim}
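The "resolve at each breakpoint" process above can also be sketched outside AMPL. The block below uses Python with scipy (an assumption of this sketch, not the original Storm/AMPL route) and the same assumed Whiskas case-study data as may differ from whiskas.dat; with different data the breakpoints and shadow prices will not match the tables above exactly. Sweeping {\tt Min['FAT']} and recording the objective and FAT dual exposes the breakpoints directly.

```python
# Sweep Min['FAT'] and record (objective, FAT shadow price) at each value.
from scipy.optimize import linprog

# Ingredients: CHICKEN, BEEF, MUTTON, RICE, WHEAT, GEL (assumed data)
COST    = [0.013, 0.008, 0.010, 0.002, 0.005, 0.001]
PROTEIN = [0.100, 0.200, 0.150, 0.000, 0.040, 0.000]
FAT     = [0.080, 0.100, 0.110, 0.010, 0.010, 0.000]
FIBRE   = [0.001, 0.005, 0.003, 0.100, 0.150, 0.000]
SALT    = [0.002, 0.005, 0.007, 0.002, 0.008, 0.000]

def objective_and_fat_dual(fat_min):
    """(objective, FAT shadow price) for a given Min['FAT'],
    or (None, None) if the LP is infeasible."""
    # >= rows negated for linprog's A_ub x <= b_ub form.
    A_ub = [[-p for p in PROTEIN], [-f for f in FAT], FIBRE, SALT]
    b_ub = [-8.0, -fat_min, 2.0, 0.4]
    res = linprog(COST, A_ub=A_ub, b_ub=b_ub,
                  A_eq=[[1.0] * 6], b_eq=[100.0], method="highs")
    if not res.success:
        return None, None
    return res.fun, -res.ineqlin.marginals[1]  # FAT row was negated

# Record the objective and dual over a range of FAT requirements.
table = {f: objective_and_fat_dual(f) for f in (2.0, 4.0, 6.0, 8.0)}
for f, (obj, dual) in sorted(table.items()):
    print(f, obj, dual)
```

With this assumed data the objective is 0.52 at a FAT minimum of 6 (dual 0.07), 0.66 at 8, and flattens out at 0.38 (dual 0) below 4, reproducing the breakpoint structure of a parametric table.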
We can also find the entering and leaving variables from STORM's parametric analysis by looking at {\tt _var.sstatus} in AMPL (remember that both the AMPL presolve and the CPLEX presolve must be turned off).
Parametric Analysis using Loops
While reproducing STORM's parametric analysis is useful, AMPL allows us greater flexibility when doing parametric analysis. We can loop over a number of data parameter values, resolving and examining our solution each time. This analysis has two advantages over STORM's: 1) you can change multiple values at once; 2) you get the objective function value for every value you want to try.
For example, the original sensitivity analysis for The Whiskas Cat Food Problem shows that we can change the objective by changing the nutritional requirements for {\tt FAT} and {\tt ONECAN}. Since {\tt ONECAN} ensures we produce a full can, we cannot change this constraint. Let's use a loop to decrease {\tt Min['FAT']}, keeping an eye on the sensitivity analysis at the same time:
\begin{verbatim}
option omit_zero_rows 1;

for {i in 6..0 by -1} {
  let Min['FAT'] := i;
  solve;
  printf "Min['FAT'] = %g\n", i;
  display Percentage;
  display _conname, _con.down, _con.current, _con.up, _con.dual;
}
\end{verbatim}
The {\tt FAT} requirement's dual value (a.k.a. shadow price) drops to zero once its {\tt Min} is below {\tt 4}. Thus, no more savings can be made by reducing {\tt Min['FAT']}. However, the dual of the {\tt PROTEIN} requirement becomes non-zero, indicating that it now represents an opportunity for some savings. Using a {\tt for} loop, we can look at the relationship between the objective, {\tt Min['FAT']} and {\tt Min['PROTEIN']}:
\begin{verbatim}
set FAT_MINS := 6..0 by -1;
set PROTEIN_MINS := 8..0 by -2;

param Objective {FAT_MINS, PROTEIN_MINS};

option omit_zero_rows 1;

for {i in FAT_MINS, j in PROTEIN_MINS} {
  let Min['FAT'] := i;
  let Min['PROTEIN'] := j;
  solve;
  let Objective[i, j] := TotalCost;
  display Percentage;
  display _conname, _con.down, _con.body, _con.up, _con.dual;
}
\end{verbatim}
The change in the objective shows there is a boundary that determines which of the nutritional requirements has an effect. Above the boundary, changing {\tt Min['FAT']} has no effect, but changing {\tt Min['PROTEIN']} does. Below the boundary the reverse behaviour is observed.
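The same two-parameter sweep can be sketched in Python with scipy (an assumption of this sketch, not the original AMPL route), again using assumed Whiskas case-study data that may differ from whiskas.dat. Printing the grid of objective values makes the boundary visible: with this data the fat requirement forces beef up to 10 times {\tt Min['FAT']} grams while the protein requirement forces it up to 5 times {\tt Min['PROTEIN']} grams, so whichever is larger governs the cost and the boundary sits where they are equal.

```python
# Grid of optimal objectives over Min['FAT'] x Min['PROTEIN'] values.
from scipy.optimize import linprog

# Ingredients: CHICKEN, BEEF, MUTTON, RICE, WHEAT, GEL (assumed data)
COST    = [0.013, 0.008, 0.010, 0.002, 0.005, 0.001]
PROTEIN = [0.100, 0.200, 0.150, 0.000, 0.040, 0.000]
FAT     = [0.080, 0.100, 0.110, 0.010, 0.010, 0.000]
FIBRE   = [0.001, 0.005, 0.003, 0.100, 0.150, 0.000]
SALT    = [0.002, 0.005, 0.007, 0.002, 0.008, 0.000]

def objective(fat_min, protein_min):
    """Optimal can cost for the given requirements (None if infeasible)."""
    # >= rows negated for linprog's A_ub x <= b_ub form.
    A_ub = [[-p for p in PROTEIN], [-f for f in FAT], FIBRE, SALT]
    b_ub = [-protein_min, -fat_min, 2.0, 0.4]
    res = linprog(COST, A_ub=A_ub, b_ub=b_ub,
                  A_eq=[[1.0] * 6], b_eq=[100.0], method="highs")
    return res.fun if res.success else None

# Emulate the AMPL double loop: one solve per (fat, protein) pair.
grid = {(i, j): objective(i, j)
        for i in (6.0, 4.0, 2.0, 0.0)        # Min['FAT'] values
        for j in (8.0, 6.0, 4.0, 2.0, 0.0)}  # Min['PROTEIN'] values
for (i, j), obj in sorted(grid.items()):
    print(i, j, obj)
```

Reading the grid row by row shows the boundary: where protein governs, lowering the fat requirement leaves the cost unchanged, and vice versa below the boundary.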
By being able to perform parametric analysis on two data parameters simultaneously we are able to use AMPL to uncover this behaviour.
--
MichaelOSullivan - 04 Mar 2008