OptShadow(opt)
Returns the shadow prices, or dual values, for the constraints at the optimal solution.
The shadow price is the amount by which the objective function changes when the constraint is altered by increasing its right-hand side coefficient, b_i, by one unit. It is valid only for small changes in b_i, and mathematically it is defined as

    shadow price of constraint i = ∂Z/∂b_i

where Z is the optimal value of the objective function. In other words, it is the partial derivative of the objective function with respect to the constraint's RHS coefficient.
For a '<=' constraint in a maximization problem, the shadow price indicates how much the objective function improves when the constraint is relaxed. Shadow prices are usually meaningful when you think of them in these terms; however, pay attention to the partial-derivative definition to get the sign right.
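As a concrete illustration of both the derivative definition and the sign for a maximization, here is a minimal sketch. It is not Analytica code: it assumes Python with SciPy (version 1.7 or later, whose HiGHS backend reports dual values as ineqlin.marginals), and the two-variable LP is invented for the example.

    # Concept sketch, not Analytica: shadow prices for a tiny maximization LP,
    # computed with SciPy's linprog (HiGHS backend). The LP itself is made up.
    # Maximize 3x + 2y  subject to  x + y <= 4,  x <= 2,  x >= 0,  y >= 0.
    from scipy.optimize import linprog

    c = [-3, -2]                  # linprog minimizes, so negate the objective
    A_ub = [[1, 1],               # x + y <= 4
            [1, 0]]               # x     <= 2
    b_ub = [4, 2]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None), (0, None)], method="highs")

    # ineqlin.marginals gives d(objective)/d(b_i) for the minimization;
    # negate to recover the shadow prices of the original maximization.
    shadow = [-m for m in res.ineqlin.marginals]
    print(res.x, shadow)          # optimum x=2, y=2; shadow prices [2.0, 1.0]

Raising the first right-hand side from 4 to 5 lets the optimum move to (2, 3) and the objective improve from 10 to 12, which is exactly the shadow price of 2 reported for that constraint.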
The shadow price can only be computed for continuous optimization problems. Shadow prices do not exist for integer or mixed-integer optimizations, so they can be computed only if every variable in the optimization problem is continuous.
For continuous problems, whether the shadow price can be computed depends upon the problem type and solver engine used. The following table summarizes the combinations for which shadow price can be computed (QP = quadratic objective + linear constraints, QCP = quadratically constrained):
"Problem Type" Engine LP QP QCP NLP LP/Quadratic Y - - - SOCP Barrier Y Y N - GRG Nonlinear Y Y Y Y Evolutionary N N N N LSLP Y - - - LSGRG Y Y Y Y LSSQP Y Y Y - OptQuest N N N N MOSEK Y Y N N Knitro Y Y Y Y XPress Y - - - Gurobi Y - - -
This table may not be 100% accurate. [To do: empirically validate these entries]
Notes
When you relax a constraint, the objective will always either improve or stay the same. Thus, for a minimization problem, the shadow price of a '<' constraint is always negative or zero, and for a maximization problem it is positive or zero. For a '>' constraint, the opposite holds.
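The sign rule for a minimization can be checked numerically with the same kind of sketch as above. Again this is SciPy rather than Analytica, and the LP is invented for illustration; it shows that a '>' constraint in a minimization carries a non-negative shadow price.

    # Sign illustration, not Analytica: a minimization with a '>=' constraint.
    # Minimize 2x + 3y  subject to  x + y >= 4,  x >= 0,  y >= 0.
    # linprog takes '<=' rows, so x + y >= 4 is written as -x - y <= -4.
    from scipy.optimize import linprog

    res = linprog([2, 3], A_ub=[[-1, -1]], b_ub=[-4],
                  bounds=[(0, None), (0, None)], method="highs")

    m = res.ineqlin.marginals[0]   # d(objective)/d(RHS of the '<=' row) = -2
    print(res.x, res.fun, m)       # x=4, y=0, objective 8
    # For the original '>=' constraint, d(objective)/d(b) = -m = +2 >= 0:
    # tightening the constraint (raising b) can only worsen a minimization,
    # matching the sign rule above.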
Constraints with zero shadow prices have slack -- that is, they are not constraining the optimal solution.
Not all linear programming packages use the same sign convention for shadow prices. The LINDO package, for example, uses a different convention.
History
This function was introduced in Analytica 4.3; prior to that release it was named LpShadow.
See Also
- OptSlack
- OptScalarToConstraint
- OptScalarToDecision
- OptRhsSa
- OptReducedCost
- DefineOptimization (prior to Analytica 4.3, see LpDefine)
- OptObjective
- OptSolution
- OptStatusText
- OptStatusNum