Results
A technical manual for querying optimized InfiniteOpt
models. See the respective guide for more information.
Statuses
JuMP.termination_status — Method
JuMP.termination_status(model::InfiniteModel)
Extend JuMP.termination_status for InfiniteModels in accordance with that reported by its optimizer model. Errors if such a query is not supported or if the optimizer model hasn't been solved.
JuMP.raw_status — Method
JuMP.raw_status(model::InfiniteModel)
Extend JuMP.raw_status for InfiniteModels in accordance with that reported by its optimizer model. Errors if such a query is not supported or if the optimizer model hasn't been solved.
JuMP.primal_status — Method
JuMP.primal_status(model::InfiniteModel; [result::Int = 1])
Extend JuMP.primal_status for InfiniteModels in accordance with that reported by its optimizer model and the result index result of the most recent solution obtained. Errors if such a query is not supported or if the optimizer model hasn't been solved.
JuMP.dual_status — Method
JuMP.dual_status(model::InfiniteModel; [result::Int = 1])
Extend JuMP.dual_status for InfiniteModels in accordance with that reported by its optimizer model and the result index result of the most recent solution obtained. Errors if such a query is not supported or if the optimizer model hasn't been solved.
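Taken together, these queries support a standard post-solve check before any results are trusted. A minimal sketch, assuming an InfiniteModel named model with a solver already attached (the surrounding model setup is not shown):

```julia
using InfiniteOpt  # re-exports JuMP, including the MOI status codes

# Solve, then verify the statuses before querying values or duals.
optimize!(model)

if termination_status(model) == MOI.OPTIMAL &&
   primal_status(model) == MOI.FEASIBLE_POINT
    println("Solver says: ", raw_status(model))
else
    @warn "No usable solution" termination_status(model) dual_status(model)
end
```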
General
JuMP.solve_time — Method
JuMP.solve_time(model::InfiniteModel)
Extend JuMP.solve_time for InfiniteModels in accordance with that reported by its optimizer model. Errors if such a query is not supported or if the optimizer model hasn't been solved.
JuMP.simplex_iterations — Method
JuMP.simplex_iterations(model::InfiniteModel)
Extend JuMP.simplex_iterations for InfiniteModels in accordance with that reported by its optimizer model. Errors if such a query is not supported or if the optimizer model hasn't been solved.
JuMP.barrier_iterations — Method
JuMP.barrier_iterations(model::InfiniteModel)
Extend JuMP.barrier_iterations for InfiniteModels in accordance with that reported by its optimizer model. Errors if such a query is not supported or if the optimizer model hasn't been solved.
JuMP.node_count — Method
JuMP.node_count(model::InfiniteModel)
Extend JuMP.node_count for InfiniteModels in accordance with that reported by its optimizer model. Errors if such a query is not supported or if the optimizer model hasn't been solved.
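These statistics can be gathered defensively, since each call errors when the attached optimizer does not support the query. A sketch, assuming a solved InfiniteModel named model:

```julia
using InfiniteOpt

# Collect solver statistics, recording `missing` for unsupported queries.
stats = Dict{Symbol,Any}()
for (name, f) in (:solve_time => solve_time,
                  :simplex_iterations => simplex_iterations,
                  :barrier_iterations => barrier_iterations,
                  :node_count => node_count)
    stats[name] = try
        f(model)
    catch
        missing  # this optimizer does not report the statistic
    end
end
```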
JuMP.result_count — Method
JuMP.result_count(model::InfiniteModel)
Extend result_count to return the number of results available to query after a call to optimize!.
Example
julia> result_count(model)
1
Objective
JuMP.objective_bound — Method
JuMP.objective_bound(model::InfiniteModel)
Extend JuMP.objective_bound for InfiniteModels in accordance with that reported by its optimizer model. Errors if such a query is not supported or if the optimizer model hasn't been solved.
JuMP.objective_value — Method
JuMP.objective_value(model::InfiniteModel; [result::Int = 1])
Extend JuMP.objective_value for InfiniteModels in accordance with that reported by its optimizer model and the result index result of the most recent solution obtained. Errors if such a query is not supported or if the optimizer model hasn't been solved.
JuMP.dual_objective_value — Method
JuMP.dual_objective_value(model::InfiniteModel; [result::Int = 1])
Extend JuMP.dual_objective_value for InfiniteModels in accordance with that reported by its optimizer model and the result index result of the most recent solution obtained. Errors if such a query is not supported or if the optimizer model hasn't been solved.
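Since objective_value and dual_objective_value accept a result index, they pair naturally with result_count when a solver returns several solutions. A sketch, assuming a solved InfiniteModel named model:

```julia
using InfiniteOpt

# Report the objective for every available result; the bound is a property
# of the solve as a whole, not of an individual result.
for i in 1:result_count(model)
    println("result $i: objective = ", objective_value(model; result = i))
end
println("objective bound = ", objective_bound(model))
```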
Variables
JuMP.has_values — Method
JuMP.has_values(model::InfiniteModel; [result::Int = 1])
Extend JuMP.has_values for InfiniteModels in accordance with that reported by its optimizer model and the result index result of the most recent solution obtained. Errors if such a query is not supported or if the optimizer model hasn't been solved.
JuMP.value — Method
JuMP.value(vref::GeneralVariableRef; [result::Int = 1,
    label::Type{<:AbstractSupportLabel} = PublicLabel,
    ndarray::Bool = false, kwargs...])
Extend JuMP.value to return the value(s) of vref in accordance with its reformulation variable(s) stored in the optimizer model and the result index result of the most recent solution obtained. Use JuMP.has_values to check if a result exists before asking for values.
The keyword arguments label and ndarray are used by TranscriptionOpt, and kwargs denotes extra ones that user extensions may employ.
By default, only the values associated with public supports are returned; the full set can be accessed via label = All. Moreover, the values of infinite variables are returned as a list. However, an n-dimensional array can be obtained via ndarray = true, which is handy when the variable has multiple infinite parameter dependencies.
To provide context for the results, it may be helpful to also query the variable's parameter_refs and supports, which will have a one-to-one correspondence with the value(s). It may also be helpful to query via optimizer_model_variable to retrieve the variable(s) that these values are based on. These functions should all be called with the same keyword arguments for consistency.
For extensions, this only works if optimizer_model_variable has been extended correctly and/or map_value has been extended for variables.
Example
julia> value(z)
42.0
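The label and ndarray keywords described above can be combined. A sketch for a hypothetical infinite variable y that depends on two infinite parameters (the variable name is illustrative):

```julia
using InfiniteOpt

# Flat list of values at the public supports (the default):
vals = value(y)

# Include internally generated supports as well:
all_vals = value(y; label = All)

# Reshape into an array with one axis per infinite parameter, aligned with
# supports(y; ndarray = true):
vals_nd = value(y; ndarray = true)
```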
JuMP.reduced_cost — Method
JuMP.reduced_cost(vref::GeneralVariableRef)
Extend JuMP.reduced_cost. This returns the reduced cost(s) of a variable: a vector of scalar values for an infinite variable, or a scalar value for a finite variable.
Example
julia> reduced_cost(x)
12.81
JuMP.optimizer_index — Method
JuMP.optimizer_index(vref::GeneralVariableRef;
    [label::Type{<:AbstractSupportLabel} = PublicLabel,
    ndarray::Bool = false, kwargs...])
Extend JuMP.optimizer_index to return the MathOptInterface index(es) of vref in accordance with its reformulation variable(s) stored in the optimizer model.
The keyword arguments label and ndarray are used by TranscriptionOpt, and kwargs denotes extra ones that user extensions may employ.
By default, only the optimizer indices associated with public supports are returned; the full set can be accessed via label = All. Moreover, the indices of infinite variables are returned as a list. However, an n-dimensional array can be obtained via ndarray = true, which is handy when the variable has multiple infinite parameter dependencies.
It may also be helpful to query via optimizer_model_variable to retrieve the variable(s) that these indices are based on. These should use the same keyword arguments for consistency.
For extensions, this only works if optimizer_model_variable has been extended correctly and/or map_optimizer_index has been extended for variables.
Example
julia> optimizer_index(x)
4-element Array{MathOptInterface.VariableIndex,1}:
MathOptInterface.VariableIndex(2)
MathOptInterface.VariableIndex(3)
MathOptInterface.VariableIndex(4)
MathOptInterface.VariableIndex(5)
InfiniteOpt.map_value — Function
map_value([ref/expr], key::Val{ext_key_name}, result::Int; kwargs...)
Map the value(s) of ref to its counterpart in the optimizer model type that is distinguished by its extension key key as type Val{ext_key_name}. Here ref refers to methods for both variable references and constraint references. This only needs to be defined for reformulation extensions that cannot readily extend optimizer_model_variable, optimizer_model_expression, and/or optimizer_model_constraint, such as reformulations that do not have a direct mapping between variables and/or constraints in the original infinite form. Otherwise, optimizer_model_variable, optimizer_model_expression, and optimizer_model_constraint are used to make these mappings by default, where kwargs are passed on to these functions. Here result is the result index used in value.
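For a backend without a one-to-one variable mapping, map_value can be extended directly instead of optimizer_model_variable. A hedged sketch for a hypothetical extension keyed by :MyExt, where my_backend_values is an illustrative helper standing in for the extension's own result storage:

```julia
using InfiniteOpt

# Hypothetical: return the value this extension computed for `vref` at the
# given result index; `my_backend_values` is assumed extension storage.
function InfiniteOpt.map_value(vref::GeneralVariableRef, ::Val{:MyExt},
                               result::Int; kwargs...)
    return my_backend_values(JuMP.owner_model(vref))[vref][result]
end
```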
InfiniteOpt.map_reduced_cost — Function
map_reduced_cost(vref::GeneralVariableRef, key::Val{ext_key_name},
    result::Int; kwargs...)
Map the reduced cost(s) of vref to its counterpart in the optimizer model type that is distinguished by its extension key key as type Val{ext_key_name}. This only needs to be defined for reformulation extensions that cannot readily extend optimizer_model_variable, such as reformulations that do not have a direct mapping between variables in the original infinite form. Otherwise, optimizer_model_variable is used to make these mappings by default, where kwargs are passed on to it. Here result is the result index used in value.
InfiniteOpt.map_optimizer_index — Function
map_optimizer_index(ref, key::Val{ext_key_name}; kwargs...)
Map the MathOptInterface index(es) of ref to its counterpart in the optimizer model type that is distinguished by its extension key key as type Val{ext_key_name}. Here ref refers to methods for both variable references and constraint references. This only needs to be defined for reformulation extensions that cannot readily extend optimizer_model_variable and optimizer_model_constraint, such as reformulations that do not have a direct mapping between variables and/or constraints in the original infinite form. Otherwise, optimizer_model_variable and optimizer_model_constraint are used to make these mappings by default, where kwargs are passed on as well.
Constraints
JuMP.has_duals — Method
JuMP.has_duals(model::InfiniteModel; [result::Int = 1])
Extend JuMP.has_duals for InfiniteModels in accordance with that reported by its optimizer model and the result index result of the most recent solution obtained. Errors if such a query is not supported or if the optimizer model hasn't been solved.
JuMP.value — Method
JuMP.value(cref::InfOptConstraintRef; [result::Int = 1,
    label::Type{<:AbstractSupportLabel} = PublicLabel,
    ndarray::Bool = false, kwargs...])
Extend JuMP.value to return the value(s) of cref in accordance with its reformulation constraint(s) stored in the optimizer model and the result index result of the most recent solution obtained. Use JuMP.has_values to check if a result exists before asking for values.
The keyword arguments label and ndarray are used by TranscriptionOpt, and kwargs denotes extra ones that user extensions may employ.
By default, only the values associated with public supports are returned; the full set can be accessed via label = All. Moreover, the values of infinite constraints are returned as a list. However, an n-dimensional array can be obtained via ndarray = true, which is handy when the constraint has multiple infinite parameter dependencies.
To provide context for the results, it may be helpful to also query the constraint's parameter_refs and supports, which will have a one-to-one correspondence with the value(s). It may also be helpful to query via optimizer_model_constraint to retrieve the constraint(s) that these values are based on. These functions should all be called with the same keyword arguments for consistency.
For extensions, this only works if optimizer_model_constraint has been extended correctly and/or map_value has been extended for constraints.
Example
julia> value(c1)
4-element Array{Float64,1}:
-0.0
20.9
20.9
20.9
JuMP.optimizer_index — Method
JuMP.optimizer_index(cref::InfOptConstraintRef;
    [label::Type{<:AbstractSupportLabel} = PublicLabel,
    ndarray::Bool = false, kwargs...])
Extend JuMP.optimizer_index to return the MathOptInterface index(es) of cref in accordance with its reformulation constraint(s) stored in the optimizer model.
The keyword arguments label and ndarray are used by TranscriptionOpt, and kwargs denotes extra ones that user extensions may employ.
By default, only the optimizer indices associated with public supports are returned; the full set can be accessed via label = All. Moreover, the indices of infinite constraints are returned as a list. However, an n-dimensional array can be obtained via ndarray = true, which is handy when the constraint has multiple infinite parameter dependencies.
It may also be helpful to query via optimizer_model_constraint to retrieve the constraint(s) that these indices are based on. The same keyword arguments should be used for consistency.
For extensions, this only works if optimizer_model_constraint has been extended correctly and/or map_optimizer_index has been extended for constraints.
Example
julia> optimizer_index(c1)
4-element Array{MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64},MathOptInterface.GreaterThan{Float64}},1}:
MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64},MathOptInterface.GreaterThan{Float64}}(1)
MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64},MathOptInterface.GreaterThan{Float64}}(2)
MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64},MathOptInterface.GreaterThan{Float64}}(3)
MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64},MathOptInterface.GreaterThan{Float64}}(4)
JuMP.dual — Method
JuMP.dual(cref::InfOptConstraintRef; [result::Int = 1,
    label::Type{<:AbstractSupportLabel} = PublicLabel,
    ndarray::Bool = false, kwargs...])
Extend JuMP.dual to return the dual(s) of cref in accordance with its reformulation constraint(s) stored in the optimizer model and the result index result of the most recent solution obtained. Use JuMP.has_duals to check if a result exists before asking for duals.
The keyword arguments label and ndarray are used by TranscriptionOpt, and kwargs denotes extra ones that user extensions may employ.
By default, only the duals associated with public supports are returned; the full set can be accessed via label = All. Moreover, the duals of infinite constraints are returned as a list. However, an n-dimensional array can be obtained via ndarray = true, which is handy when the constraint has multiple infinite parameter dependencies.
It may also be helpful to query via optimizer_model_constraint to retrieve the constraint(s) that these duals are based on. Calling parameter_refs and supports may also be insightful. Be sure to use the same keyword arguments for consistency.
For extensions, this only works if optimizer_model_constraint has been extended correctly and/or map_dual has been extended for constraints.
Example
julia> dual(c1)
4-element Array{Float64,1}:
-42.0
-42.0
32.3
0.0
JuMP.shadow_price — Method
JuMP.shadow_price(cref::InfOptConstraintRef;
    [label::Type{<:AbstractSupportLabel} = PublicLabel,
    ndarray::Bool = false, kwargs...])
Extend JuMP.shadow_price to return the shadow price(s) of cref in accordance with its reformulation constraint(s) stored in the optimizer model. Use JuMP.has_duals to check if a result exists before asking for the shadow price (it uses the duals).
The keyword arguments label and ndarray are used by TranscriptionOpt, and kwargs denotes extra ones that user extensions may employ.
By default, only the shadow prices associated with public supports are returned; the full set can be accessed via label = All. Moreover, the prices of infinite constraints are returned as a list. However, an n-dimensional array can be obtained via ndarray = true, which is handy when the constraint has multiple infinite parameter dependencies.
It may also be helpful to query via optimizer_model_constraint to retrieve the constraint(s) that these shadow prices are based on. Calling parameter_refs and supports may also be insightful. Be sure to use the same keyword arguments for consistency.
For extensions, this only works if optimizer_model_constraint has been extended correctly and/or map_dual has been extended for constraints.
Example
julia> shadow_price(c1)
4-element Array{Float64,1}:
42.0
42.0
-32.3
-0.0
InfiniteOpt.map_dual — Function
map_dual(cref::InfOptConstraintRef, key::Val{ext_key_name}, result::Int;
    kwargs...)
Map the dual(s) of cref to its counterpart in the optimizer model type that is distinguished by its extension key key as type Val{ext_key_name}. This only needs to be defined for reformulation extensions that cannot readily extend optimizer_model_constraint, such as reformulations that do not have a direct mapping between constraints in the original infinite form. Otherwise, optimizer_model_constraint is used to make these mappings by default, where kwargs are also passed on. Here result is the result index used in dual.
Expressions
JuMP.value — Method
value(v::GenericVariableRef; result = 1)
Return the value of variable v associated with result index result of the most recent solution returned by the solver.
Use has_values to check if a result exists before asking for values.
See also: result_count.
value(var_value::Function, ex::GenericAffExpr)
Evaluate ex using var_value(v) as the value for each variable v.
value(var_value::Function, ex::GenericQuadExpr)
Evaluate ex using var_value(v) as the value for each variable v.
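The functional form can be exercised without solving anything, since the caller supplies the variable values. A small sketch on a plain JuMP model:

```julia
using JuMP

model = Model()
@variable(model, x)
@variable(model, y)
ex = 2x + 3y + 1  # a GenericAffExpr

# Evaluate the expression with x => 2.0 and every other variable => 1.0:
value(v -> v == x ? 2.0 : 1.0, ex)  # 2*2.0 + 3*1.0 + 1 = 8.0
```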
LP Sensitivity
JuMP.lp_sensitivity_report — Method
JuMP.lp_sensitivity_report(model::InfiniteModel;
    [atol::Float64 = 1e-8])::InfOptSensitivityReport
Extend JuMP.lp_sensitivity_report to generate and return an LP sensitivity report in accordance with the optimizer model. See InfOptSensitivityReport for syntax details on how to query it. atol denotes the optimality tolerance and should match that used by the solver to compute the basis. Please refer to JuMP's documentation for more technical information on interpreting the output of the report.
Example
julia> report = lp_sensitivity_report(model);
julia> report[x]
(0.0, 0.5)
InfiniteOpt.InfOptSensitivityReport — Type
InfOptSensitivityReport
A wrapper DataType for JuMP.SensitivityReports in InfiniteOpt. These are generated based on the optimizer model and should be made via lp_sensitivity_report. Once made, these can be indexed to get the sensitivities with respect to variables and/or constraints. The indexing syntax is:
report[ref::[GeneralVariableRef/InfOptConstraintRef];
    [label::Type{<:AbstractSupportLabel} = PublicLabel,
    ndarray::Bool = false, kwargs...]]
This is enabled in user-defined optimizer model extensions by appropriately extending optimizer_model_variable and optimizer_model_constraint.
Fields
opt_report::JuMP.SensitivityReport: The LP sensitivity captured from the optimizer model.