public final class ActiveSetSequentialQuadraticProgrammingOptimizer extends SequentialQuadraticProgrammingOptimizer
An optimizer for OptimizerMultivariableFunctions. This uses an active set method for handling inequality constraints and a quasi-Newton method for solving quadratic subproblems. There are no special provisions for reducing memory footprint or handling sparse Hessian matrices, so this class is not suitable for handling large, sparse optimization problems.

| Modifier and Type | Class and Description |
|---|---|
| static class | ActiveSetSequentialQuadraticProgrammingOptimizer.ConvergenceChecker: A function that tests the convergence of the optimizer by comparing the results of the current iteration with the results of the previous iteration. |
| Constructor and Description |
|---|
| ActiveSetSequentialQuadraticProgrammingOptimizer(): Initializes a new instance. |
| Modifier and Type | Method and Description |
|---|---|
| Object | clone(CopyContext context): Clones this object using the specified context. |
| MultivariableFunctionSolverStepResult<OptimizerMultivariableFunctionResults,OptimizerMultivariableFunctionDerivativeResults> | computeNextStep(double[] variableValues, Matrix hessian, ParameterOptimizerIterationResults previousIterationResults, ITrackCalculationProgress progressTracker): Computes the next optimization step that this parameter optimizer should take. |
| boolean | convergenceCheck(OptimizerMultivariableFunctionResults currentResults, OptimizerMultivariableFunctionResults previousResults): Determines whether the optimizer converged or not. |
| ActiveSetSequentialQuadraticProgrammingOptimizer.ConvergenceChecker | getConvergenceFunction(): Gets or sets a customizable convergence function that is used to determine if the optimizer converges on a given iteration. |
| boolean | getIsThreadSafe(): Gets a value indicating whether the methods on this instance are safe to call from multiple threads simultaneously. |
| LineSearchSettings | getLineSearchSettings(): Gets an optional property that can be used to specify tolerances, convergence criteria, and maximum iterations for a line search that uses GoldenSectionFindExtremum to find the optimal feasible step in the same direction as the computed step. |
| boolean | getMultithreaded(): Gets a value indicating whether this optimizer should evaluate an iteration with as many threads as the current threading policy facet will allow, or with as many Variables (get) as there are in the function plus 1, whichever is less. |
| void | setConvergenceFunction(ActiveSetSequentialQuadraticProgrammingOptimizer.ConvergenceChecker value): Gets or sets a customizable convergence function that is used to determine if the optimizer converges on a given iteration. |
| void | setLineSearchSettings(LineSearchSettings value): Sets an optional property that can be used to specify tolerances, convergence criteria, and maximum iterations for a line search that uses GoldenSectionFindExtremum to find the optimal feasible step in the same direction as the computed step. |
| void | setMultithreaded(boolean value): Sets a value indicating whether this optimizer should evaluate an iteration with as many threads as the current threading policy facet will allow, or with as many Variables (get) as there are in the function plus 1, whichever is less. |
| static void | solveLagrangianDerivativeEquation(OptimizerMultivariableFunctionResults unperturbedAnswer, OptimizerMultivariableFunctionDerivativeResults derivativeResults, ArrayList<InequalityConstraintSettings> activeInequalitySet, ArrayList<Double> activeInequalityErrors, ArrayList<double[]> activeInequalityGradients, List<SolverConstraintSettings> equalitySet, double[] equalityErrors, ArrayList<Double>[] lagrangeMultipliers, double[][] lagrangianDerivatives): Solves for the Lagrange multipliers and derivatives of the Lagrangian using the cost function, equality errors, active inequality errors, and the gradients of each with respect to the variables. |
| static Matrix | updateHessian(Matrix hessian, double[] changeInLagrangianDerivatives, double[] previousUnscaledStep): Uses a damped Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton update to update the Hessian matrix. |
Methods inherited from superclasses and interfaces:
- findSolution
- checkCostFunctionConvergence, checkEqualitySatisfaction, checkInequalitySatisfaction, checkVariableConvergence, dispose, dispose, findSolution, findSolution, getCostFunction, getCurrentIteration, getEqualities, getFunction, getInequalities, getLastRunsResults, getVariables, reset, setCostFunction, setCurrentIteration, setFunction, setLastRunsResults
- clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
- close
public ActiveSetSequentialQuadraticProgrammingOptimizer()
Initializes a new instance. Add at least one variable to the Variables (get) as well as set the Function (get / set). Further, either the CostFunction (get / set) must be set or at least one equality constraint must be added to the Equalities (get) to pose a well-defined optimization problem.
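For orientation, a minimal, hypothetical setup sketch follows. The myFunction, myVariableSettings, myCostSettings, and myEqualitySettings objects, the mutability of the Variables and Equalities collections, and the argument passed to findSolution are assumptions made for illustration; they are not the documented API.

// Hypothetical setup sketch; the settings objects and the findSolution argument are assumptions.
ActiveSetSequentialQuadraticProgrammingOptimizer optimizer =
        new ActiveSetSequentialQuadraticProgrammingOptimizer();

optimizer.setFunction(myFunction);                // the OptimizerMultivariableFunction to optimize
optimizer.getVariables().add(myVariableSettings); // at least one variable is required

// Either set a cost function...
optimizer.setCostFunction(myCostSettings);
// ...or add at least one equality constraint instead:
// optimizer.getEqualities().add(myEqualitySettings);

// findSolution is inherited; the argument shown here is assumed for illustration.
optimizer.findSolution(100);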
public Object clone(CopyContext context)
Clones this object using the specified context.
This method should be implemented to call a copy constructor on the class of the object being cloned. The copy constructor should take the CopyContext as a parameter in addition to the existing instance to copy. The copy constructor should first call CopyContext.addObjectMapping(T, T) to identify the newly constructed instance as a copy of the existing instance. It should then copy all fields, using CopyContext.updateReference(T) to copy any reference fields.

A typical implementation of ICloneWithContext:
public static class MyClass implements ICloneWithContext {
    public MyClass(MyClass existingInstance, CopyContext context) {
        context.addObjectMapping(existingInstance, this);
        someReference = context.updateReference(existingInstance.someReference);
    }

    @Override
    public final Object clone(CopyContext context) {
        return new MyClass(this, context);
    }

    private Object someReference;
}
In general, all fields that are reference types should be copied with a call to CopyContext.updateReference(T). There are a couple of exceptions. If one of these exceptions applies, the CopyContext should be given an opportunity to update the reference before the reference is copied explicitly. Use CopyContext.updateReference(T) to update the reference. If CopyContext.updateReference(T) returns the original object, indicating that the context does not have a replacement registered, then copy the object manually by invoking a Clone method, a copy constructor, or by manually constructing a new instance and copying the values.
alwaysCopy = context.updateReference(existingInstance.alwaysCopy);
if (existingInstance.alwaysCopy != null && alwaysCopy == existingInstance.alwaysCopy) {
    alwaysCopy = (AlwaysCopy) existingInstance.alwaysCopy.clone(context);
}
If you are implementing an evaluator (a class that implements IEvaluator), the IEvaluator.updateEvaluatorReferences(agi.foundation.infrastructure.CopyContext) method shares some responsibilities with the copy context constructor. Code duplication can be avoided by doing the following:

- In the copy context constructor, copy evaluator references directly instead of passing them to CopyContext.updateReference(T). You should still call CopyContext.updateReference(T) on any references to non-evaluators.
- Call IEvaluator.updateEvaluatorReferences(agi.foundation.infrastructure.CopyContext) as the last line in the constructor and pass it the same CopyContext passed to the constructor.
- Implement IEvaluator.updateEvaluatorReferences(agi.foundation.infrastructure.CopyContext) as normal. See the reference documentation for IEvaluator.updateEvaluatorReferences(agi.foundation.infrastructure.CopyContext) for more information on implementing that method.

For example:
public MyClass(MyClass existingInstance, CopyContext context) {
    super(existingInstance, context);
    someReference = context.updateReference(existingInstance.someReference);
    evaluatorReference = existingInstance.evaluatorReference;
    updateEvaluatorReferences(context);
}

@Override
public void updateEvaluatorReferences(CopyContext context) {
    evaluatorReference = context.updateReference(evaluatorReference);
}

@Override
public Object clone(CopyContext context) {
    return new MyClass(this, context);
}

private Object someReference;
private IEvaluator evaluatorReference;
Specified by: clone in interface ICloneWithContext
Overrides: clone in class ParameterOptimizer

Parameters:
context - The context to use to perform the copy.

public boolean getIsThreadSafe()
Gets a value indicating whether the methods on this instance are safe to call from multiple threads simultaneously.

If this property is true, all methods and properties are guaranteed to be thread safe. Conceptually, an object that returns true for this method acts as if there is a lock protecting each method and property such that only one thread at a time can be inside any method or property in the class. In reality, such locks are generally not used and are in fact discouraged. However, the user must not experience any exceptions or inconsistent behavior that would not be experienced if such locks were used.

If this property is false, the behavior when using this class from multiple threads simultaneously is undefined and may include inconsistent results and exceptions. Clients wishing to use multiple threads should call CopyForAnotherThread.copy(T) to get a separate instance of the object for each thread.
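For example, a caller that wants to run several optimizations in parallel might give each worker thread its own copy rather than sharing one instance (a sketch only; the surrounding threading code is omitted):

// Each worker thread works with its own copy of the (possibly non-thread-safe) optimizer.
ActiveSetSequentialQuadraticProgrammingOptimizer copyForThread =
        CopyForAnotherThread.copy(optimizer);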
Specified by: getIsThreadSafe in interface IThreadAware
Overrides: getIsThreadSafe in class ParameterOptimizer
public final LineSearchSettings getLineSearchSettings()
Gets an optional property that can be used to specify tolerances, convergence criteria, and maximum iterations for a line search that uses GoldenSectionFindExtremum to find the optimal feasible step in the same direction as the computed step.

public final void setLineSearchSettings(LineSearchSettings value)
Sets an optional property that can be used to specify tolerances, convergence criteria, and maximum iterations for a line search that uses GoldenSectionFindExtremum to find the optimal feasible step in the same direction as the computed step.

public boolean getMultithreaded()
Gets a value indicating whether this optimizer should evaluate an iteration with as many threads as the current threading policy facet will allow, or with as many Variables (get) as there are in the function plus 1, whichever is less. By default this is true.

There are some situations where the multithreaded algorithm will be slower than the single-threaded one. Multithreading may be slower if the time it takes to compute the function is short compared to the overhead of setting up the threads, or if the overall optimizer will converge to a solution in very few iterations and there are many variables relative to the number of threads.

The specific algorithm you implement may be fundamentally single-threaded. In that case it is acceptable to ignore this property.

Overrides: getMultithreaded in class ParameterOptimizer
public void setMultithreaded(boolean value)
Sets a value indicating whether this optimizer should evaluate an iteration with as many threads as the current threading policy facet will allow, or with as many Variables (get) as there are in the function plus 1, whichever is less. By default this is true.

There are some situations where the multithreaded algorithm will be slower than the single-threaded one. Multithreading may be slower if the time it takes to compute the function is short compared to the overhead of setting up the threads, or if the overall optimizer will converge to a solution in very few iterations and there are many variables relative to the number of threads.

The specific algorithm you implement may be fundamentally single-threaded. In that case it is acceptable to ignore this property.

Overrides: setMultithreaded in class ParameterOptimizer
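For example, when each function evaluation is cheap relative to the overhead of setting up threads, it may be worth disabling the multithreaded evaluation:

// Evaluate each iteration on a single thread.
optimizer.setMultithreaded(false);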
public final ActiveSetSequentialQuadraticProgrammingOptimizer.ConvergenceChecker getConvergenceFunction()
Gets or sets a customizable convergence function that is used to determine if the optimizer converges on a given iteration.

By default, the cost function change from the previous to the current iteration must be within the Tolerance (get / set) of the CostFunction (get / set), all of the variable changes from the previous to the current iteration must be within the VariableTolerance (get / set) of the Variables (get), all of the equality constraints in Equalities (get) must match their DesiredValue (get / set) within tolerance, and all of the inequality constraints in Inequalities (get) must be satisfied.
public final void setConvergenceFunction(ActiveSetSequentialQuadraticProgrammingOptimizer.ConvergenceChecker value)
Gets or sets a customizable convergence function that is used to determine if the optimizer converges on a given iteration.

By default, the cost function change from the previous to the current iteration must be within the Tolerance (get / set) of the CostFunction (get / set), all of the variable changes from the previous to the current iteration must be within the VariableTolerance (get / set) of the Variables (get), all of the equality constraints in Equalities (get) must match their DesiredValue (get / set) within tolerance, and all of the inequality constraints in Inequalities (get) must be satisfied.
public boolean convergenceCheck(OptimizerMultivariableFunctionResults currentResults, OptimizerMultivariableFunctionResults previousResults)
Determines whether the optimizer converged or not. This method calls the ConvergenceFunction (get / set) with this, currentResults, and previousResults as its arguments.

Overrides: convergenceCheck in class SequentialQuadraticProgrammingOptimizer

Parameters:
currentResults - The current results that are used to check if the equality constraints match their desired values within tolerance and the inequality constraints are satisfied. These are also used in combination with the previous results to check changes in the independent variables and the cost function. Use of these results in custom convergence checks is also permitted.
previousResults - The previous results that are used in combination with the current results to check changes in the independent variables and the cost function. Use of these results in custom convergence checks is also permitted.

Returns:
true if the convergence check passes; false if the convergence check fails.

@Nonnull public MultivariableFunctionSolverStepResult<OptimizerMultivariableFunctionResults,OptimizerMultivariableFunctionDerivativeResults> computeNextStep(@Nonnull double[] variableValues, Matrix hessian, ParameterOptimizerIterationResults previousIterationResults, ITrackCalculationProgress progressTracker)
Computes the next optimization step that this parameter optimizer should take.
Overrides: computeNextStep in class SequentialQuadraticProgrammingOptimizer

Parameters:
variableValues - The current values of the variables.
hessian - An approximation of the second derivative matrix of the Lagrangian.
previousIterationResults - The results of the previous iteration that are used to compute the next step.
progressTracker - An optional progress tracker.

public static void solveLagrangianDerivativeEquation(@Nonnull OptimizerMultivariableFunctionResults unperturbedAnswer, @Nonnull OptimizerMultivariableFunctionDerivativeResults derivativeResults, @Nonnull ArrayList<InequalityConstraintSettings> activeInequalitySet, ArrayList<Double> activeInequalityErrors, ArrayList<double[]> activeInequalityGradients, List<SolverConstraintSettings> equalitySet, @Nonnull double[] equalityErrors, @Nonnull ArrayList<Double>[] lagrangeMultipliers, @Nonnull double[][] lagrangianDerivatives)
Solves for the Lagrange multipliers and derivatives of the Lagrangian using the cost function, equality errors, active inequality errors, and the gradients of each with respect to the variables.
Parameters:
unperturbedAnswer - The nominal value of the OptimizerMultivariableFunction.
derivativeResults - The derivatives of the OptimizerMultivariableFunction that are used to provide the gradients of the cost function and the equality constraints.
activeInequalitySet - A list of inequalities that are currently in the active set.
activeInequalityErrors - The differences between the current values and the bound values for each inequality currently in the active set.
activeInequalityGradients - The gradients of the inequalities in the active set.
equalitySet - A list of equalities that are always in the active set.
equalityErrors - The differences between the current values and the desired values for each equality.
lagrangeMultipliers - A list of Lagrange multipliers for each constraint in the active set.
lagrangianDerivatives - The derivatives of the Lagrangian with respect to each of the variables.
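As background for what the lagrangeMultipliers and lagrangianDerivatives outputs represent, the gradient of the Lagrangian combines the cost gradient with the multiplier-weighted constraint gradients. A rough sketch of that combination follows, using plain arrays and one common sign convention; it illustrates the quantity being computed, not the library's procedure for solving for the multipliers.

// Illustrative only: gradient of the Lagrangian, L(x, lambda) = f(x) - sum_i lambda_i * c_i(x),
// formed from the cost gradient, the active constraint gradients, and their multipliers.
// The sign convention is an assumption; the multiplier solve itself is not shown.
static double[] lagrangianGradient(double[] costGradient,
                                   double[][] constraintGradients,
                                   double[] multipliers) {
    double[] gradient = costGradient.clone();
    for (int i = 0; i < constraintGradients.length; i++) {
        for (int j = 0; j < gradient.length; j++) {
            gradient[j] -= multipliers[i] * constraintGradients[i][j];
        }
    }
    return gradient;
}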
public static Matrix updateHessian(Matrix hessian, @Nonnull double[] changeInLagrangianDerivatives, @Nonnull double[] previousUnscaledStep)
Uses a damped Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton update to update the Hessian matrix.
The damped BFGS algorithm used to update the Hessian is detailed on pp. 536-537 of the second edition of "Numerical Optimization" by J. Nocedal and S.J. Wright, published by Springer in 2006.
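As a rough illustration of that damped update, following the formulas in the cited reference and using plain arrays rather than the library's Matrix type: a damping factor theta blends the change in Lagrangian derivatives y with B s, and the blended vector r then replaces y in the standard BFGS update. The helper below is a sketch, not the library's implementation.

// Illustrative damped BFGS update (Nocedal and Wright, 2nd ed., pp. 536-537).
// B: current (symmetric) Hessian approximation, s: previous unscaled step,
// y: change in the Lagrangian derivatives between iterations.
static double[][] dampedBfgsUpdate(double[][] B, double[] s, double[] y) {
    int n = s.length;
    double[] Bs = new double[n];
    for (int i = 0; i < n; i++) {
        for (int j = 0; j < n; j++) {
            Bs[i] += B[i][j] * s[j];
        }
    }
    double sBs = 0.0;
    double sy = 0.0;
    for (int i = 0; i < n; i++) {
        sBs += s[i] * Bs[i];
        sy += s[i] * y[i];
    }
    // Damping keeps the updated matrix positive definite when s'y is small or negative.
    double theta = (sy >= 0.2 * sBs) ? 1.0 : (0.8 * sBs) / (sBs - sy);
    double[] r = new double[n];
    double sr = 0.0;
    for (int i = 0; i < n; i++) {
        r[i] = theta * y[i] + (1.0 - theta) * Bs[i];
        sr += s[i] * r[i];
    }
    // B+ = B - (B s s' B)/(s' B s) + (r r')/(s' r), using the symmetry of B.
    double[][] updated = new double[n][n];
    for (int i = 0; i < n; i++) {
        for (int j = 0; j < n; j++) {
            updated[i][j] = B[i][j] - (Bs[i] * Bs[j]) / sBs + (r[i] * r[j]) / sr;
        }
    }
    return updated;
}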
Parameters:
hessian - The Hessian matrix before the update.
changeInLagrangianDerivatives - The difference between the current Lagrangian derivatives and those of the previous iteration.
previousUnscaledStep - The unscaled previous step.