The Excel and Quattro-Pro Solver

The same solver is used by both Excel and Quattro-Pro. It was developed by Frontline Systems, Incline Village, NV. Frontline has a web page at www.frontsys.com, and a nice tutorial on optimization at www.frontsys.com/tutorial.htm. Solver will perform constrained optimization, as well as find solutions to sets of equations.

Solver is iterative: it stops either when it converges or when the iteration limit is reached. Convergence is defined as satisfaction of the KKT conditions, or as lack of change in the objective function from iteration to iteration.
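
As a sketch, the second test (no change in the objective) amounts to the loop below. The tolerance, step function, and iteration limit here are illustrative stand-ins, not Solver's actual internals.

    def iterate(f, step, x0, tol=1e-6, max_iter=100):
        """Stop on a small objective change or at the iteration limit."""
        x, fx = x0, f(x0)
        for _ in range(max_iter):
            x_new = step(x)              # one solver iteration: direction + line search
            fx_new = f(x_new)
            if abs(fx_new - fx) < tol:   # "lack of change" convergence test
                break
            x, fx = x_new, fx_new
        return x

    # Example: a step that moves halfway toward the minimum of f(x) = (x - 3)^2
    f = lambda x: (x - 3.0)**2
    print(iterate(f, step=lambda x: x + 0.5 * (3.0 - x), x0=0.0))  # ~3.0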

Solver can handle up to 200 variables with upper and lower bounds, and 100 constraints. An enhanced version of Solver is available to handle larger problems.

The search starts from your initial vector (the values in the variable cells when Solver is invoked).

Solver uses two different methods for non-linear optimization problems: GRG (Generalized Reduced Gradient, a gradient method) and BFGS (Broyden-Fletcher-Goldfarb-Shanno, a quasi-Newton method). For linear problems, the simplex method is used. A separately available add-on gives the capability to do sequential quadratic programming, useful for strictly quadratic problems (usually encountered in finance).
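
Solver's routines are not callable outside the spreadsheet, but the two problem classes can be sketched with Python's scipy. Here SLSQP (an SQP method standing in for GRG's constrained search, since scipy has no GRG) handles the constrained case and BFGS the quasi-Newton option; the problem is made up.

    from scipy.optimize import minimize

    # Minimize (x - 1)^2 + (y - 2)^2 subject to x + y <= 2
    f = lambda v: (v[0] - 1.0)**2 + (v[1] - 2.0)**2
    con = {"type": "ineq", "fun": lambda v: 2.0 - v[0] - v[1]}  # scipy wants g(v) >= 0

    res_con = minimize(f, x0=[0.0, 0.0], method="SLSQP", constraints=[con])
    res_unc = minimize(f, x0=[0.0, 0.0], method="BFGS")  # unconstrained quasi-Newton

    print(res_con.x)  # ~(0.5, 1.5), pushed onto the constraint
    print(res_unc.x)  # ~(1.0, 2.0), the unconstrained minimum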


GRG

Inequality constraints are converted to equality constraints through the use of slack variables.
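
For example, the inequality g(x) <= b becomes the equality g(x) + s = b together with the bound s >= 0; the slack variable s measures how far the constraint is from binding, and s = 0 when the constraint is active.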

Variables are divided into basic (dependent) and non-basic (independent) sets.

The algorithm uses the binding constraints and the objective function to define a search direction, then performs a one-dimensional search along it.

If any non-binding constraints are violated, the solution point is returned to feasibility by moving it back onto the constraint surface.
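
A heavily simplified sketch of the direction-then-line-search pattern, with plain steepest descent standing in for the reduced-gradient direction and a numerical gradient:

    import numpy as np
    from scipy.optimize import minimize_scalar

    def grad(f, x, h=1e-7):
        """Forward-difference gradient (cf. the Derivatives option below)."""
        fx, g = f(x), np.zeros_like(x)
        for i in range(len(x)):
            e = np.zeros_like(x)
            e[i] = h
            g[i] = (f(x + e) - fx) / h
        return g

    def step(f, x):
        d = -grad(f, x)                                # search direction
        a = minimize_scalar(lambda a: f(x + a * d)).x  # one-dimensional search
        return x + a * d

    f = lambda v: (v[0] - 1.0)**2 + 10.0 * (v[1] + 2.0)**2
    x = np.array([0.0, 0.0])
    for _ in range(25):
        x = step(f, x)
    print(x)  # approaches (1, -2)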


BFGS

The constrained problem is converted to an unconstrained problem.

An approximation to the Hessian matrix is used in Newton's method to generate a search direction. The approximation is updated at each iteration.

Once the search direction is determined, a one-dimensional search is performed.
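
The Hessian bookkeeping can be sketched with the textbook BFGS update of an inverse-Hessian approximation H (the standard formula, not Frontline's implementation):

    import numpy as np

    def bfgs_update(H, s, y):
        """Update the inverse-Hessian approximation H from the step
        s = x_new - x and the gradient change y = g_new - g."""
        rho = 1.0 / (y @ s)
        I = np.eye(len(s))
        return (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
            + rho * np.outer(s, s)

    # The search direction is then d = -H @ g, followed by the
    # one-dimensional search along d described above.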

Solver automatically switches from the default GRG to BFGS if enough memory is available to form and invert the Hessian matrix.


Solver Options

Precision: controls how closely constraints must be satisfied. The default is 10^-6.

Tolerance: Integer tolerance on problems where integer variables are specified.

Assume Linear Model: For strictly linear problems. Uses the Simplex method.

Show Iteration Results: pauses and reports at each iteration.

Use Automatic Scaling: normalizes the variables based on the initial values. Used when variables are of different orders of magnitude. The initial values need to be representative of the magnitudes of the final values.
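
The idea can be sketched by optimizing in scaled variables u = x / x0, so every scaled variable starts at 1.0; note that if x0 is not representative of the solution's magnitudes, the scaling misleads rather than helps. The objective is made up.

    import numpy as np
    from scipy.optimize import minimize

    # Variables that differ by twelve orders of magnitude
    f = lambda v: ((v[0] - 2e6) / 1e6)**2 + ((v[1] - 3e-6) / 1e-6)**2
    x0 = np.array([1e6, 1e-6])

    # Optimize in u = x / x0 instead of x itself
    res = minimize(lambda u: f(u * x0), np.ones_like(x0), method="BFGS")
    print(res.x * x0)  # map back to original units: ~(2e6, 3e-6)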

Estimates: initial estimates of the basic variables in each one-dimensional search. Quadratic is computationally more complex, but performs better for highly non-linear problems; both options are sketched below.

Tangent: Linear extrapolation from a tangent vector.

Quadratic: quadratic extrapolation.
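
The two estimates in miniature, on a made-up history of a basic variable over the last few one-dimensional searches:

    import numpy as np

    t = np.array([0.0, 1.0, 2.0])  # last three searches (made-up data)
    x = np.array([4.0, 2.5, 1.8])  # value of a basic variable at each

    x_tangent = x[-1] + (x[-1] - x[-2])            # linear extrapolation: 1.1
    x_quad = np.polyval(np.polyfit(t, x, 2), 3.0)  # quadratic extrapolation: 1.9

    print(x_tangent, x_quad)  # the quadratic estimate captures the flattening trend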

Derivatives: how the partial derivatives used by GRG and in the Hessian approximation are estimated; the two options are compared below.

Forward: forward difference. Quicker.

Central: central difference. Computationally more complex but also more robust.
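
The trade-off in miniature, on a function whose derivative is known exactly:

    import numpy as np

    f, x0, h = np.sin, 1.0, 1e-5
    exact = np.cos(x0)

    fwd = (f(x0 + h) - f(x0)) / h            # 1 extra evaluation, error O(h)
    ctr = (f(x0 + h) - f(x0 - h)) / (2 * h)  # 2 extra evaluations, error O(h^2)

    print(abs(fwd - exact), abs(ctr - exact))  # central is far more accurate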

Search: selects how the search direction is computed. In practice, Solver automatically overrides your selection based on available memory.

Quasi-Newton: BFGS.

Conjugate Gradient: GRG.


Reports

Solver generates three types of reports:

Answer:

Provides initial and final values for f(x).

Provides initial and final values for x.

Provides information on constraint activity and slackness.

Sensitivity:

For non-linear models: the report shows the reduced gradient for each variable and the Lagrange multiplier for each constraint.

For linear models (a numeric sketch follows this list):

For variables:

Reduced Cost: the increase in objective per unit increase in variable.

Objective Coefficient: coefficient of the objective function for the cell.

Allowable increase or decrease in the coefficient before the optimal values of the variables would change.

For constraints:

Shadow Price: increase in objective per unit increase in the RHS of the constraint equation.

Constraint RHS: value of the RHS of the constraint.

Allowable increase and decrease in the RHS of the constraint before there would be a change in the optimal values of the variables.
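
For a linear model these quantities come from the LP dual; scipy's linprog (HiGHS backend) reports them as marginals. The problem below is made up, and signs follow scipy's minimization convention.

    from scipy.optimize import linprog

    # maximize 3x + 2y  s.t.  x + y <= 4,  x + 3y <= 6,  x, y >= 0
    # (linprog minimizes, so the objective is negated)
    res = linprog(c=[-3, -2], A_ub=[[1, 1], [1, 3]], b_ub=[4, 6], method="highs")

    print(res.x)                  # optimal variable values: (4, 0)
    print(res.ineqlin.marginals)  # shadow prices of the two constraints
    print(res.lower.marginals)    # reduced costs at the variable lower bounds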

Limits:

Lists the objective function and the variables. For each variable, this report lists the lower and upper limits of feasibility permitted by the constraints, and the resulting objective function value at each limit.


Notes:

Be careful of variable and initial value scaling.

Start from different initial conditions.

A larger (looser) Precision value will speed computation.

Use the linear model on linear problems.

Avoid or minimize integer variables, and set the integer tolerance high to speed computation.

The methods can fail if the objective function is not twice continuously differentiable.