Optimization
Overview
Optimization is the mathematical discipline concerned with finding the “best” solution from a set of feasible alternatives. At its core, an optimization problem consists of three components: an objective function to minimize or maximize, a set of decision variables that can be adjusted, and a collection of constraints that define the feasible region. This framework is remarkably versatile and appears throughout science, engineering, economics, and business operations.
In operations research, optimization drives logistics, supply chain management, and resource allocation. In economics and finance, portfolio optimization balances risk and return while managing constraints on capital and diversification. Engineering applications range from structural design and control systems to machine learning, where training neural networks is fundamentally an optimization problem.
Assignment Problems
Assignment problems are a fundamental class of combinatorial optimization where n tasks must be assigned to n agents in a one-to-one mapping that minimizes total cost. These problems are ubiquitous in scheduling, resource allocation, and matching markets.
Linear assignment problems (LAP) assume the total cost is the sum of individual assignment costs. This structure makes them solvable in polynomial time using the Hungarian algorithm, implemented efficiently in LINEAR_ASSIGNMENT. Applications include job scheduling, facility assignment, and optimal pairing in two-sided markets.
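As a sketch, the SciPy routine that LINEAR_ASSIGNMENT wraps can be called directly; the 3×3 cost matrix below is illustrative:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Illustrative cost matrix: cost[i, j] = cost of assigning agent i to task j.
cost = np.array([[4, 1, 3],
                 [2, 0, 5],
                 [3, 2, 2]])

row_ind, col_ind = linear_sum_assignment(cost)   # Hungarian-style solver
print(col_ind)                                   # optimal task per agent: [1 0 2]
print(cost[row_ind, col_ind].sum())              # minimal total cost: 5
```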
Quadratic assignment problems (QAP) introduce interaction terms between assignments—for example, considering both the flow between facilities and the distance between their assigned locations. QAP is NP-hard and significantly more challenging, but critical for facility layout design, circuit board placement, and keyboard layout optimization. The QUAD_ASSIGN function provides heuristic solutions for these complex problems.
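A minimal sketch using SciPy's quadratic_assignment, the routine QUAD_ASSIGN builds on; the flow and distance matrices are illustrative:

```python
import numpy as np
from scipy.optimize import quadratic_assignment

# Illustrative QAP data: flow between three facilities, distances between three sites.
flow = np.array([[0, 5, 2],
                 [5, 0, 3],
                 [2, 3, 0]])
dist = np.array([[0, 8, 15],
                 [8, 0, 13],
                 [15, 13, 0]])

# 'faq' is a fast heuristic; since QAP is NP-hard, optimality is not guaranteed.
res = quadratic_assignment(flow, dist, method="faq")
print(res.col_ind, res.fun)   # site permutation and its flow-distance cost
```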
Global Optimization
Global optimization seeks the absolute minimum of a function over its entire domain, even when multiple local optima exist. This is essential when the cost landscape is complex, multimodal, or poorly understood.
Stochastic methods use randomness to escape local minima and explore the solution space broadly. Differential Evolution, implemented in DIFF_EVOLUTION, is a population-based algorithm that mutates and recombines candidate solutions. Dual annealing, available through DUAL_ANNEALING, combines classical simulated annealing with fast local search. Basin hopping, provided by BASIN_HOPPING, repeatedly applies random perturbations followed by local minimization to hop between basins of attraction.
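As a sketch of the underlying SciPy call for one of these methods, here is differential evolution on the Rastrigin function (an illustrative multimodal test case, not from the text):

```python
import numpy as np
from scipy.optimize import differential_evolution

# Rastrigin function: highly multimodal, a standard global-optimization benchmark.
def rastrigin(x):
    return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

bounds = [(-5.12, 5.12)] * 2
result = differential_evolution(rastrigin, bounds, seed=1)
print(result.x, result.fun)   # should land near the global minimum at the origin
```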
Deterministic methods provide systematic coverage or mathematical guarantees. SHGO uses simplicial homology to identify and explore all local minima in a bounded region. BRUTE performs exhaustive grid search, guaranteeing discovery of the global minimum within the discretization resolution—useful for low-dimensional problems or validating other methods.
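A sketch of both deterministic approaches on Himmelblau's function (an illustrative test case with four minimizers):

```python
import numpy as np
from scipy.optimize import shgo, brute

# Himmelblau's function: four distinct minimizers, all with the same value.
def himmelblau(x):
    return (x[0]**2 + x[1] - 11)**2 + (x[0] + x[1]**2 - 7)**2

bounds = ((-5, 5), (-5, 5))

res = shgo(himmelblau, bounds)
print(res.xl)    # every local minimizer found, one per row

# Exhaustive grid search over the same region; resolution set by Ns.
x_best = brute(himmelblau, bounds, Ns=50)
print(x_best)
```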
Linear Programming and MILP
Linear Programming (LP) handles problems where both the objective function and all constraints are linear. The feasible region forms a convex polyhedron, and optimal solutions always lie at the vertices. This elegant structure enables efficient solution via the Simplex method or Interior-point methods, both available through LINEAR_PROG.
LP is foundational in operations research: production planning, transportation logistics, diet optimization, and workforce scheduling are all naturally expressed as linear programs. The standard form minimizes $c^T x$ subject to $Ax \leq b$ and $x \geq 0$.
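A sketch of this standard form with SciPy's linprog, which LINEAR_PROG wraps; the small LP is illustrative:

```python
from scipy.optimize import linprog

# Maximize x0 + 2*x1 by minimizing its negative; variables are >= 0 by default.
c = [-1, -2]
A_ub = [[1, 1],    # x0 + x1 <= 4
        [1, 3]]    # x0 + 3*x1 <= 6
b_ub = [4, 6]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, method="highs")
print(res.x, -res.fun)   # optimal vertex [3, 1] with objective value 5
```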
Mixed-Integer Linear Programming (MILP) extends LP by requiring some variables to be integers. This seemingly small change makes the problem NP-hard but vastly increases modeling power. MILP can encode logical constraints, binary decisions, and discrete choices—enabling applications in scheduling, network design, and capital budgeting. The MILP function uses modern branch-and-bound algorithms.
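A sketch using scipy.optimize.milp on a tiny knapsack model (the data are illustrative):

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# Knapsack: pick items to maximize value subject to a weight budget of 10.
values = np.array([10, 13, 7])
weights = np.array([5, 8, 3])

c = -values                                        # milp minimizes, so negate
constraints = LinearConstraint(weights.reshape(1, -1), ub=10)
integrality = np.ones(3)                           # 1 marks an integer variable
bounds = Bounds(0, 1)                              # integer in [0, 1] => binary

res = milp(c, constraints=constraints, integrality=integrality, bounds=bounds)
print(res.x, -res.fun)   # chosen items [1, 0, 1] with total value 17
```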
Quadratic Programming (QP) minimizes a quadratic objective subject to linear constraints. This arises naturally in portfolio optimization (minimizing variance), control theory, and machine learning (support vector machines). CA_QUAD_PROG leverages CasADi’s automatic differentiation for efficient QP solving.
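A minimal sketch of a direct qpsol call of the kind CA_QUAD_PROG builds on; the toy objective and the choice of the bundled 'qrqp' plugin are assumptions (plugin availability varies by CasADi build):

```python
import casadi as ca

# Convex QP: minimize 0.5*x^2 + y^2 - x*y - 2*x subject to x + y <= 3.
x = ca.SX.sym("x")
y = ca.SX.sym("y")
qp = {"x": ca.vertcat(x, y),
      "f": 0.5 * x**2 + y**2 - x * y - 2 * x,
      "g": x + y}

solver = ca.qpsol("solver", "qrqp", qp)   # 'qrqp' ships with CasADi
sol = solver(lbg=-ca.inf, ubg=3)          # constraint: x + y <= 3
print(sol["x"])
```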
Local Optimization
Local optimization seeks a minimum within a neighborhood of the starting point. When the objective is smooth and unimodal, or when a good initial guess is available, local methods converge faster and more reliably than global alternatives.
Gradient-based methods exploit derivative information to take efficient descent steps. The BFGS algorithm approximates the Hessian to achieve superlinear convergence. Newton-CG uses conjugate gradients to solve the Newton equations efficiently. These and other gradient-based methods are accessible through MINIMIZE.
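A sketch with SciPy's minimize on the Rosenbrock function (an illustrative choice), supplying the analytic gradient that BFGS exploits:

```python
import numpy as np
from scipy.optimize import minimize

def rosen(x):
    return 100 * (x[1] - x[0]**2)**2 + (1 - x[0])**2

def rosen_grad(x):
    return np.array([-400 * x[0] * (x[1] - x[0]**2) - 2 * (1 - x[0]),
                     200 * (x[1] - x[0]**2)])

res = minimize(rosen, x0=[-1.2, 1.0], jac=rosen_grad, method="BFGS")
print(res.x)   # converges to the minimum at [1, 1]
```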
CasADi provides automatic differentiation, eliminating the need for manual gradient computation while maintaining numerical precision. CA_MINIMIZE leverages this capability for optimization problems defined by complex expressions.
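As a sketch, the same Rosenbrock problem via CasADi's nlpsol, with derivatives generated automatically; the 'ipopt' plugin is an assumption reflecting the solver bundled with CasADi's pip package (availability may vary):

```python
import casadi as ca

x = ca.SX.sym("x", 2)
f = (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2    # Rosenbrock, as an expression

# CasADi differentiates the expression exactly; no hand-coded gradient needed.
solver = ca.nlpsol("solver", "ipopt", {"x": x, "f": f})
sol = solver(x0=[-1.2, 1.0])
print(sol["x"])   # ~[1, 1]
```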
Derivative-free methods do not require gradients and are suitable for noisy, discontinuous, or “black box” objective functions. Nelder-Mead uses a simplex of test points that expands, contracts, and reflects to navigate the solution space. Powell’s method performs sequential line searches along conjugate directions. Both are available through MINIMIZE.
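A sketch of Nelder-Mead on a noisy "black box" (the toy function is illustrative):

```python
import numpy as np
from scipy.optimize import minimize

# Simulated black box: the small noise term defeats finite-difference gradients.
rng = np.random.default_rng(0)
def noisy(x):
    return (x[0] - 2)**2 + (x[1] + 1)**2 + 0.01 * rng.standard_normal()

res = minimize(noisy, x0=[0.0, 0.0], method="Nelder-Mead")
print(res.x)   # lands near [2, -1] despite the noise
```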
For single-variable problems, MINIMIZE_SCALAR provides specialized algorithms like Brent’s method, which combines golden-section search with parabolic interpolation.
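A sketch with an illustrative one-dimensional objective:

```python
from scipy.optimize import minimize_scalar

# Brent's method: derivative-free, combines golden-section search with parabolic steps.
res = minimize_scalar(lambda x: (x - 2)**2 + 1, method="brent")
print(res.x, res.fun)   # 2.0, 1.0

# Bounded variant restricts the search to an interval.
res = minimize_scalar(lambda x: (x - 2)**2 + 1, bounds=(0, 1), method="bounded")
print(res.x)   # ~1.0, the best point inside [0, 1]
```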
Root Finding
Root-finding solves $f(x) = 0$. While conceptually distinct from optimization, root-finding is intimately connected to it: the first-order optimality condition $\nabla f(x) = 0$ converts an optimization problem into a root-finding problem.
Scalar root-finding locates zeros of single-variable functions. Bracketing methods like bisection guarantee convergence by repeatedly halving an interval known to contain a root. Open methods like Newton-Raphson converge faster when derivatives are available and the initial guess is good. ROOT_SCALAR provides access to these and hybrid methods like Brent’s method.
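A sketch contrasting the two families on an illustrative cubic:

```python
from scipy.optimize import root_scalar

f = lambda x: x**3 - 2 * x - 5

# Bracketing: Brent's method is guaranteed to converge given a sign change.
res = root_scalar(f, bracket=[2, 3], method="brentq")
print(res.root)   # ~2.0946

# Open method: Newton-Raphson with the analytic derivative and a good guess.
res = root_scalar(f, x0=2.0, fprime=lambda x: 3 * x**2 - 2, method="newton")
print(res.root)
```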
Multidimensional root-finding solves systems of nonlinear equations $F(x) = 0$ where $F: \mathbb{R}^n \to \mathbb{R}^n$. Quasi-Newton methods like Broyden’s method build approximate Jacobians iteratively. Trust-region methods control step size to ensure robust convergence. ROOT implements these approaches using SciPy’s solvers, while CA_ROOT uses CasADi’s automatic differentiation to compute exact Jacobians.
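A sketch with SciPy's root on an illustrative two-equation system:

```python
from scipy.optimize import root

# F(x) = 0: intersection of a circle of radius 2 with the line x0 - x1 = 1.
def F(x):
    return [x[0]**2 + x[1]**2 - 4,
            x[0] - x[1] - 1]

sol = root(F, x0=[1.0, 0.0], method="hybr")      # Powell hybrid (default)
print(sol.x)

sol = root(F, x0=[1.0, 0.0], method="broyden1")  # quasi-Newton alternative
print(sol.x)
```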
The fixed-point iteration method, available through FIXED_POINT, finds $x$ such that $f(x) = x$. Any root-finding problem $g(x) = 0$ can be recast as a fixed-point problem, for example via $f(x) = x - g(x)$, and vice versa.
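A sketch using the classic $x = \cos(x)$ problem:

```python
import numpy as np
from scipy.optimize import fixed_point

# Solve x = cos(x); the default 'del2' method accelerates plain iteration.
x_star = fixed_point(np.cos, x0=0.5)
print(x_star)   # ~0.7391
```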
Native Excel capabilities
Excel provides several built-in optimization tools that are accessible but limited for production-scale problems:
Solver Add-in is Excel’s primary optimization tool, supporting linear programming, nonlinear programming, and integer programming. It implements the GRG Nonlinear algorithm for smooth nonlinear problems and Simplex LP for linear models. However, it is typically limited to 200 decision variables and lacks modern algorithms like interior-point methods or advanced MILP branch-and-cut techniques. For small to medium problems, Solver is adequate, but it struggles with large-scale or highly nonlinear models.
Goal Seek is a simplified tool for single-variable root finding. It uses a variant of the secant method to find an input value that produces a target output. While convenient for quick what-if analysis, it does not handle constraints or multiple variables and can fail on poorly behaved functions.
Analysis ToolPak includes regression analysis, which is fundamentally a least-squares optimization problem. However, it is limited to linear regression and lacks capabilities for regularization, robust regression, or nonlinear curve fitting.
Python functions offer significant advantages: they handle thousands of variables, access state-of-the-art algorithms from SciPy Optimize and CasADi, support automatic differentiation, and provide detailed convergence diagnostics. For production optimization workflows, Python-based solutions are substantially more powerful and flexible.
Third-party Excel add-ins
Frontline Solvers (Analytic Solver) is developed by the creators of Excel’s original Solver add-in. The premium versions dramatically increase problem size limits (up to millions of variables), add advanced algorithms for stochastic programming and global optimization, and support robust optimization under uncertainty. Frontline is the industry standard for sophisticated optimization within Excel.
OpenSolver is an open-source add-in that integrates Excel with powerful COIN-OR solvers, including CBC for integer programming and Bonmin for mixed-integer nonlinear programming. It removes the variable limits of native Solver and is particularly strong for large-scale linear and integer programs. OpenSolver is free and widely used in education and research.
Assignment Problems
| Tool | Description |
|---|---|
| LINEAR_ASSIGNMENT | Solve the linear assignment problem using scipy.optimize.linear_sum_assignment. |
| QUAD_ASSIGN | Solve a quadratic assignment problem using SciPy’s implementation. |
Global Optimization
| Tool | Description |
|---|---|
| BASIN_HOPPING | Minimize a single-variable expression with SciPy’s basinhopping algorithm. |
| BRUTE | Perform a brute-force grid search to approximate the global minimum of a function. |
| DIFF_EVOLUTION | Minimize a multivariate function using differential evolution. |
| DUAL_ANNEALING | Minimize a multivariate function using dual annealing. |
| SHGO | Find global minimum using Simplicial Homology Global Optimization. |
Linear Programming
| Tool | Description |
|---|---|
| CA_QUAD_PROG | Solve a quadratic programming problem using CasADi’s qpsol solver. |
| LINEAR_PROG | Solve a linear programming problem using SciPy’s linprog function. |
| MILP | Solve a mixed-integer linear program using scipy.optimize.milp. |
Local Optimization
| Tool | Description |
|---|---|
| CA_MINIMIZE | Minimize a multivariate function using CasADi with automatic differentiation. |
| MINIMIZE | Minimize a multivariate function using SciPy’s minimize routine. |
| MINIMIZE_SCALAR | Minimize a single-variable function using SciPy’s minimize_scalar. |
Root Finding
| Tool | Description |
|---|---|
| CA_ROOT | Solve a system of nonlinear equations using CasADi with automatic Jacobian. |
| FIXED_POINT | Find a fixed point x such that f(x) = x for a scalar function expression. |
| ROOT | Solve a square nonlinear system using SciPy’s root solver. |
| ROOT_SCALAR | Find a real root of a scalar function using SciPy’s root_scalar. |