Linear Programming
Overview
Linear programming (LP) is a mathematical optimization technique for maximizing or minimizing a linear objective function subject to linear equality and inequality constraints. First formalized by George Dantzig in 1947, LP has become one of the most widely applied mathematical methods in operations research, economics, engineering, and management science. Its power lies in a remarkable property: the feasible region defined by linear constraints forms a convex polyhedron, and whenever an optimal solution exists, one occurs at a vertex of this polyhedron. This geometric insight enables highly efficient algorithms that solve problems with thousands or even millions of variables.
The standard form of a linear program is:
\begin{aligned} \text{minimize} \quad & c^T x \\ \text{subject to} \quad & Ax \leq b \\ & x \geq 0 \end{aligned}
where x \in \mathbb{R}^n is the vector of decision variables, c \in \mathbb{R}^n defines the objective coefficients, A \in \mathbb{R}^{m \times n} is the constraint matrix, and b \in \mathbb{R}^m specifies the constraint bounds. Any LP can be transformed to this standard form through algebraic manipulation.
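For example, a maximization problem with a \geq constraint can be brought to standard form by negating both the objective and the constraint:
\begin{aligned} \text{maximize} \quad & 3x_1 + 2x_2 \\ \text{subject to} \quad & x_1 + x_2 \geq 4, \; x \geq 0 \end{aligned} \quad \Longleftrightarrow \quad \begin{aligned} \text{minimize} \quad & -3x_1 - 2x_2 \\ \text{subject to} \quad & -x_1 - x_2 \leq -4, \; x \geq 0 \end{aligned}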
Linear Programming (LP)
Classical LP applications are ubiquitous across industries. In production planning, LP determines the optimal mix of products to manufacture given resource constraints (labor, materials, machine time). In transportation and logistics, the transportation problem allocates shipments from warehouses to customers to minimize total shipping cost. The diet problem finds the least expensive combination of foods that meets nutritional requirements. Portfolio optimization in finance becomes a QP under the Markowitz mean-variance framework with linear constraints, or an LP when a linear risk measure (such as mean absolute deviation) replaces variance.
LP algorithms fall into two main categories:
Simplex method: Introduced by Dantzig, the simplex method walks along the edges of the feasible polyhedron from vertex to vertex, improving the objective at each step until reaching optimality. While it has exponential worst-case complexity, it performs exceptionally well in practice and is the default in many solvers. The revised simplex method uses matrix factorization for computational efficiency.
Interior-point methods: These algorithms traverse the interior of the feasible region rather than its boundary, achieving polynomial-time complexity guarantees. Karmarkar’s algorithm (1984) was a breakthrough, and modern primal-dual interior-point methods are particularly effective for large-scale problems. Interior-point methods often outperform simplex on massive LPs but provide less natural sensitivity analysis.
The LINEAR_PROG function provides access to both algorithm families via SciPy’s linprog, including the HiGHS solvers (dual simplex and interior point; the default and recommended choice) as well as SciPy’s legacy interior-point, revised simplex, and simplex methods.
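Independent of the LINEAR_PROG wrapper, a direct SciPy call looks like the following sketch. The production-mix data here is hypothetical; note that linprog minimizes, so a maximization objective is negated:

```python
import numpy as np
from scipy.optimize import linprog

# Production mix (hypothetical data): maximize 3*x1 + 5*x2 profit
c = [-3, -5]  # linprog minimizes, so negate the profit coefficients

# Resource constraints: x1 <= 4, 2*x2 <= 12, 3*x1 + 2*x2 <= 18
A_ub = [[1, 0], [0, 2], [3, 2]]
b_ub = [4, 12, 18]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None), method="highs")
print(res.x, -res.fun)  # optimal mix and the (un-negated) profit
```

Here `bounds=(0, None)` applies the non-negativity constraint x \geq 0 to every variable, and `res.status == 0` indicates the solver reached optimality.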
Mixed-Integer Linear Programming (MILP)
Mixed-integer linear programming (MILP) extends LP by requiring some or all decision variables to take integer values. This seemingly small modification transforms the problem from efficiently solvable to NP-hard—but it also dramatically expands the modeling power. MILP can encode binary decisions (build or don’t build a factory), logical constraints (if condition A, then condition B), fixed costs (purchasing equipment incurs a setup cost), and combinatorial structure (scheduling, routing, assignment).
Applications include:
- Scheduling: Assigning jobs to machines, crew scheduling for airlines, project planning with precedence constraints
- Network design: Selecting which facilities to open, designing telecommunications networks, supply chain configuration
- Capital budgeting: Choosing which projects to fund given budget and resource constraints
- Vehicle routing: The traveling salesman problem and its variants
- Cutting stock and bin packing: Minimizing waste when cutting raw materials or packing items into containers
MILP solvers use sophisticated branch-and-bound algorithms that partition the solution space into smaller subproblems, solving LP relaxations at each node. Modern enhancements include:
- Cutting planes: Adding constraints that eliminate fractional solutions without removing integer-feasible points
- Branch-and-cut: Combining branch-and-bound with cutting planes
- Preprocessing and heuristics: Tightening bounds, fixing variables, and finding good feasible solutions quickly
The MILP function uses SciPy’s milp solver, which is built on the HiGHS optimization suite. HiGHS implements state-of-the-art branch-and-cut with presolve, dual simplex, and parallel processing.
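As a library-level illustration (separate from the MILP wrapper), a small 0/1 knapsack can be posed directly to scipy.optimize.milp; the item values and weights below are made up:

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# 0/1 knapsack (hypothetical data): maximize value within a weight budget
values = np.array([8, 11, 6, 4])
weights = np.array([5, 7, 4, 3])
capacity = 14

res = milp(
    c=-values,  # milp minimizes, so negate to maximize total value
    constraints=LinearConstraint(weights[np.newaxis, :], ub=capacity),
    integrality=np.ones_like(values),  # require every variable to be integer
    bounds=Bounds(0, 1),               # integer in [0, 1] => binary
)
print(res.x, -res.fun)  # selected items and total value
```

Binary variables are expressed as integer variables bounded in [0, 1]; HiGHS handles the branch-and-cut search internally.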
Quadratic Programming (QP)
Quadratic programming minimizes a quadratic objective function subject to linear constraints:
\begin{aligned} \text{minimize} \quad & \frac{1}{2} x^T Q x + c^T x \\ \text{subject to} \quad & Ax \leq b \\ & A_{eq} x = b_{eq} \end{aligned}
where Q \in \mathbb{R}^{n \times n} is the Hessian matrix of the objective. When Q is positive semidefinite, the problem is convex and can be solved efficiently using variants of interior-point or active-set methods.
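For the purely equality-constrained case, the first-order (KKT) optimality conditions reduce to one linear system, which makes a minimal NumPy sketch possible without any QP solver (the data below is illustrative):

```python
import numpy as np

# Minimize 0.5*x^T Q x + c^T x  subject to  A_eq x = b_eq.
# KKT system: [Q  A^T; A  0] [x; lam] = [-c; b_eq]
Q = np.array([[2.0, 0.0], [0.0, 2.0]])
c = np.array([-2.0, -4.0])
A_eq = np.array([[1.0, 1.0]])
b_eq = np.array([1.0])

n, m = Q.shape[0], A_eq.shape[0]
kkt = np.block([[Q, A_eq.T], [A_eq, np.zeros((m, m))]])
rhs = np.concatenate([-c, b_eq])
sol = np.linalg.solve(kkt, rhs)
x, lam = sol[:n], sol[n:]
print(x, lam)  # minimizer and Lagrange multiplier of the equality constraint
```

Inequality constraints are what make general QP harder: active-set methods repeatedly solve such KKT systems while guessing which inequalities hold with equality.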
QP arises naturally in:
- Portfolio optimization: The Markowitz model minimizes portfolio variance x^T \Sigma x (where \Sigma is the covariance matrix) while meeting return targets and budget constraints
- Support vector machines (SVM): Training linear SVMs requires solving a QP where the dual variables correspond to support vectors
- Model predictive control (MPC): Control systems solve a QP at each time step to minimize tracking error and control effort over a finite horizon
- Least-squares with constraints: Fitting models with inequality or equality constraints (e.g., coefficients must sum to one, or be non-negative)
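The portfolio case above can be sketched with SciPy's general-purpose SLSQP solver, used here as a stand-in rather than the CasADi backend; the three-asset covariance matrix and return figures are invented for illustration:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical 3-asset covariance matrix and expected returns
Sigma = np.array([[0.10, 0.02, 0.01],
                  [0.02, 0.08, 0.03],
                  [0.01, 0.03, 0.12]])
mu = np.array([0.06, 0.05, 0.09])
target_return = 0.06

res = minimize(
    lambda w: w @ Sigma @ w,   # portfolio variance x^T Sigma x
    x0=np.full(3, 1 / 3),
    method="SLSQP",
    bounds=[(0, 1)] * 3,       # long-only weights
    constraints=[
        {"type": "eq", "fun": lambda w: w.sum() - 1},               # fully invested
        {"type": "ineq", "fun": lambda w: mu @ w - target_return},  # return target
    ],
)
print(res.x, res.fun)  # minimum-variance weights meeting the target
```

A dedicated QP solver exploits the quadratic structure directly and scales far better, but the formulation itself is identical.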
The CA_QUAD_PROG function solves QP problems using CasADi’s qpsol interface, which wraps high-performance solvers like OSQP, qpOASES, and CPLEX. CasADi’s automatic differentiation framework makes it particularly convenient when the QP is embedded within a larger optimization or control problem.
Duality and Sensitivity Analysis
A fundamental concept in LP is duality. Every LP (the primal) has an associated dual LP whose optimal value equals the primal’s whenever either problem has a finite optimum (strong duality). The dual variables (called shadow prices or Lagrange multipliers) quantify the marginal value of relaxing each constraint by one unit. This provides critical economic interpretation: in production planning, shadow prices reveal the value of acquiring additional resources.
Sensitivity analysis examines how the optimal solution changes as parameters vary—for instance, how much can a cost coefficient change before the optimal solution changes? Modern LP solvers provide sensitivity reports automatically, a feature less mature in MILP (where the discrete structure makes sensitivity analysis more complex).
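With SciPy's HiGHS backend, the dual values come back alongside the primal solution in the result's `ineqlin.marginals` field. A sketch with made-up resource data (note SciPy reports marginals under its minimization convention, so for the negated maximization model below they come out non-positive and are flipped back):

```python
import numpy as np
from scipy.optimize import linprog

# Maximize 40*x1 + 30*x2 profit (minimize the negation)
c = [-40, -30]
A_ub = [[1, 1],   # e.g. labor hours:   x1 + x2   <= 12
        [2, 1]]   # e.g. machine hours: 2*x1 + x2 <= 16
b_ub = [12, 16]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None), method="highs")
shadow_prices = -res.ineqlin.marginals  # flip sign back to the maximization view
print(res.x, shadow_prices)  # one more labor hour is worth 20; one machine hour, 10
```

Because both constraints are binding at the optimum, both shadow prices are strictly positive; a slack constraint would have a shadow price of zero.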
Computational Considerations
Problem scale: Modern LP solvers routinely handle problems with millions of variables and constraints. MILP is more limited due to NP-hardness, but instances with thousands of integer variables are often tractable.
Sparsity: Real-world constraint matrices are typically sparse (mostly zeros). Efficient LP implementations exploit this using sparse matrix data structures and factorization techniques, dramatically reducing memory and computation.
Preprocessing: Solvers perform extensive preprocessing to simplify problems before solving—removing redundant constraints, fixing variables, tightening bounds. This can reduce problem size by orders of magnitude.
Numerical stability: Poorly scaled problems (e.g., coefficients ranging from 10^{-8} to 10^8) can cause numerical difficulties. Good modeling practice keeps coefficients in a moderate range, typically between 10^{-3} and 10^3.
Native Excel capabilities
Excel’s Solver Add-in provides LP and MILP solving using the Simplex LP algorithm for linear problems and a branch-and-bound method for integer constraints. It is adequate for small problems (typically up to 200 variables) and offers a user-friendly interface with automatic sensitivity reports.
However, Excel Solver has significant limitations:
- Scale: The standard version is limited to 200 decision variables, far below industrial problem sizes
- Algorithms: It uses older simplex implementations without the performance optimizations of modern solvers like HiGHS or Gurobi
- Robustness: It can struggle with numerical instability or fail to converge on difficult MILP instances
- QP support: Native Solver does not directly support quadratic programming (though the GRG Nonlinear engine can handle it as a nonlinear problem, less efficiently)
For educational use or quick prototypes, Excel Solver is valuable. For production systems, research, or problems exceeding a few hundred variables, Python-based solvers offer orders-of-magnitude better performance and reliability.
Third-party Excel add-ins
Frontline Solvers (Analytic Solver) dramatically extends Excel’s optimization capabilities. The premium versions increase limits to millions of variables, add advanced algorithms for stochastic and robust optimization, and include global optimization for nonconvex MILP. Frontline’s Premium Solver Platform integrates commercial-grade solvers like Gurobi, CPLEX, and Xpress, making them accessible within Excel. This is the industry standard for sophisticated optimization in finance, energy, and logistics.
OpenSolver is an open-source add-in that brings COIN-OR solvers to Excel, particularly CBC (branch-and-cut for MILP) and Bonmin (mixed-integer nonlinear). It removes Excel Solver’s variable limits and provides competitive performance for LP and MILP at no cost. OpenSolver is widely used in academia and by organizations seeking industrial-strength optimization without licensing fees.
SolverStudio goes further by embedding Python, Julia, or other languages directly in Excel, allowing users to formulate optimization models using powerful modeling languages like PuLP, Pyomo, or JuMP. This bridges the gap between Excel’s familiarity and the flexibility of modern optimization ecosystems.
Tools
| Tool | Description |
|---|---|
| CA_QUAD_PROG | Solve a quadratic programming problem using CasADi’s qpsol solver. |
| LINEAR_PROG | Solve a linear programming problem using SciPy’s linprog function. |
| MILP | Solve a mixed-integer linear program using scipy.optimize.milp. |