Global Optimization

Overview

Global optimization seeks the absolute minimum (or maximum) of an objective function over its entire feasible domain, even when the landscape contains multiple local optima, plateaus, and discontinuities. Unlike local optimization methods that converge to the nearest stationary point, global optimizers must systematically explore or intelligently sample the search space to identify the true global optimum. This challenge appears across diverse fields: molecular modeling (protein folding), machine learning (hyperparameter tuning), engineering design (structural optimization), and financial modeling (portfolio optimization with constraints).

The fundamental difficulty is that certifying a solution as globally optimal is generally intractable for non-convex problems. Most practical global optimization algorithms balance exploration (searching new regions) with exploitation (refining promising candidates) and can only guarantee convergence under specific smoothness or sampling conditions.

Stochastic Global Optimization

Stochastic optimization methods introduce randomness to escape local minima and explore the search space probabilistically. These methods are particularly effective for black-box functions where derivatives are unavailable or unreliable.

Differential Evolution maintains a population of candidate solutions and evolves them through mutation and crossover operations. At each generation, it forms trial vectors by adding scaled differences between population members to a base vector and mixing the result with the current candidate, accepting the trial only when it improves the objective. The DIFF_EVOLUTION function implements this evolutionary strategy with control over mutation rates, crossover probabilities, and population size.
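
As an illustration of the technique, here is a minimal sketch using SciPy’s differential_evolution (presumably what DIFF_EVOLUTION wraps) on the Rastrigin test function; the test function and parameter values are illustrative choices, not defaults of the add-in:

    import numpy as np
    from scipy.optimize import differential_evolution

    def rastrigin(x):
        # Many regularly spaced local minima; the global minimum is f = 0 at the origin.
        return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

    result = differential_evolution(
        rastrigin,
        bounds=[(-5.12, 5.12)] * 2,
        mutation=(0.5, 1.0),   # dithering range for the mutation factor F
        recombination=0.7,     # crossover probability CR
        popsize=15,            # multiplier for the total population size
        seed=0,                # reproducible run
    )
    print(result.x, result.fun)   # expect x near [0, 0], f near 0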

Simulated Annealing draws inspiration from metallurgical annealing processes. It probabilistically accepts worse solutions with a temperature-dependent probability, allowing uphill moves early in the search while gradually “cooling” to converge on a minimum. The DUAL_ANNEALING function combines generalized simulated annealing with periodic local refinement, balancing global exploration through temperature-controlled jumps with deterministic polishing steps.
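
A comparable sketch with SciPy’s dual_annealing, here applied to the Ackley test function (an illustrative choice), with two of its temperature parameters shown at their documented defaults:

    import numpy as np
    from scipy.optimize import dual_annealing

    def ackley(x):
        # Nearly flat outer region with a ripple structure; global minimum f = 0 at the origin.
        return (-20.0 * np.exp(-0.2 * np.sqrt(0.5 * np.sum(x**2)))
                - np.exp(0.5 * np.sum(np.cos(2 * np.pi * x)))
                + 20.0 + np.e)

    result = dual_annealing(
        ackley,
        bounds=[(-5.0, 5.0)] * 2,
        initial_temp=5230.0,       # documented default; higher values widen early jumps
        restart_temp_ratio=2e-05,  # reanneal once the temperature decays to this fraction
        seed=0,
    )
    print(result.x, result.fun)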

Basin-Hopping alternates between local minimization and random perturbation. After finding a local minimum, it “hops” by randomly perturbing that minimum’s coordinates and minimizing again, gradually mapping the energy landscape. The BASIN_HOPPING function implements this iterative strategy to escape local basins and discover lower-energy configurations.
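
Since the tools table below notes that BASIN_HOPPING builds on SciPy’s basinhopping, here is a minimal sketch of that routine on a one-dimensional function with several local minima:

    import numpy as np
    from scipy.optimize import basinhopping

    def f(x):
        # One-dimensional function with several local minima.
        return np.cos(14.5 * x[0] - 0.3) + (x[0] + 0.2) * x[0]

    result = basinhopping(
        f,
        x0=[1.0],
        niter=100,      # number of perturb-and-minimize cycles
        stepsize=0.5,   # scale of the random coordinate perturbation
        minimizer_kwargs={"method": "L-BFGS-B"},
        seed=0,
    )
    print(result.x, result.fun)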

Deterministic Global Optimization

Deterministic methods provide systematic coverage of the search space and, in some cases, theoretical guarantees of convergence to the global optimum.

Simplicial Homology Global Optimization (SHGO) uses low-discrepancy sequences (like Sobol sequences) to systematically sample the domain. It constructs a simplicial complex and applies topological methods from Morse theory to identify all local minima and prove convergence to the global minimum under Lipschitz continuity conditions. The SHGO function exposes this powerful algorithm with control over sampling density and refinement iterations.
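
A sketch of SciPy’s shgo, applied here to the Himmelblau function (an illustrative choice with four global minima of equal depth) to show both the returned global minimizer and the full list of local minima the complex identifies:

    import numpy as np
    from scipy.optimize import shgo

    def himmelblau(x):
        # Four global minima of equal depth (f = 0).
        return (x[0]**2 + x[1] - 11)**2 + (x[0] + x[1]**2 - 7)**2

    result = shgo(
        himmelblau,
        bounds=[(-6.0, 6.0)] * 2,
        n=64,                     # sampling points per iteration
        iters=3,                  # refinement iterations
        sampling_method="sobol",  # low-discrepancy Sobol sampling
    )
    print(result.x, result.fun)   # one global minimizer
    print(result.xl)              # every local minimum located by the complex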

Brute-Force Grid Search exhaustively evaluates the objective function over a regularly spaced grid. While computationally expensive, it samples the entire domain at a known resolution, so for functions that vary slowly relative to the grid spacing the best sampled point lies close to the global minimum. The BRUTE function performs this complete enumeration, providing a baseline for comparison and validation when the dimensionality is low.
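
A sketch of SciPy’s brute on an illustrative two-dimensional objective; Ns fixes the per-axis grid density, and finish=None suppresses the default local polish so the raw grid result is visible:

    import numpy as np
    from scipy.optimize import brute

    def f(x):
        # Smooth bowl perturbed by a sinusoid.
        return (x[0] - 0.3)**2 + (x[1] + 0.1)**2 + 0.5 * np.sin(5.0 * x[0])

    xmin = brute(
        f,
        ranges=((-2.0, 2.0), (-2.0, 2.0)),
        Ns=41,        # 41 points per axis, i.e. a grid spacing of 0.1
        finish=None,  # report the best raw grid point; skip the local polish
    )
    print(xmin)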

Choosing an Algorithm

The choice of global optimization method depends on several problem characteristics (see the comparison sketch after this list):

  • Smoothness: If the function is differentiable, DUAL_ANNEALING and BASIN_HOPPING can leverage local gradient-based polishing. For non-smooth or noisy functions, DIFF_EVOLUTION is more robust.
  • Dimension: BRUTE works only for low-dimensional problems (typically 1-4 variables). DIFF_EVOLUTION and DUAL_ANNEALING scale to dozens of dimensions. SHGO is most efficient in moderate dimensions (2-10) where deterministic sampling is practical.
  • Budget: If function evaluations are expensive, SHGO offers efficient coverage. If evaluations are cheap, DIFF_EVOLUTION with large populations can be effective.
  • Guarantees: SHGO provides theoretical convergence guarantees for Lipschitz-continuous functions. Stochastic methods offer probabilistic convergence.
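
To make these trade-offs concrete, a rough comparison harness (assuming SciPy and the Rastrigin test function, both illustrative choices) can run the four scalable methods on one problem and report the evaluation budget each consumes:

    import numpy as np
    from scipy.optimize import (basinhopping, differential_evolution,
                                dual_annealing, shgo)

    def rastrigin(x):
        return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

    bounds = [(-5.12, 5.12)] * 2
    runs = {
        "differential_evolution": lambda: differential_evolution(rastrigin, bounds, seed=0),
        "dual_annealing": lambda: dual_annealing(rastrigin, bounds, seed=0),
        "shgo": lambda: shgo(rastrigin, bounds, n=64, iters=3, sampling_method="sobol"),
        "basinhopping": lambda: basinhopping(rastrigin, x0=np.array([2.0, 2.0]),
                                             niter=50, seed=0),
    }
    for name, run in runs.items():
        res = run()
        # nfev = total objective evaluations consumed by the run
        print(f"{name:24s} f={res.fun:.3g} evals={getattr(res, 'nfev', 'n/a')}")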

Hybrid Strategies

Modern global optimization often combines methods. Polishing refers to applying a local optimizer (like L-BFGS-B or Nelder-Mead) to refine solutions found by global search. Both DIFF_EVOLUTION and DUAL_ANNEALING support optional polishing to ensure high-precision final solutions.
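
A sketch of this two-stage pattern in SciPy: run differential_evolution with its built-in polish disabled, then refine the best point with bounded L-BFGS-B, the same local method the polish step uses by default:

    import numpy as np
    from scipy.optimize import differential_evolution, minimize

    def rastrigin(x):
        return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

    bounds = [(-5.12, 5.12)] * 2

    # Global search with the built-in polish disabled ...
    rough = differential_evolution(rastrigin, bounds, polish=False, seed=0)
    # ... then refine by hand with a bounded local optimizer.
    polished = minimize(rastrigin, rough.x, method="L-BFGS-B", bounds=bounds)
    print(rough.fun, "->", polished.fun)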

Multi-start methods run a local optimizer from multiple random initial points, effectively sampling the basin structure. BASIN_HOPPING formalizes this with a Metropolis-like acceptance criterion.
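
A bare multi-start sketch, assuming SciPy’s minimize with 30 uniform random starts (both illustrative choices):

    import numpy as np
    from scipy.optimize import minimize

    def rastrigin(x):
        return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

    rng = np.random.default_rng(0)
    bounds = [(-5.12, 5.12)] * 2

    # Run a bounded local optimizer from 30 uniform random starting points
    # and keep the best local minimum found.
    results = [minimize(rastrigin, rng.uniform(-5.12, 5.12, size=2),
                        method="L-BFGS-B", bounds=bounds)
               for _ in range(30)]
    best = min(results, key=lambda r: r.fun)
    print(best.x, best.fun)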

Native Excel Capabilities

Excel’s built-in Solver Add-in includes a GRG Nonlinear solving method and an Evolutionary solver:

  • GRG Nonlinear uses the Generalized Reduced Gradient algorithm, which is a local method that can get trapped in local optima.
  • Evolutionary Solver applies a genetic algorithm to search globally. It is suitable for non-smooth, discontinuous, or discrete problems but is limited to 200 decision variables and can be slow for high-dimensional or expensive objective functions.

Critical Limitations: The Evolutionary Solver lacks advanced features like adaptive mutation, temperature schedules, or deterministic sampling strategies. It does not expose algorithmic parameters (mutation rate, crossover probability) and provides limited control over convergence criteria. For research-grade or large-scale global optimization, Python-based methods like those in SciPy offer superior performance and flexibility.


Tools

Tool            Description
BASIN_HOPPING   Minimize a single-variable expression with SciPy’s basinhopping algorithm.
BRUTE           Perform a brute-force grid search to approximate the global minimum of a function.
DIFF_EVOLUTION  Minimize a multivariate function using differential evolution.
DUAL_ANNEALING  Minimize a multivariate function using dual annealing.
SHGO            Find the global minimum using Simplicial Homology Global Optimization.