SHGO

Overview

The SHGO function finds the global minimum of an objective function using Simplicial Homology Global Optimization, a deterministic global optimization algorithm particularly well-suited for low-dimensional problems. Unlike heuristic methods such as genetic algorithms or simulated annealing, SHGO provides theoretical guarantees of convergence to the global minimum in finite time when the default simplicial sampling method is used.

This implementation uses the scipy.optimize.shgo function from SciPy. The algorithm was introduced by Endres, Sandrock, and Focke in their 2018 paper “A simplicial homology algorithm for Lipschitz optimisation” published in the Journal of Global Optimization.

SHGO works by constructing a simplicial complex—a geometric structure built from points, line segments, triangles, and their higher-dimensional analogues—to partition the search space. The algorithm uses concepts from algebraic topology (specifically, homology groups) to identify locally convex subdomains of the objective function, each potentially containing a unique local minimum. By analyzing the topological structure of the function landscape, SHGO can systematically locate all local minima and determine which is globally optimal.
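
As an informal illustration (not part of the Excel function), the sketch below calls scipy.optimize.shgo directly on a simple multimodal function and inspects the xl and funl attributes of the result, which list every local minimum the algorithm mapped alongside the global one.

import numpy as np
from scipy.optimize import shgo

def f(x):
    # A multimodal surface: several local minima, one global minimum.
    return np.sin(3 * x[0]) + 0.1 * x[0] ** 2 + np.cos(2 * x[1]) + 0.1 * x[1] ** 2

res = shgo(f, bounds=[(-5, 5), (-5, 5)], sampling_method='simplicial')

print("global minimum:", res.x, res.fun)
print("all local minima found:")
for xl, fl in zip(res.xl, res.funl):
    print("  ", xl, fl)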

The algorithm supports three sampling methods (a brief comparison sketch follows the list):

  • Simplicial (default): Provides guaranteed convergence to the global minimum; generates 2^{dim} + 1 sampling points by default
  • Sobol: Uses Sobol’ sequences for quasi-random sampling; faster but without convergence guarantees
  • Halton: Uses Halton sequences for low-discrepancy sampling; also faster but without guaranteed convergence
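
For reference, the hedged sketch below (not part of the Excel function) runs the same problem with each of the three sampling_method values by calling scipy.optimize.shgo directly; the exact sample counts and results depend on n and the SciPy version.

from scipy.optimize import shgo

def rosen(x):
    # Rosenbrock function: global minimum at (1, 1) with value 0.
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

bounds = [(-2, 2), (-2, 2)]

for method in ('simplicial', 'sobol', 'halton'):
    res = shgo(rosen, bounds, n=64, iters=2, sampling_method=method)
    print(f"{method:>10}: x = {res.x}, f(x) = {res.fun:.6f}")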

The general optimization problem solved by SHGO is:

\min_{x \in \Omega} f(x) \quad \text{subject to} \quad g_i(x) \geq 0, \; h_j(x) = 0

where f(x) is the objective function, g_i(x) are inequality constraints, and h_j(x) are equality constraints. While theoretical guarantees apply to Lipschitz smooth functions, SHGO also converges for non-continuous, non-convex, and non-smooth functions when using simplicial sampling.
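
The Excel wrapper documented below exposes only the bound-constrained form, but the underlying scipy.optimize.shgo accepts general constraints in the same dictionary format used by scipy.optimize.minimize. A minimal sketch, assuming a single hypothetical inequality constraint x[0] + x[1] - 1 >= 0:

from scipy.optimize import shgo

def f(x):
    return (x[0] - 1) ** 2 + (x[1] + 2) ** 2

# Inequality constraint g(x) = x0 + x1 - 1 >= 0, as a scipy.optimize-style dict.
cons = ({'type': 'ineq', 'fun': lambda x: x[0] + x[1] - 1},)

res = shgo(f, bounds=[(-5, 5), (-5, 5)], constraints=cons)
print(res.x, res.fun)  # the constrained minimum should be near (2, -1) with f = 2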

This example function is provided as-is without any representation of accuracy.

Excel Usage

=SHGO(func_expr, bounds, n, iters, sampling_method)
  • func_expr (str, required): Objective function expression using x[i] notation for variables.
  • bounds (list[list], required): 2D list of [min, max] pairs defining variable bounds.
  • n (int, optional, default: 100): Number of initial sampling points.
  • iters (int, optional, default: 1): Number of refinement iterations.
  • sampling_method (str, optional, default: “simplicial”): Sampling strategy for the optimization.

Returns (list[list]): 2D list [[x1, x2, …, objective]], or error message string.

Examples

Example 1: Demo case 1

Inputs:

  • func_expr: "(x[0]-1)**2 + (x[1]+2)**2"
  • bounds: [[-5, 5], [-5, 5]]
  • n: 120
  • iters: 3
  • sampling_method: "simplicial"

Excel formula:

=SHGO("(x[0]-1)**2 + (x[1]+2)**2", {-5,5;-5,5}, 120, 3, "simplicial")

Expected output:

  x[0]    x[1]    objective
  1       -2      0

Example 2: Demo case 2

Inputs:

  • func_expr: "(x[0]-2)^2 + (x[1]-1)^2"
  • bounds: [[-6, 6], [-6, 6]]
  • n: 50
  • iters: 2
  • sampling_method: "simplicial"

Excel formula:

=SHGO("(x[0]-2)^2 + (x[1]-1)^2", {-6,6;-6,6}, 50, 2, "simplicial")

Expected output:

  x[0]    x[1]    objective
  2       1       0

Example 3: Demo case 3

Inputs:

  • func_expr: "(x[0]-1)**2 + (x[1]-0.5)**2 + (x[2]+1)**2"
  • bounds: [[-4, 4], [-4, 4], [-4, 4]]
  • n: 120
  • iters: 3
  • sampling_method: "simplicial"

Excel formula:

=SHGO("(x[0]-1)**2 + (x[1]-0.5)**2 + (x[2]+1)**2", {-4,4;-4,4;-4,4}, 120, 3, "simplicial")

Expected output:

  x[0]    x[1]    x[2]    objective
  1       0.5     -1      0

Example 4: Demo case 4

Inputs:

  • func_expr: "sin(x[0]) + cos(x[1])"
  • bounds: [[-3.14159, 3.14159], [-3.14159, 3.14159]]
  • n: 60
  • iters: 2
  • sampling_method: "simplicial"

Excel formula:

=SHGO("sin(x[0]) + cos(x[1])", {-3.14159,3.14159;-3.14159,3.14159}, 60, 2, "simplicial")

Expected output:

  x[0]      x[1]      objective
  -1.5708   -3.1416   -2

Python Code

import math
import re

import numpy as np
from scipy.optimize import shgo as scipy_shgo

def shgo(func_expr, bounds, n=100, iters=1, sampling_method='simplicial'):
    """
    Find global minimum using Simplicial Homology Global Optimization.

    See: https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.shgo.html

    This example function is provided as-is without any representation of accuracy.

    Args:
        func_expr (str): Objective function expression using x[i] notation for variables.
        bounds (list[list]): 2D list of [min, max] pairs defining variable bounds.
        n (int, optional): Number of initial sampling points. Default is 100.
        iters (int, optional): Number of refinement iterations. Default is 1.
        sampling_method (str, optional): Sampling strategy for the optimization. Valid options: 'simplicial', 'sobol', 'halton' (case-insensitive). Default is 'simplicial'.

    Returns:
        list[list]: 2D list [[x1, x2, ..., objective]], or error message string.
    """
    def to2d(x):
        return [[x]] if not isinstance(x, list) else x

    if not isinstance(func_expr, str):
        return "Invalid input: func_expr must be a string."
    if func_expr.strip() == "":
        return "Invalid input: func_expr must be a non-empty string."

    # Allow Excel-style '^' exponentiation by translating it to Python's '**'.
    func_expr = re.sub(r'\^', '**', func_expr)

    if not re.search(r'\bx\b', func_expr):
        return "Invalid input: func_expr must reference variable x (e.g., x[0])."

    bounds = to2d(bounds)

    if not isinstance(bounds, list) or len(bounds) == 0:
        return "Invalid input: bounds must be a 2D list of [min, max] pairs."

    processed_bounds = []
    for idx, bound in enumerate(bounds):
        if not isinstance(bound, list) or len(bound) != 2:
            return "Invalid input: each bound must be a [min, max] pair."
        try:
            lower = float(bound[0])
            upper = float(bound[1])
        except (TypeError, ValueError):
            return "Invalid input: bounds must contain numeric values."
        if lower > upper:
            return f"Invalid input: lower bound must not exceed upper bound for variable index {idx}."
        processed_bounds.append((lower, upper))

    dimension = len(processed_bounds)

    try:
        n = int(n)
        iters = int(iters)
    except (TypeError, ValueError):
        return "Invalid input: n and iters must be integers."
    if n <= 0:
        return "Invalid input: n must be positive."
    if iters <= 0:
        return "Invalid input: iters must be positive."

    # Normalize and validate the sampling method (case-insensitive).
    valid_methods = {'simplicial', 'sobol', 'halton'}
    if not isinstance(sampling_method, str) or sampling_method.strip().lower() not in valid_methods:
        return f"Invalid input: sampling_method must be one of: {', '.join(sorted(valid_methods))}"
    sampling_method = sampling_method.strip().lower()

    # Restricted evaluation namespace: math/NumPy functions only, no builtins.
    safe_globals = {
        "math": math,
        "np": np,
        "numpy": np,
        "__builtins__": {},
    }
    safe_globals.update({
        name: getattr(math, name)
        for name in dir(math)
        if not name.startswith("_")
    })
    safe_globals.update({
        "sin": np.sin,
        "cos": np.cos,
        "tan": np.tan,
        "asin": np.arcsin,
        "arcsin": np.arcsin,
        "acos": np.arccos,
        "arccos": np.arccos,
        "atan": np.arctan,
        "arctan": np.arctan,
        "sinh": np.sinh,
        "cosh": np.cosh,
        "tanh": np.tanh,
        "exp": np.exp,
        "log": np.log,
        "ln": np.log,
        "log10": np.log10,
        "sqrt": np.sqrt,
        "abs": np.abs,
        "pow": np.power,
        "pi": math.pi,
        "e": math.e,
    })

    # Sanity-check the expression once at the midpoint of the bounds before optimizing.
    x0_vector = [np.mean([b[0], b[1]]) for b in processed_bounds]

    try:
        initial_eval = eval(func_expr, safe_globals, {"x": x0_vector})
        float(initial_eval)
    except Exception as exc:
        return f"Error: Invalid expression at initial guess: {exc}"

    def objective(x_vector):
        # SciPy passes ndarray points; evaluate the expression and require a finite float.
        try:
            local_x = [float(val) for val in np.atleast_1d(x_vector)]
            result = eval(func_expr, safe_globals, {"x": local_x})
            numeric_value = float(result)
        except Exception as exc:
            raise ValueError(f"Error evaluating func_expr: {exc}")
        if not math.isfinite(numeric_value):
            raise ValueError("Objective evaluation produced NaN or infinity.")
        return numeric_value

    try:
        result = scipy_shgo(
            objective,
            bounds=processed_bounds,
            n=n,
            iters=iters,
            sampling_method=sampling_method,
        )
    except Exception as exc:
        return f"Error during SHGO: {exc}"

    if not result.success:
        return f"Error during SHGO: {result.message}"

    if result.x is None or result.fun is None:
        return "Error during SHGO: missing solution data."

    try:
        solution_vector = [float(val) for val in np.atleast_1d(result.x)]
    except (TypeError, ValueError):
        return "Error during SHGO: could not convert solution to floats."

    objective_value = float(result.fun)
    return [solution_vector + [objective_value]]
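
A quick smoke test of the wrapper from plain Python, mirroring Example 1 above (the returned floats may differ slightly from the rounded values shown in the examples):

if __name__ == "__main__":
    out = shgo("(x[0]-1)**2 + (x[1]+2)**2", [[-5, 5], [-5, 5]],
               n=120, iters=3, sampling_method="simplicial")
    print(out)  # expected to be close to [[1.0, -2.0, 0.0]]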

Online Calculator