The Simplex Algorithm, the engine of mathematical optimization
Since its introduction by George B. Dantzig in 1947, and its publication shortly after (Dantzig, 1951), the Simplex Algorithm has been regarded as one of the most influential advances in scientific computing and strategic decision making. Its importance lies in its ability to solve linear optimization problems efficiently, finding the optimal value (maximum or minimum) of a function subject to multiple resource constraints.
The Simplex Algorithm intelligently navigates the vertices of the feasible region, guaranteeing that if an optimal solution exists, it will be reached in a finite number of steps.
To follow the procedure explained below, it is useful to recall the basic concepts presented in the post The anatomy of a linear optimization problem.
The Explicit System: The Basis of the Algorithm
For the algorithm to evaluate whether a solution is optimal or how to improve it, the original problem must be transformed into an explicit system associated with a basis $\mathbf{B}$. We start from the standard form:
\[\begin{aligned} \min \;\; & z = \mathbf{c}^\intercal \mathbf{x} \\ \text{s. t.:}\;\; & \mathbf{A}\mathbf{x} = \mathbf{b} \\ & \mathbf{x} \geqslant \mathbf{0}. \end{aligned}\]Dividing the variables into basic \(\mathbf{x}_\mathbf{B}\) and non-basic \(\mathbf{x}_{\mathbf{N}}\), and multiplying by the inverse of the basis \(\mathbf{B}^{-1}\), we obtain the fundamental elements:
- Indices of basic components:

\[\mathcal{I} = \big\{s \in \{1,\ldots,n\}: \mathbf{a}_s \in \mathbf{B} \big\}\]

- Indices of non-basic components:

\[\mathcal{J} = \big\{j \in \{1,\ldots,n\}: \mathbf{a}_j \in \mathbf{N} \big\}\]

- Equations for the basic variables:

\[\mathbf{x}_\mathbf{B} = \bar{\mathbf{x}}_\mathbf{B} - \displaystyle \sum_{j \in \mathcal{J}} \mathbf{y}_j x_j\]

where $\bar{\mathbf{x}}_\mathbf{B} = \mathbf{B}^{-1}\mathbf{b}$ is the current basic solution and $\mathbf{y}_j = \mathbf{B}^{-1}\mathbf{a}_j$.

- Objective function:

\[z = \bar{z} - \displaystyle \sum_{j \in \mathcal{J}} (z_j - c_j)x_j\]

where $\bar{z}$ represents the current value of the objective function and the quantity $(c_j - z_j)$ indicates how much the objective increases (or decreases) for each unit increase in the non-basic variable $x_j$.
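These quantities can be computed directly for small instances. Below is a minimal Python sketch using exact fraction arithmetic; the function names and the toy instance are illustrative choices of mine, not part of the post:

```python
from fractions import Fraction

def solve(B, rhs):
    """Solve B y = rhs by Gauss-Jordan elimination with exact fractions."""
    n = len(B)
    M = [row[:] + [r] for row, r in zip(B, rhs)]
    for col in range(n):
        piv = next(r for r in range(col, n) if M[r][col] != 0)
        M[col], M[piv] = M[piv], M[col]
        M[col] = [v / M[col][col] for v in M[col]]
        for r in range(n):
            if r != col and M[r][col] != 0:
                f = M[r][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[col])]
    return [M[r][n] for r in range(n)]

def explicit_system(A, b, c, basic):
    """Return (x_bar, {j: y_j}, {j: z_j - c_j}) for the basis whose
    column indices are listed in `basic`."""
    m, n = len(A), len(A[0])
    F = Fraction
    A = [[F(v) for v in row] for row in A]
    b, c = [F(v) for v in b], [F(v) for v in c]
    B = [[A[i][j] for j in basic] for i in range(m)]
    x_bar = solve(B, b)                                # x̄_B = B⁻¹ b
    y, red = {}, {}
    for j in (j for j in range(n) if j not in basic):
        y[j] = solve(B, [A[i][j] for i in range(m)])   # y_j = B⁻¹ a_j
        z_j = sum(c[basic[i]] * y[j][i] for i in range(m))
        red[j] = z_j - c[j]                            # reduced cost z_j - c_j
    return x_bar, y, red
```

For instance, on a toy standard-form problem with the slack columns as the basis, `explicit_system([[1, 1, 1, 0], [1, -1, 0, 1]], [4, 2], [-3, -1, 0, 0], [2, 3])` returns the basic solution together with each $\mathbf{y}_j$ and $z_j - c_j$.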
This leads to the explicit system in its standard form for a basis $\mathbf{B}$:
\[\begin{aligned} \min\;\;\; & \bar{z} - \sum_{j \in \mathcal{J}} (z_j - c_j) x_j \\ \text{s. t.:}\;\;\; & \bar{x}_s - \sum_{j \in \mathcal{J}} y_{sj} x_j \geqslant 0 && \forall s \in \mathcal{I} \\ & x_j \geqslant 0 && \forall j \in \mathcal{J} \end{aligned}\]

The Three Simplex Theorems
The behavior and stopping criterion of the algorithm are based on these three essential theorems (posed for a minimization problem):
First Simplex Theorem (Optimality Criterion)
A feasible basic solution is optimal if for every non-basic variable it holds that:
\[z_j - c_j \leqslant 0\]
Second Simplex Theorem (Unbounded Optimal Solution)
If there exists a non-basic variable $x_k$ such that:
\[z_k - c_k > 0\]and all elements of its associated vector are non-positive ($\mathbf{y}_k \leqslant 0$), then the problem has an unbounded optimal solution.
Third Simplex Theorem (Condition to Improve the Solution)
If for a feasible basic solution there exists a variable $x_k$ such that:
\[z_k - c_k > 0\]and at least one component of $\mathbf{y}_k$ is positive ($\mathbf{y}_k \nleqslant 0$), then there exists a new basis $\mathbf{B}'$ that provides a feasible solution with an objective value less than or equal to the current one.
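The three theorems together classify any feasible basic solution into exactly one of three states. A minimal sketch for a minimization problem (the function name is my own):

```python
def classify(reduced_costs, y_columns):
    """Apply the three Simplex theorems to a feasible basic solution of a
    minimization problem.  `reduced_costs[j]` holds z_j - c_j and
    `y_columns[j]` holds the vector y_j, for each non-basic index j."""
    candidates = [j for j, r in reduced_costs.items() if r > 0]
    if not candidates:
        return "optimal"            # 1st theorem: all z_j - c_j <= 0
    for k in candidates:
        if all(v <= 0 for v in y_columns[k]):
            return "unbounded"      # 2nd theorem: z_k - c_k > 0 and y_k <= 0
    return "improvable"             # 3rd theorem: a better basis exists
```

For example, `classify({1: 5}, {1: [-1, 2]})` reports `"improvable"`, since the positive reduced cost has a positive component in its $\mathbf{y}_k$.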
Iteration Procedure
The Simplex Algorithm iterates by applying two rules: the entering rule, which selects the variable that enters the basis, and the leaving rule, which selects the variable that leaves it.
- Entering rule: the variable $x_k$ enters such that

\[z_k - c_k = \max\{z_j-c_j : z_j - c_j > 0,\; j \in \mathcal{J}\}\]

For a maximization problem, the variable with the most negative reduced cost enters instead:

\[z_k - c_k = \min\{z_j-c_j : z_j - c_j < 0,\; j \in \mathcal{J}\}\]

- Leaving rule: the variable $x_l$ leaves such that

\[\frac{\bar{x}_l}{y_{lk}} = \min \Bigg\{\frac{\bar{x}_s}{y_{sk}} : y_{sk} > 0,\; s \in \mathcal{I}\Bigg\}\]

This ratio test guarantees that the new basic solution remains feasible.

- Basis change: the basis change formulas are based on Gaussian elimination: the row of the leaving variable is divided by the pivot element $y_{lk}$, and suitable multiples of it are subtracted from the remaining rows (including the $z_j - c_j$ row) so that column $k$ becomes a unit vector.
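The basis change is a single Gauss-Jordan pivot on the tableau. A minimal sketch, representing the tableau as a list of rows with the right-hand side in the last position:

```python
from fractions import Fraction

def pivot(T, l, k):
    """Gaussian-elimination basis change on tableau T.
    Row l is the leaving variable's row, column k the entering variable's
    column, so T[l][k] is the pivot element."""
    T = [[Fraction(v) for v in row] for row in T]
    p = T[l][k]
    T[l] = [v / p for v in T[l]]          # normalize the pivot row
    for r in range(len(T)):
        if r != l and T[r][k] != 0:
            f = T[r][k]                   # zero out column k elsewhere
            T[r] = [a - f * b for a, b in zip(T[r], T[l])]
    return T
```

After the pivot, column $k$ of the returned tableau is a unit vector with its 1 in row $l$, which is exactly the basis-change formula stated above.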
Simplex Tableaus
For the manual solution of a linear optimization problem, all of this information can be organized in what are called Simplex tableaus, introduced in (Charnes et al., 1953). These tableaus display the explicit system in terms of a basis $\mathbf{B}$ and have the following form:
\(\left. \begin{array}{cc|cccccccccc|c} & & {\color{gray}c_1} & {\color{gray}\ldots} & {\color{gray}c_l} & {\color{gray}\ldots} & {\color{gray}c_m} & {\color{gray}c_{m+1}} & {\color{gray}\ldots} & {\color{gray}c_k} & {\color{gray}\ldots} & {\color{gray}c_n} & \\ & & x_1 & \ldots & x_l & \ldots & x_m & x_{m+1} & \ldots & x_k & \ldots & x_n & \\ \hline & z_j-c_j & 0 & \ldots & 0 & \ldots & 0 & z_{m+1}-c_{m+1} & \ldots & z_k-c_k & \ldots & z_n-c_n & \bar{z} \\ \hline {\color{gray}c_1} & x_1 & 1 & \ldots & 0 & \ldots & 0 & y_{1m+1} & \ldots & y_{1k} & \ldots & y_{1n} & \bar{x}_1 \\ {\color{gray}\vdots} & \vdots & \vdots & & \vdots & & \vdots & \vdots & & \vdots & & \vdots & \vdots \\ {\color{gray}c_l} & x_l & 0 & \ldots & 1 & \ldots & 0 & y_{lm+1} & \ldots & y_{lk} & \ldots & y_{ln} & \bar{x}_l \\ {\color{gray}\vdots} & \vdots & \vdots & & \vdots & & \vdots & \vdots & & \vdots & & \vdots & \vdots \\ {\color{gray}c_m} & x_m & 0 & \ldots & 0 & \ldots & 1 & y_{mm+1} & \ldots & y_{mk} & \ldots & y_{mn} & \bar{x}_m \\ \hline \end{array} \right.\) with the first $m$ variables being the basic variables.
In the literature, the following variations can be found:
- Row $c_j-z_j$ instead of $z_j-c_j$. In this case, the entering rule must be adapted accordingly.
- The $z_j-c_j$ row may appear at the bottom of the tableau instead of the top.
- The column with the solution and objective value may appear before the variables begin.
Numerical example
Consider the following linear optimization problem:
\[\begin{aligned} \max\;\; & z = 6x_1 + 3x_2 \\ \text{s. t.:}\;\; & 2x_1 + 4x_2 \leqslant 8 \\ & -x_1 + 4x_2 \leqslant 4 \\ & x_1 - x_2 \leqslant 2 \\ & x_1, x_2 \geqslant 0 \end{aligned}\]Its standard form (introducing 3 slack variables) is:
\[\begin{aligned} \max\;\; & z = 6x_1 + 3x_2 \\ \text{s. t.:}\;\; & 2x_1 + 4x_2 + x_3^s = 8 \\ & -x_1 + 4x_2 + x_4^s = 4 \\ & x_1 - x_2 + x_5^s = 2 \\ & x_1, x_2, x_3^s, x_4^s, x_5^s \geqslant 0 \end{aligned}\]The first Simplex tableau is:
\[\begin{array}{rc|rrrrr|r} & & {\color{gray}6} & {\color{gray}3} & {\color{gray}0} & {\color{gray}0} & {\color{gray}0} & \\ & & x_1 & x_2 & x_3^s & x_4^s & x_5^s & \\ \hline & z_j - c_j & -6 & -3 & 0 & 0 & 0 & 0 \\ \hline {\color{gray}0} & x_3^s & 2 & 4 & 1 & 0 & 0 & 8 \\ {\color{gray}0} & x_4^s & -1 & 4 & 0 & 1 & 0 & 4 \\ {\color{gray}0} & x_5^s & {\color{#2698ba}1} & -1 & 0 & 0 & 1 & 2 \\ \hline \end{array}\]where the element highlighted in blue is the pivot element, indicating which variable leaves ($x_5^s$) and which enters ($x_1$), as dictated by the third Simplex theorem.
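The leaving choice can be checked directly with the ratio test on the entering column. A small sketch (the variable names are illustrative):

```python
# Ratio test on the first tableau's x1 column (x1 is the entering variable):
col = {"x3": 2, "x4": -1, "x5": 1}   # y_{s1} for each basic row
rhs = {"x3": 8, "x4": 4, "x5": 2}    # current basic solution x̄_s
ratios = {s: rhs[s] / col[s] for s in col if col[s] > 0}  # skip y_{s1} <= 0
leaving = min(ratios, key=ratios.get)
print(ratios, "->", leaving)          # → {'x3': 4.0, 'x5': 2.0} -> x5
```

The row of $x_4^s$ is excluded because its entry in the column is negative, and the minimum ratio $2/1 = 2$ selects $x_5^s$ as the leaving variable, matching the highlighted pivot.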
The second Simplex tableau is:
\[\begin{array}{rc|rrrrr|r} & & {\color{gray}6} & {\color{gray}3} & {\color{gray}0} & {\color{gray}0} & {\color{gray}0} & \\ & & x_1 & x_2 & x_3^s & x_4^s & x_5^s & \\ \hline & z_j - c_j & 0 & -9 & 0 & 0 & 6 & 12 \\ \hline {\color{gray}0} & x_3^s & 0 & {\color{#2698ba}6} & 1 & 0 & -2 & 4 \\ {\color{gray}0} & x_4^s & 0 & 3 & 0 & 1 & 1 & 6 \\ {\color{gray}6} & x_1 & 1 & -1 & 0 & 0 & 1 & 2 \\ \hline \end{array}\]By continuing to satisfy the conditions of the third Simplex theorem, the entering and leaving rules determine the pivot element (in blue).
The third Simplex tableau is:
\[\begin{array}{rc|rrrrr|r} & & {\color{gray}6} & {\color{gray}3} & {\color{gray}0} & {\color{gray}0} & {\color{gray}0} & \\ & & x_1 & x_2 & x_3^s & x_4^s & x_5^s & \\ \hline & z_j - c_j & 0 & 0 & 3/2 & 0 & 3 & 18 \\ \hline {\color{gray}3} & x_2 & 0 & 1 & 1/6 & 0 & -1/3 & 2/3 \\ {\color{gray}0} & x_4^s & 0 & 0 & -1/2 & 1 & 2 & 4 \\ {\color{gray}6} & x_1 & 1 & 0 & 1/6 & 0 & 2/3 & 8/3 \\ \hline \end{array}\]which now satisfies the conditions of the first Simplex theorem, thus the optimal solution is reached:
\[(\mathbf{x}^*)^\intercal = (8/3, 2/3,0,4,0) \quad z^* = 18\]Graphically, the path followed by the Simplex is shown in the following figure:
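The whole iteration can be reproduced programmatically. A minimal sketch of the tableau method applied to this example, using exact fractions so the entries match the tableaus above:

```python
from fractions import Fraction

# Standard form of the example: max 6x1 + 3x2 with three slack variables.
# Tableau rows hold [coefficients | RHS]; the first row is z_j - c_j | z̄.
F = Fraction
tableau = [
    [F(-6), F(-3), F(0), F(0), F(0), F(0)],   # z_j - c_j row
    [F(2),  F(4),  F(1), F(0), F(0), F(8)],   # x3 slack
    [F(-1), F(4),  F(0), F(1), F(0), F(4)],   # x4 slack
    [F(1),  F(-1), F(0), F(0), F(1), F(2)],   # x5 slack
]
basis = [2, 3, 4]   # column indices of the slack variables

while True:
    z_row = tableau[0]
    # Entering rule (maximization): most negative z_j - c_j.
    k = min(range(5), key=lambda j: z_row[j])
    if z_row[k] >= 0:
        break                                  # 1st theorem: optimal
    # Leaving rule: minimum ratio over rows with y_sk > 0.
    _, l = min((tableau[r][5] / tableau[r][k], r)
               for r in range(1, 4) if tableau[r][k] > 0)
    basis[l - 1] = k
    # Basis change by Gaussian elimination on the pivot element.
    p = tableau[l][k]
    tableau[l] = [v / p for v in tableau[l]]
    for r in range(4):
        if r != l and tableau[r][k] != 0:
            f = tableau[r][k]
            tableau[r] = [a - f * b for a, b in zip(tableau[r], tableau[l])]

x = [F(0)] * 5
for row, j in zip(tableau[1:], basis):
    x[j] = row[5]
print([str(v) for v in x], "z* =", tableau[0][5])
# → ['8/3', '2/3', '0', '4', '0'] z* = 18
```

The two pivots performed by this loop are exactly the ones highlighted in blue in the tableaus above, and the final tableau reproduces the optimal solution $(8/3, 2/3, 0, 4, 0)$ with $z^* = 18$.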
With the Simplex Algorithm, the abstract beauty of mathematics becomes a practical tool, demonstrating that mathematical elegance can have a real impact on everyday life.
If you found this useful, please cite this as:
Martín-Campo, F. Javier (Feb 2026). The Simplex Algorithm, the engine of mathematical optimization. https://www.fjmartincampo.com/blog/2026/simplex/.
or as a BibTeX entry:
@misc{martín-campo2026the-simplex-algorithm-the-engine-of-mathematical-optimization,
title = {The Simplex Algorithm, the engine of mathematical optimization},
author = {Martín-Campo, F. Javier},
year = {2026},
month = {Feb},
url = {https://www.fjmartincampo.com/blog/2026/simplex/}
}
References

- Dantzig, G. B. (1951). Maximization of a linear function of variables subject to linear inequalities. In T. C. Koopmans (Ed.), Activity Analysis of Production and Allocation (pp. 339–347). Wiley.
- Charnes, A., Cooper, W. W., & Henderson, A. (1953). An Introduction to Linear Programming. Wiley.