Duality
Duality is a fascinating concept in linear programming and optimization that provides deep insight into mathematical models and real-world problems. To understand duality, we must first have a basic understanding of linear programming problems. These problems involve optimizing a linear objective function subject to a set of linear inequalities or equations called constraints.
Understanding the root problem
Let's start by considering a linear programming problem, generally known as the "Primal problem". Suppose we want to maximize the profit given by a linear function. The problem can be expressed as:
Maximize: Z = c1*x1 + c2*x2 + ... + cn*xn
Subject to:
a11*x1 + a12*x2 + ... + a1n*xn ≤ b1
a21*x1 + a22*x2 + ... + a2n*xn ≤ b2
...
am1*x1 + am2*x2 + ... + amn*xn ≤ bm
x1, x2, ..., xn ≥ 0
Here, x1, x2, ..., xn are the decision variables, c1 to cn are the coefficients of the objective function, and a11 to amn and b1 to bm are the coefficients of the constraints.
Formulation of the dual problem
Every linear programming problem has a corresponding "dual problem". The key idea of duality is that the dual problem provides a bound on the optimal value of the original (primal) problem. For the primal problem stated above, the dual problem can be expressed as:
Minimize: W = b1*y1 + b2*y2 + ... + bm*ym
Subject to:
a11*y1 + a21*y2 + ... + am1*ym ≥ c1
a12*y1 + a22*y2 + ... + am2*ym ≥ c2
...
a1n*y1 + a2n*y2 + ... + amn*ym ≥ cn
y1, y2, ..., ym ≥ 0
Here, y1, y2, ..., ym are the decision variables of the dual problem. The duality theorem states that if the primal problem has an optimal solution, then the dual problem also has an optimal solution, and the optimal values of their objective functions are equal.
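The theorem can be checked numerically. The sketch below, assuming SciPy's `linprog` solver is available, solves a small hypothetical primal (maximize 3*x1 + 5*x2 subject to x1 ≤ 4, 2*x2 ≤ 12, 3*x1 + 2*x2 ≤ 18) together with its dual, and confirms that the two optimal values coincide:

```python
import numpy as np
from scipy.optimize import linprog

# Primal: maximize 3*x1 + 5*x2  ->  minimize the negated objective
c = np.array([3.0, 5.0])
A = np.array([[1.0, 0.0],   # x1          <= 4
              [0.0, 2.0],   # 2*x2        <= 12
              [3.0, 2.0]])  # 3*x1 + 2*x2 <= 18
b = np.array([4.0, 12.0, 18.0])

primal = linprog(-c, A_ub=A, b_ub=b, bounds=[(0, None)] * 2, method="highs")

# Dual: minimize b^T y  subject to  A^T y >= c, y >= 0.
# linprog only accepts <= constraints, so negate:  -A^T y <= -c
dual = linprog(b, A_ub=-A.T, b_ub=-c, bounds=[(0, None)] * 3, method="highs")

Z_star = -primal.fun  # undo the sign flip used for maximization
W_star = dual.fun
print(Z_star, W_star)  # the two optima coincide (strong duality)
```

Note how the dual is built mechanically from the primal data: the objective coefficients and right-hand sides swap roles, and the constraint matrix is transposed.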
Visual example
Let's use a simple 2-variable system to visually explain the concept of primal and dual problems:
In this example, the shaded area represents the feasible region for a primal problem with two decision variables. The red dashed line represents the objective function we want to maximize. The green dot marks the optimal solution, where the objective function achieves its highest value within the feasible region.
Properties of duality
The concept of duality in linear programming is marked by several important properties:
- Weak duality: For any pair of feasible solutions to the primal and dual problems, the objective value of the dual problem is always greater than or equal to the objective value of the primal problem. Formally:
Z ≤ W
- Strong duality: If both the primal and dual problems have feasible solutions, then both have optimal solutions and the optimal values of their objective functions are equal:
Z* = W*
- Complementary slackness: A pair of feasible primal and dual solutions is optimal if and only if the complementary slackness conditions hold. This means that for every primal constraint, either the constraint is active (holds with equality) or its corresponding dual variable is zero, and symmetrically for the dual constraints.
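As a sketch (the numbers come from a hypothetical LP — maximize 3*x1 + 5*x2 with known optimal solutions, not from the examples below), the conditions can be verified with a few lines of NumPy:

```python
import numpy as np

# Hypothetical primal data and assumed-known optimal solutions
A = np.array([[1.0, 0.0], [0.0, 2.0], [3.0, 2.0]])
b = np.array([4.0, 12.0, 18.0])
c = np.array([3.0, 5.0])
x_opt = np.array([2.0, 6.0])        # optimal primal solution
y_opt = np.array([0.0, 1.5, 1.0])   # optimal dual solution

# Primal slacks: b - A x.  Complementary slackness: y_i * slack_i == 0,
# i.e. a constraint with slack must have a zero dual variable.
primal_slack = b - A @ x_opt
assert np.allclose(y_opt * primal_slack, 0.0)

# Dual slacks: A^T y - c.  Symmetrically: x_j * dual_slack_j == 0.
dual_slack = A.T @ y_opt - c
assert np.allclose(x_opt * dual_slack, 0.0)
```

Here the first primal constraint has slack (x1 = 2 < 4), so its dual variable y1 is zero, exactly as the property requires.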
Examples of duality in practice
Duality in linear programming is not just a theoretical concept; it has practical applications in various fields such as economics, engineering, and logistics. Let us consider some examples to understand how duality can be applied:
Example 1: Resource allocation
In a manufacturing process, we want to determine the optimal quantities of two products, A and B, given constraints on resources. The primal problem is:
Maximize: Profit = 50*xA + 80*xB
Subject to:
2*xA + 4*xB ≤ 100 (Resource 1)
1*xA + 3*xB ≤ 90 (Resource 2)
xA, xB ≥ 0
In this case, the dual problem involves finding the shadow prices of the resources, which indicate how much the objective function would increase if the availability of a particular resource increased by one unit:
Minimize: Cost = 100*y1 + 90*y2
Subject to:
2*y1 + 1*y2 ≥ 50
4*y1 + 3*y2 ≥ 80
y1, y2 ≥ 0
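Assuming SciPy is available, a short sketch confirms strong duality for this pair and recovers the shadow prices:

```python
from scipy.optimize import linprog

# Primal: maximize 50*xA + 80*xB  ->  minimize the negated objective
primal = linprog([-50, -80],
                 A_ub=[[2, 4], [1, 3]], b_ub=[100, 90],
                 bounds=[(0, None)] * 2, method="highs")

# Dual: minimize 100*y1 + 90*y2; the >= constraints are negated into <= form
dual = linprog([100, 90],
               A_ub=[[-2, -1], [-4, -3]], b_ub=[-50, -80],
               bounds=[(0, None)] * 2, method="highs")

print(-primal.fun)  # optimal profit: 2500
print(dual.fun)     # optimal resource cost: 2500 (strong duality)
print(dual.x)       # shadow prices: Resource 1 is worth 25, Resource 2 is worth 0
```

Resource 2 has a shadow price of zero because its constraint is not binding at the optimum (only 50 of the 90 available units are used), so extra units of it add no profit.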
Example 2: Diet problem
Another interesting application is designing diets that meet daily nutritional requirements at minimum cost. Suppose we have the following primal problem:
Minimize: Cost = 3*x1 + 4*x2
Subject to:
3*x1 + 2*x2 ≥ 8 (Protein)
1*x1 + 2*x2 ≥ 6 (Vitamins)
x1, x2 ≥ 0
The dual problem involves finding prices for the nutrients that maximize the value of the requirements while keeping each food's imputed nutrient value at or below its cost:
Maximize: Nutrition = 8*y1 + 6*y2
Subject to:
3*y1 + 1*y2 ≤ 3
2*y1 + 2*y2 ≤ 4
y1, y2 ≥ 0
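The same SciPy sketch applies, with the roles of maximization and minimization reversed:

```python
from scipy.optimize import linprog

# Primal: minimize 3*x1 + 4*x2; the >= constraints are negated into <= form
primal = linprog([3, 4],
                 A_ub=[[-3, -2], [-1, -2]], b_ub=[-8, -6],
                 bounds=[(0, None)] * 2, method="highs")

# Dual: maximize 8*y1 + 6*y2  ->  minimize the negated objective
dual = linprog([-8, -6],
               A_ub=[[3, 1], [2, 2]], b_ub=[3, 4],
               bounds=[(0, None)] * 2, method="highs")

print(primal.fun)  # minimum diet cost: 13
print(-dual.fun)   # maximum nutrient value: 13 (strong duality)
```

The optimal diet buys 1 unit of food 1 and 2.5 units of food 2, and the matching dual prices (0.5 per unit of protein, 1.5 per unit of vitamins) impute exactly that cost to the requirements.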
Geometrical interpretation of duality
The geometric interpretation of duality provides valuable insight into the relationship between the primal and dual problems. In the primal problem, the constraints define a feasible region, and the objective function is a family of parallel lines that can be pushed as far as possible while still touching a feasible point. In the dual problem, the roles are reversed: the dual variables can be read as prices on the constraints, and the solution is the cheapest combination of constraint prices that dominates the objective.
In this sense the primal and dual problems "flip" roles: each constraint of the primal corresponds to a variable of the dual, and each variable of the primal corresponds to a constraint of the dual.
Importance of duality
The concept of duality is important because it provides a way to certify the optimality of a solution without evaluating every possibility: a feasible dual solution whose objective value matches the primal objective value proves optimality. Solving either the primal or the dual problem is sufficient to determine the optimal value. Duality forms the basis of many advanced optimization techniques, including integer programming, network flow algorithms, and more.
Duality also provides economic interpretations for problems. It assigns a value to the constraints, often called the shadow price, which shows how much the objective function would improve if more resources were available.
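Most LP solvers report these shadow prices directly, so the dual problem does not have to be formed by hand. As a sketch using SciPy's HiGHS backend (where, for a minimization in `<=` form, `res.ineqlin.marginals` holds non-positive dual values, so the signs must be flipped for a maximization problem), the resource-allocation example above gives:

```python
from scipy.optimize import linprog

# Resource-allocation primal: maximize 50*xA + 80*xB (negated for linprog)
res = linprog([-50, -80],
              A_ub=[[2, 4], [1, 3]], b_ub=[100, 90],
              bounds=[(0, None)] * 2, method="highs")

# For a maximization solved as a negated minimization, the shadow price of
# each resource is the negated marginal of its inequality constraint.
shadow_prices = [-m for m in res.ineqlin.marginals]
print(shadow_prices)  # one extra unit of Resource 1 is worth 25 in profit
```

These match the dual solution computed explicitly earlier, which is exactly what strong duality promises.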
In summary, duality in linear programming is a profound concept that connects optimization problems in meaningful ways. Understanding the primal-dual relationship can provide economic insight, help solve optimization problems efficiently, and offer theoretical leverage in broader mathematical studies. The beauty of duality lies in its wide applicability and the conceptual framework it provides for deeper problem exploration.