Mathematical tools for intermediate economics classes
Iftekher Hossain

Calculus of Multivariable Functions

Section 5

Use of Partial Derivatives: Optimization of Functions Subject to Constraints

Constrained optimization

Partial derivatives can be used to optimize an objective function of several variables subject to a constraint or a set of constraints, provided the functions are differentiable. Mathematically, a constrained optimization problem requires optimizing a continuously differentiable function \(f(x_{1}, x_{2}, ..., x_{n})\) subject to a set of constraints. The general form of the problem, when the goal is to maximize the objective function, can be written as: $$\text{Maximize } f(x_{1}, x_{2}, ..., x_{n})$$ Subject to the constraints $$g_{i}(x_{1}, x_{2}, ..., x_{n}) = b_{i} \qquad\qquad (\text{where, } i = 1, 2, ..., k)$$ $$h_{j}(x_{1}, x_{2}, ..., x_{n}) \leq c_{j} \qquad\qquad (\text{where, } j = 1, 2, ..., m)$$ where the \(g_{i}\)'s and \(h_{j}\)'s are the constraint functions and the \(b_{i}\)'s and \(c_{j}\)'s are the constraint constants. The constraints in such a problem may comprise equality constraints like the \(g_{i}\)'s, inequality constraints like the \(h_{j}\)'s, or a combination of the two.
In a constrained optimization problem, \(f\) is called the objective function and the \(g_{i}\)'s and \(h_{j}\)'s are the constraint functions. For simplicity, and given the limited scope of this chapter, we discuss only problems with two variables and one equality constraint. For examples with more variables and constraints, see Simon and Blume, Chapter 18.


Two Variables and One Equality Constraint

Given a continuously differentiable function \(f(x,y)\) to optimize, subject to the constraint \(g(x,y) = k\), mathematically we can write the problem as: $$\text{Optimize } f(x,y)$$ $$\text{Subject to } g(x,y) = k$$ where \(k\) is a constant.

Solution using the geometric approach

If an interior minimum or maximum exists for this problem, then geometrically, at the optimum point, the slope of the level curve of the objective function \(f(x,y)\) must equal the slope of the constraint curve \(g(x,y) = k\). Finding the optimal point using the geometric approach is equivalent to applying the following steps:

Step 1: Find the slope of the level curve of the objective function \(f(x,y)\), $$\frac{dy}{dx} = -\frac{f_{x}}{f_{y}}$$ Step 2: Find the slope of the constraint \(g(x,y) = k\), given by \(-\frac{g_{x}}{g_{y}}\)
Step 3: Set \(-\frac{f_{x}}{f_{y}} = -\frac{g_{x}}{g_{y}}\) to find the relation between \(x\) and \(y\); this equality is a necessary condition for the optimal values.
Step 4: Substitute the relation between \(x\) and \(y\) (obtained in step 3) into the constraint \(g(x,y) = k\) to get the critical values.

The geometric approach (and the Lagrange method discussed below) works when an interior optimum exists, as in the typical utility-maximization and cost-minimization problems in economics where the objective function takes the form of a Cobb-Douglas function and the constraint is linear. In this chapter we discuss the commonly used cases in which the solution occurs at an interior point, and set aside the cases where the solution occurs at a corner point or no solution exists at all.
To read more about the geometric solution of constrained optimization, see Simon & Blume, Mathematics for Economists, pp. 413-415.
While the second-order conditions are crucial for checking whether a critical point is a local maximum, a local minimum, or neither, we do not discuss them in this chapter. The books listed at the end of this chapter cover first-order necessary conditions and second-order sufficient conditions in more detail.


Lagrange Function

Suppose the constrained optimization problem is: $$\text{Optimize } \color{red}{f(x,y)}$$ $$\text{Subject to } \color{purple}{g(x,y) = k}$$ A convenient approach to solving this problem is to form the Lagrange function: $$L(x,y,\mu) \equiv \color{red}{f(x,y)} - \mu (\color{purple}{g(x,y) - k})$$ where \(\mu\), the variable that multiplies the constraint, is called the Lagrange multiplier.

To find the critical points using the Lagrange function:
Find the first-order partial derivatives with respect to \(\color{red}{x,y}\), and \(\color{red}{\mu }\) and set each partial derivative equal to zero: $$\frac{\partial L}{\partial x} = 0$$ $$\frac{\partial L}{\partial y} = 0$$ $$\frac{\partial L}{\partial \mu} = 0$$ By solving this simultaneous system of equations, we find the critical points of the function, if any exist.
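The structure of these first-order conditions can be sketched symbolically. The short sympy sketch below (sympy is not part of the original text; the generic symbols `f`, `g`, `mu`, and `k` mirror the notation above) forms the Lagrange function for an arbitrary objective \(f(x,y)\) and constraint \(g(x,y) = k\), and differentiates it to recover the three conditions:

```python
import sympy as sp

x, y, mu, k = sp.symbols('x y mu k')
f = sp.Function('f')(x, y)   # generic objective function
g = sp.Function('g')(x, y)   # generic constraint function

# Lagrange function: L = f - mu*(g - k)
L = f - mu * (g - k)

# First-order conditions: each partial derivative set to zero
foc_x = sp.Eq(sp.diff(L, x), 0)    # f_x - mu*g_x = 0
foc_y = sp.Eq(sp.diff(L, y), 0)    # f_y - mu*g_y = 0
foc_mu = sp.Eq(sp.diff(L, mu), 0)  # k - g = 0, i.e. the constraint itself

print(foc_x)
print(foc_y)
print(foc_mu)
```

Note that dividing the first condition by the second eliminates \(\mu\) and yields \(\frac{f_{x}}{f_{y}} = \frac{g_{x}}{g_{y}}\), the same tangency condition used in the geometric approach.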

By adding an artificial variable \(\mu \), we transform the constrained optimization problem into an unconstrained one. This added variable also has a significant economic interpretation: it measures the marginal impact on the optimal value of a one-unit change in the scarce resource. (To learn more about the Lagrange multiplier, read Dowling, Introduction to Mathematical Economics, Chapter Six.)

Example 1

Using the geometric approach, optimize the function \(f(x,y) = xy\) subject to the constraint \(g(x,y) = x + 4y = 120\).

Step 1: \(-\frac{f_{x}}{f_{y}} = -\frac{y}{x}\)    (Slope of the objective function)
Step 2: \(-\frac{g_{x}}{g_{y}} = -\frac{1}{4}\)     (Slope of the constraint)
Step 3: \(-\frac{f_{x}}{f_{y}} = -\frac{g_{x}}{g_{y}}\)    (Set slope of the objective function = slope of the constraint) $$-\frac{y}{x} = -\frac{1}{4}$$ $$x = 4y$$ Step 4: Substitute the relation between \(x\) and \(y\) obtained in step 3 into the constraint to get the critical values. $$x + 4y = 120$$ $$4y + 4y = 120$$ $$8y = 120$$ $$y = 15$$ Using \(y = 15\) in the relation \(x = 4y\), we get $$x = 4 \cdot 15 = 60$$
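The four steps above can be reproduced symbolically. The following sympy sketch (an illustration, not part of the original text) computes both slopes, equates them, and solves the resulting tangency condition together with the constraint:

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)
f = x * y        # objective function
g = x + 4 * y    # left-hand side of the constraint x + 4y = 120

slope_f = -sp.diff(f, x) / sp.diff(f, y)   # Step 1: -y/x
slope_g = -sp.diff(g, x) / sp.diff(g, y)   # Step 2: -1/4

# Steps 3 and 4: equate the slopes and impose the constraint
sol = sp.solve([sp.Eq(slope_f, slope_g), sp.Eq(g, 120)], [x, y], dict=True)[0]
print(sol)   # {x: 60, y: 15}
```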


Example 2

Using the Lagrangian approach, optimize the function \(f(x,y) = xy\) subject to the constraint \(g(x,y) = x + 4y = 120\).

Form the Lagrange function: $$L(x,y,\mu ) \equiv \color{red}{f(x,y)} - \mu (\color{purple}{g(x,y) - k})$$ $$L(x,y,\mu ) \equiv xy - \mu (x + 4y - 120) \;\;$$ Set each first-order partial derivative equal to zero: $$\frac{\partial L}{\partial x} = y - \mu = 0 \qquad\qquad\qquad \text{(1)}$$ $$\frac{\partial L}{\partial y} = x - 4\mu = 0 \qquad\qquad\quad\;\; \text{(2)}$$ $$\frac{\partial L}{\partial \mu } = -(x + 4y - 120) = 0 \quad\;\; \text{(3)}$$ From equations (1) and (2) we find: $$x = 4y$$ Use \(x = 4y\) in equation (3) to get: $$4y + 4y = 120$$ $$8y = 120$$ $$y = 15$$ $$x = 4y = 60$$



Creative Commons License
UWO Economics Math Resources by Mohammed Iftekher Hossain is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.