Assume $f:\mathbb{R}^k\to\mathbb{R}$ and $g:\mathbb{R}^k\to\mathbb{R}$ are continuously differentiable functions. We want to solve the following constrained minimization problem (CMP):

$$\text{minimize } f(x) \quad \text{subject to } g(x)=0.$$

Let $\nabla u = \left(\frac{\partial u}{\partial x_1}, \ldots, \frac{\partial u}{\partial x_k}\right)$ denote the gradient of a function $u:\mathbb{R}^k\to\mathbb{R}$. If $x^*$ is a solution of the CMP and $\nabla g(x^*) \neq 0$, then there exists $\lambda \in \mathbb{R}$, called the Lagrange multiplier, such that

$$\nabla f(x^*) + \lambda \nabla g(x^*) = 0.$$

## Remark

The theorem yields the following equations:

$$\frac{\partial}{\partial x_j}\bigl(f(x) + \lambda g(x)\bigr)\Big|_{x=x^*} = 0, \quad j = 1, \ldots, k,$$

$$g(x^*) = 0.$$

This gives us $k+1$ equations for the $k+1$ unknowns $x_1^*, \ldots, x_k^*$ and $\lambda$.
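As a concrete illustration of solving these $k+1$ equations, here is a small sketch using SymPy. The particular objective $f(x_1,x_2)=x_1^2+x_2^2$ and constraint $g(x_1,x_2)=x_1+x_2-1$ are assumed example choices, not taken from the text:

```python
import sympy as sp

# Example problem (assumed for illustration):
# minimize f = x1^2 + x2^2  subject to  g = x1 + x2 - 1 = 0
x1, x2, lam = sp.symbols('x1 x2 lam', real=True)
f = x1**2 + x2**2
g = x1 + x2 - 1

# Lagrangian L = f + lam * g; stationarity in x1, x2 plus the
# constraint g = 0 gives k + 1 = 3 equations in 3 unknowns.
L = f + lam * g
equations = [sp.diff(L, v) for v in (x1, x2)] + [g]

solutions = sp.solve(equations, [x1, x2, lam], dict=True)
print(solutions)  # the minimizer is x* = (1/2, 1/2) with lam = -1
```

Here SymPy solves the linear system $2x_1+\lambda=0$, $2x_2+\lambda=0$, $x_1+x_2-1=0$ exactly; for nonlinear $f$ or $g$, `sp.solve` may return several stationary points that then need to be compared.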