Convex sets guarantee that line segments between feasible points remain feasible: for any $x, y \in C$ and $\theta \in [0, 1]$, the convex combination $\theta x + (1-\theta)y$ lies in $C$. Convex functions preserve this property in objective space by lying below their chords: $f(\theta x + (1-\theta)y) \le \theta f(x) + (1-\theta) f(y)$. These properties underwrite the global optimality of local solutions, a key simplification exploited by quadratic programs in Direct Transcription and regularized value updates in NMPC Overview.
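As a concrete illustration of the chord inequality, the short numeric spot-check below samples a few convex combinations; the test function $f(z) = z^\top z$, the random points, and the tolerance are assumptions made purely for illustration.

import numpy as np

def f(z):
    # Hypothetical convex test function f(z) = z^T z.
    return z @ z

rng = np.random.default_rng(0)
x, y = rng.normal(size=2), rng.normal(size=2)
for theta in np.linspace(0.0, 1.0, 5):
    # Chord inequality: f(theta*x + (1-theta)*y) <= theta*f(x) + (1-theta)*f(y).
    lhs = f(theta * x + (1 - theta) * y)
    rhs = theta * f(x) + (1 - theta) * f(y)
    assert lhs <= rhs + 1e-12
print("Chord inequality holds at the sampled points.")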
Convexity can be certified via operations that preserve it: intersections of convex sets remain convex, as do nonnegative weighted sums and pointwise maxima of convex functions. Differential tests provide quick checks: a twice differentiable function with positive semidefinite Hessian on a convex domain is convex, and a positive definite Hessian implies strict convexity and hence at most one minimizer. Jensen's inequality and supporting hyperplanes let us reason about averages and subgradients when no Hessian exists, as with the absolute-value penalties used for soft constraints in Constraints and Softening in MPC.
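The CVXPY snippet below puts these composition rules to work: the objective adds a norm to a hinge-style penalty, both convex atoms, under an affine equality constraint, so disciplined convex programming can certify the whole problem as convex before a solver is ever called.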
import cvxpy as cp

# Decision variable in R^2.
x = cp.Variable(2)
# Convex objective: a norm plus a hinge penalty; a sum of convex atoms
# is convex by the composition rules above.
objective = cp.Minimize(cp.norm2(x) + cp.maximum(x[0] - 1.0, 0))
# Affine equality constraint keeps the feasible set convex.
constraints = [cp.sum(x) == 1]
problem = cp.Problem(objective, constraints)
problem.solve()
print("Status:", problem.status)
print("Solution:", x.value)
Convexity is powerful because it yields a zero duality gap under mild conditions such as Slater's condition and supplies the separation theorems used to derive the KKT conditions. When models deviate from convex assumptions, we can often isolate a convex core and handle the nonlinear remnants via successive linearization, as in Policy Gradient Methods or the trust-region adjustments in Line Search Methods. Understanding these transitions is essential when comparing convex relaxations to the iterative nonlinear solvers inside Kalman Filter Essentials and Direct Transcription.
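To sketch the convex-core-plus-linearization idea, the loop below keeps a convex quadratic objective and repeatedly linearizes a nonconvex keep-out constraint $\|x\|_2 \ge 1$ about the current iterate, solving each convex subproblem with CVXPY; the target point, starting guess, constraint, and iteration count are hypothetical choices for illustration, not a production successive-convexification scheme.

import numpy as np
import cvxpy as cp

target = np.array([0.3, 0.2])   # hypothetical point inside the excluded unit disc
x_k = np.array([1.0, 0.0])      # feasible starting guess on the disc boundary

for _ in range(10):
    x = cp.Variable(2)
    # Convex core: quadratic tracking objective.
    objective = cp.Minimize(cp.sum_squares(x - target))
    # Nonconvex constraint 1 - ||x||^2 <= 0, linearized about x_k;
    # because 1 - ||x||^2 is concave, the linearized constraint is conservative.
    linearized = 1 - x_k @ x_k - 2 * x_k @ (x - x_k)
    problem = cp.Problem(objective, [linearized <= 0])
    problem.solve()
    if np.linalg.norm(x.value - x_k) < 1e-6:
        break
    x_k = x.value

print("Approximate solution:", x_k)

Each subproblem is a small quadratic program, so the convex machinery above carries over unchanged; only the linearization point is updated between solves.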