The simplest goal of general nonlinear optimization is to find a local minimum of a differentiable function. Most general nonlinear optimization methods are based on relaxation, i.e. generating a decreasing sequence $\{ f(x_k) \}_{k = 0}^\infty$ such that
$$
f(x_{k+1}) \leq f(x_k), \quad k = 0, 1, \ldots.
$$
To implement the idea of relaxation, we employ another fundamental principle: approximation. Usually, we apply a local approximation built from the derivatives of the nonlinear function. In this section, local methods are discussed in a general setting, without convexity assumptions.
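As a concrete instance of both principles, the first-order local model $f(y) \approx f(x_k) + \langle \nabla f(x_k), y - x_k \rangle$ suggests stepping along the negative gradient, with a backtracking line search to enforce the relaxation condition $f(x_{k+1}) \leq f(x_k)$. Below is a minimal sketch of such a scheme; the function `gradient_descent`, the test quadratic, and the parameters `alpha` and `beta` are illustrative choices, not taken from the text.

```python
import numpy as np

def gradient_descent(f, grad_f, x0, max_iter=100, tol=1e-8,
                     alpha=1e-4, beta=0.5):
    """Relaxation via the first-order local model: step along -grad_f(x_k)
    and backtrack until the Armijo sufficient-decrease condition holds,
    so f(x_{k+1}) <= f(x_k) at every iteration."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:   # approximate stationary point reached
            break
        t = 1.0
        # Backtracking line search: shrink the step size t until
        # f(x - t*g) <= f(x) - alpha * t * ||g||^2 (Armijo condition).
        while f(x - t * g) > f(x) - alpha * t * g.dot(g):
            t *= beta
        x = x - t * g                 # relaxation step: f strictly decreases
    return x

# Hypothetical test problem: a positive definite quadratic
# f(x) = x^T A x / 2 - b^T x, whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad_f = lambda x: A @ x - b
x_star = gradient_descent(f, grad_f, x0=np.zeros(2))
```

The inner backtracking loop guarantees sufficient decrease at every accepted step, so the generated sequence $\{ f(x_k) \}$ is monotonically decreasing by construction.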