Both linear and nonlinear regression estimate the parameters by minimizing the sum of the squared residuals (SSE). However, they use very different approaches. For linear regression, Minitab derives the minimum SSE mathematically by solving a system of linear equations (the normal equations), so the solution is direct and exact. Once you choose the model, there are no further choices: if you fit the same model to the same data, you obtain the same results.
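In matrix notation, the direct solution for the linear case follows from the normal equations. The sketch below uses standard symbols that are not defined in the text above: y for the response vector, X for the design matrix, and β for the coefficient vector.

```latex
\text{SSE}(\beta) = (y - X\beta)^\top (y - X\beta)
\quad\Longrightarrow\quad
X^\top X\,\hat\beta = X^\top y
\quad\Longrightarrow\quad
\hat\beta = (X^\top X)^{-1} X^\top y
```

Because this solution is available in closed form, no starting values or iterations are involved.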
However, for nonlinear regression, there is no direct solution that minimizes the SSE. Instead, an iterative algorithm estimates the parameters by systematically adjusting them to reduce the SSE. After you decide on the model, you choose the algorithm and supply a starting value for each parameter. The algorithm uses these starting values to calculate the initial SSE.
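As a minimal sketch of this first step, the Python snippet below evaluates the SSE at user-supplied starting values. The exponential-rise expectation function, the data, and the starting values are hypothetical and chosen only for illustration; they are not taken from Minitab.

```python
import numpy as np

# Hypothetical expectation function: theta1 * (1 - exp(-theta2 * x)).
# The model, data, and starting values below are illustrative only.
def expectation(x, theta1, theta2):
    return theta1 * (1.0 - np.exp(-theta2 * x))

x = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
y = np.array([4.1, 7.8, 12.3, 15.9, 17.6])

# Starting values supplied by the analyst.
start = np.array([20.0, 0.1])

# The algorithm begins by evaluating the SSE at the starting values.
residuals = y - expectation(x, *start)
initial_sse = np.sum(residuals ** 2)
print(f"Initial SSE at starting values {start}: {initial_sse:.4f}")
```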
At each iteration, the algorithm adjusts the parameter estimates in a direction that it predicts will reduce the SSE relative to the previous iteration. Different algorithms determine these adjustments in different ways. The iterations continue until the algorithm converges on the minimum SSE, a problem prevents the next iteration, or Minitab reaches the maximum number of iterations. If the algorithm fails to converge, you can try different starting values and/or the other algorithm.
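The iterative loop itself can be sketched with SciPy's general-purpose least_squares routine, used here only as a stand-in for Minitab's algorithms, not as its implementation. In SciPy, method="lm" is a Levenberg-Marquardt-type method and method="trf" is a trust-region alternative; the model and data are the same hypothetical ones as above.

```python
import numpy as np
from scipy.optimize import least_squares

# Same hypothetical model as above: theta1 * (1 - exp(-theta2 * x)).
def residuals(theta, x, y):
    return y - theta[0] * (1.0 - np.exp(-theta[1] * x))

x = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
y = np.array([4.1, 7.8, 12.3, 15.9, 17.6])
start = np.array([20.0, 0.1])

# max_nfev caps the number of function evaluations, analogous to a
# maximum-iterations setting; .success reports whether the run converged.
for method in ("lm", "trf"):
    fit = least_squares(residuals, start, args=(x, y),
                        method=method, max_nfev=200)
    sse = 2.0 * fit.cost  # .cost is 0.5 * the sum of squared residuals
    print(f"{method}: converged={fit.success}, "
          f"estimates={fit.x.round(4)}, SSE={sse:.4f}")
```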
For some expectation functions and data sets, the starting values can significantly affect the results. Certain starting values may lead to failure to converge or convergence to a local, rather than global, SSE minimum. In some cases, you may need to expend considerable effort to develop good starting values.
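One practical way to check for sensitivity to the starting values is to refit the same model from several different starts and compare the final SSE values. The sketch below does this with the same SciPy routine; the sinusoidal model, simulated data, and starting values are contrived for illustration and are not from the original text.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical model with a nonlinear frequency parameter, which is
# prone to local minima when the starting frequency is far off.
def residuals(theta, x, y):
    return y - theta[0] * np.sin(theta[1] * x)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 3.0 * np.sin(1.5 * x) + rng.normal(scale=0.2, size=x.size)

# Fit the same model from several starting values and compare the
# final SSE; a run that ends at a clearly larger SSE has likely
# stopped at a local rather than the global minimum.
for start in ([3.0, 1.4], [3.0, 0.3], [0.5, 5.0]):
    fit = least_squares(residuals, start, args=(x, y), method="trf")
    print(f"start={start}: SSE={2.0 * fit.cost:.3f}, "
          f"estimates={fit.x.round(3)}")
```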
See [4] for additional details about the algorithms and for practical suggestions about the starting values.