We propose a derivative-free method for nonlinear least-squares problems. It is a Gauss-Newton-like algorithm in which simplex derivatives replace the Jacobian of the residual function. Thus, whereas the Gauss-Newton algorithm approximates F(x) by its first-order Taylor polynomial at the current iterate, our approach uses an affine function that interpolates F(x) at n+1 previously chosen points. We analyze the best choice for these points. To obtain convergence results, we apply the nonmonotone line-search technique introduced by Diniz-Ehrhardt, Martínez and Raydan in 2008. Numerical experiments are presented.
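The core idea can be sketched as follows: from n+1 interpolation points, the affine model determines a simplex Jacobian J satisfying J(y_i - y_0) = F(y_i) - F(y_0), which then replaces the true Jacobian in the Gauss-Newton step. The snippet below is a minimal, illustrative sketch (not the paper's implementation); all function names are hypothetical and NumPy is assumed.

```python
import numpy as np

def simplex_jacobian(points, values):
    """Simplex (affine-interpolation) Jacobian from n+1 points.

    points : (n+1, n) array of interpolation points y_0, ..., y_n
    values : (n+1, m) array of residuals F(y_0), ..., F(y_n)
    Returns the m x n matrix J with J @ (y_i - y_0) = F(y_i) - F(y_0).
    """
    dY = (points[1:] - points[0]).T   # n x n matrix of point differences
    dF = (values[1:] - values[0]).T   # m x n matrix of residual differences
    # J @ dY = dF  =>  J = dF @ inv(dY); dY must be nonsingular,
    # which is why the choice of interpolation points matters.
    return dF @ np.linalg.inv(dY)

def gauss_newton_step(J, Fx):
    """Gauss-Newton direction: least-squares solution of J d = -F(x)."""
    d, *_ = np.linalg.lstsq(J, -Fx, rcond=None)
    return d

# Demo on a linear residual F(x) = A x - b, where the simplex
# Jacobian recovers A exactly and one step reaches the solution.
A = np.array([[2.0, 0.0], [0.0, 3.0], [1.0, 1.0]])
b = np.array([2.0, 3.0, 2.0])
F = lambda x: A @ x - b

x0 = np.zeros(2)
pts = np.vstack([x0, x0 + np.eye(2)])       # n+1 = 3 points
vals = np.array([F(p) for p in pts])
J = simplex_jacobian(pts, vals)
d = gauss_newton_step(J, F(x0))
```

In the full algorithm, the step d would be accepted or shortened via the nonmonotone line search cited above, and the interpolation set updated at each iteration.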