Assuming that the measured coordinates of the fringes of an interferogram have random errors that can be considered Gaussian, the system of normal equations obtained by applying the least-squares method becomes nonlinear. We present an algorithm that estimates the coefficients of this nonlinear system by applying the Newton–Raphson method, starting the iteration from the standard classical solution. The algorithm is applied to a pattern of straight, equally spaced fringes, yielding not only the correct coefficients but also an appropriate selection of the terms to include in the model, in contrast with the results of the classical method.
© 1998 Optical Society of America
Alberto Cordero-Dávila, Octavio Cardona-Nuñez, and Alejandro Cornejo-Rodríguez, "Polynomial Fitting of Interferograms with Gaussian Errors on Fringe Coordinates. III. Nonlinear Solution," Appl. Opt. 37, 7983-7987 (1998)
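The approach summarized in the abstract can be sketched in miniature. The example below is an illustration only, not the authors' implementation: it assumes a simple straight-fringe model m = a + b·x (fringe order m, coordinate x), places Gaussian errors on the measured coordinates so that the merit function acquires a 1/b² factor and becomes nonlinear, and iterates Newton–Raphson on its gradient starting from the classical linear least-squares solution. All numerical values are invented for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate straight, equally spaced fringes: order m occurs at x = (m - a)/b,
# with Gaussian noise of width sigma on the measured x coordinates.
a_true, b_true, sigma = 0.3, 2.0, 0.01
m = np.arange(1, 21, dtype=float)
x = (m - a_true) / b_true + rng.normal(0.0, sigma, m.size)

# Classical linear least squares on m = a + b x (errors treated as if on m):
# this plays the role of the starting point for the nonlinear iteration.
A = np.column_stack([np.ones_like(x), x])
a0, b0 = np.linalg.lstsq(A, m, rcond=None)[0]

def chi2(p):
    # With Gaussian errors on x, the residual expressed in x is
    # (m - a - b x)/b, so the merit function carries a 1/b^2 factor
    # and its normal equations are nonlinear in b.
    a, b = p
    return np.sum((m - a - b * x) ** 2) / b ** 2

def grad_hess(f, p, h=1e-5):
    # Central finite-difference gradient and Hessian of f at p.
    n = p.size
    g = np.zeros(n)
    H = np.zeros((n, n))
    for i in range(n):
        e = np.zeros(n); e[i] = h
        g[i] = (f(p + e) - f(p - e)) / (2 * h)
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = h
            ej = np.zeros(n); ej[j] = h
            H[i, j] = (f(p + ei + ej) - f(p + ei - ej)
                       - f(p - ei + ej) + f(p - ei - ej)) / (4 * h * h)
    return g, H

# Newton-Raphson on the gradient of chi2, starting from the linear solution.
p = np.array([a0, b0])
for _ in range(20):
    g, H = grad_hess(chi2, p)
    step = np.linalg.solve(H, g)
    p -= step
    if np.max(np.abs(step)) < 1e-9:
        break

print("linear start :", a0, b0)
print("nonlinear fit:", p[0], p[1])
```

Because the starting point is already the classical solution, the iteration here converges in a few steps; the paper's contribution concerns the full polynomial model and the selection of its terms, which this two-parameter sketch does not attempt to reproduce.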