If a Gauß-Newton iteration is used to solve a system of equations that has a manifold of solutions, then the iteration does not produce the minimal norm solution; the limit of the iteration depends on the starting point. This paper introduces a modified Gauß-Newton method that is designed to keep the nonunique part of the solution small in an appropriate sense. The iteration is analyzed, and its behavior is discussed along with two computational examples, including the iteration's application to general integration methods for differential algebraic equations.
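The dependence on the starting point can be seen already on a toy underdetermined system (this example is not from the paper; it is a minimal sketch using the standard pseudoinverse Gauss-Newton step, with an invented residual f(x) = x0 + x1 - 1 whose solution set is a line in the plane):

```python
import numpy as np

# Hypothetical underdetermined system with a manifold of solutions:
# f(x) = x0 + x1 - 1 = 0, a line in R^2. Its minimal norm solution is (0.5, 0.5).
def f(x):
    return np.array([x[0] + x[1] - 1.0])

def jac(x):
    return np.array([[1.0, 1.0]])

def gauss_newton(x0, tol=1e-12, max_iter=50):
    """Plain Gauss-Newton iteration using the Moore-Penrose pseudoinverse step."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = f(x)
        if np.linalg.norm(r) < tol:
            break
        x = x - np.linalg.pinv(jac(x)) @ r
    return x

a = gauss_newton([0.0, 0.0])  # lands on the minimal norm solution (0.5, 0.5)
b = gauss_newton([3.0, 0.0])  # lands on a different solution, (2.0, -1.0)
```

Both limits lie on the solution manifold, but only the start at the origin happens to yield the minimal norm point; the second start converges to a solution of larger norm, which is the behavior the modified method is designed to control.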