We present an extension to unsupervised kernel regression (UKR), a recent method for learning nonlinear manifolds, which can use leave-one-out cross-validation as an automatic complexity control at no additional computational cost. Our extension incorporates general cost functions, allowing the UKR algorithm to be made more robust or tuned to specific noise models. We focus on Huber's loss and on the
ε-insensitive loss, which we present together with a practical optimization approach. We demonstrate our method on both toy and real data.
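For illustration, the two robust loss functions named above can be sketched as follows; this is a minimal reference implementation, not the paper's code, and the function names and default parameters (`delta`, `eps`) are our own choices:

```python
def huber_loss(r, delta=1.0):
    """Huber's loss: quadratic for residuals |r| <= delta, linear beyond.

    The linear tails make the loss less sensitive to outliers than
    a pure squared error.
    """
    a = abs(r)
    if a <= delta:
        return 0.5 * a * a
    return delta * (a - 0.5 * delta)


def eps_insensitive_loss(r, eps=0.1):
    """Epsilon-insensitive loss: zero inside the eps-tube, linear outside.

    Residuals smaller than eps incur no penalty at all, which tolerates
    small noise around the manifold.
    """
    return max(0.0, abs(r) - eps)
```

Both losses grow only linearly for large residuals, which is what makes the resulting regression robust against outliers compared with the standard quadratic loss.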