Abstract: We propose a non-parametric regression methodology that enforces the regressor to be fully consistent with the sample set and the ground-truth regularity assumptions. In contrast to the Nonlinear Set Membership technique, this constraint guarantees everywhere-differentiable surrogate models, which are better suited to optimization-based controllers that rely heavily on gradient computations. The presented approach, named Smooth Lipschitz Regression (SLR), provides bounds on the prediction error at unseen points in the space. A numerical example shows the effectiveness of this method compared to the alternatives in a Model Predictive Control setting.
Keywords: Safe learning, error bounds, nonlinear set membership, non-parametric regression, model predictive control