Abstract: We derive two oracle inequalities for regularized boosting algorithms for classification. The first oracle inequality generalizes and refines a result from Blanchard et al. (2003), while the second oracle inequality leads to faster learning rates than those of Blanchard et al. (2003) whenever the set of weak learners does not perfectly approximate the target function. The techniques leading to the second oracle inequality are based on the well-known approach of adding artificial noise to the labeling process.