Journal: Electronic Colloquium on Computational Complexity
Print ISSN: 1433-8092
Year: 2000
Volume: 2000
Publisher: Universität Trier, Lehrstuhl für Theoretische Informatik
Abstract: The majority of results in computational learning theory
are concerned with concept learning, i.e., with the special
case of function learning for classes of functions
with range {0,1}. Much less is known about the theory of
learning functions with a larger range, such as N or R.
In particular, relatively few results exist about the general
structure of common models for function learning, and only very
few nontrivial function classes are known for which positive
learning results have been exhibited in any of these models.
In this paper we introduce the notion of a binary branching
adversary tree for function learning, which allows us to give
a somewhat surprising equivalent characterization of the optimal
cost for learning a class of real-valued functions (in terms of
a max-min definition that does not involve any "learning" model).
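To indicate only the shape of such a characterization (the display
below is a schematic with placeholder notation, not the precise
definition from the paper), the point is that a quantity defined by
quantifying over learning algorithms coincides with a max-min
quantity defined from binary branching adversary trees alone:

\[
  \min_{\text{learning algorithm}}\ \max_{\text{adversary}}\ (\text{loss incurred})
  \;=\;
  \max_{\text{adversary tree } T \text{ for } \mathcal{F}}\ (\text{cost forced by } T).
\]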
Another general structural result of this paper relates the
cost for learning a union of
function classes to the learning costs for the
individual function classes.
Furthermore, we exhibit an efficient algorithm for learning
convex piecewise linear functions from R^d into R.
Previously, the class of linear functions from R^d into R
was the only class of functions with a multi-dimensional
domain that was known to be learnable within the rigorous
framework of a formal model for on-line learning.
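For reference, the following minimal Python sketch illustrates only the
target class from the preceding paragraph, namely that a convex piecewise
linear function from R^d into R can be written as the pointwise maximum of
finitely many affine functions; it is not the learning algorithm of the
paper, and all names in it are illustrative.

from typing import List, Tuple

def convex_pwl(pieces: List[Tuple[List[float], float]]):
    """Return f with f(x) = max_i (w_i . x + b_i) for the affine pieces (w_i, b_i)."""
    def f(x: List[float]) -> float:
        return max(sum(wi * xi for wi, xi in zip(w, x)) + b for w, b in pieces)
    return f

# Example: f(x1, x2) = max(x1 + x2, 2*x1 - 1, 0.5)
f = convex_pwl([([1.0, 1.0], 0.0), ([2.0, 0.0], -1.0), ([0.0, 0.0], 0.5)])
print(f([1.0, 2.0]))  # prints 3.0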
Finally, we give a sufficient condition on an arbitrary class F
of functions from R into R that allows us to learn the class of
all functions that can be written as the pointwise maximum of
k functions from F. This allows us to exhibit a number of further
nontrivial classes of functions from R into R for which efficient
learning algorithms exist.
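Similarly, the pointwise-maximum construction in the last paragraph can be
sketched as follows (the base functions are arbitrary placeholders; this is
neither the paper's sufficient condition on F nor its learning algorithm):

def pointwise_max(fs):
    """Return g with g(x) = max(f(x) for f in fs)."""
    return lambda x: max(f(x) for f in fs)

# Example with k = 3 placeholder base functions from R into R
g = pointwise_max([lambda x: x, lambda x: 2.0 - x, lambda x: 0.5 * x + 1.0])
print(g(0.0), g(1.0), g(3.0))  # prints 2.0 1.5 3.0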