Valiant introduced a computational model of learning from examples and gave a precise definition of polynomial-time learnability based on this model. Since then, much effort has been devoted to characterizing the classes of concepts that are learnable in this model. Among such learnable classes is the one, denoted monotone k-term DNF, consisting of monotone disjunctive normal form formulae with at most k terms. So far it has been shown [6], [8] that, for fixed k, monotone k-term DNF is learnable under the assumption that positive examples are drawn according to the uniform distribution. In this paper we introduce a class of probability distributions, called smooth distributions, which generalizes all the distribution classes that have appeared in the literature for distribution-specific learning: a smooth distribution is one in which the ratio of the probabilities of any two examples at Hamming distance 1 is bounded from below by the inverse of some polynomial. We prove that monotone k-term DNF is learnable even when positive examples are drawn according to a smooth distribution. From this result, the learnability of monotone k-term DNF under the specific distributions dealt with in the literature, such as product distributions [7] and q-bounded distributions [2], follows.
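
A minimal formalization of the smoothness condition stated above, assuming examples are strings in $\{0,1\}^n$; the symbols $D$ (the example distribution), $p$ (a polynomial), and $d_H$ (Hamming distance) are introduced here only for illustration and are not taken from the paper's notation:

$$\frac{D(x)}{D(y)} \;\ge\; \frac{1}{p(n)} \qquad \text{for all } x, y \in \{0,1\}^n \text{ with } d_H(x,y) = 1.$$

Under this reading, the uniform distribution is trivially smooth (the ratio is always 1), which is consistent with smooth distributions generalizing the distribution classes mentioned above.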