Abstract: The entropy of a finite probability space or, equivalently, of a memoryless source is the average information content of a single event. The fact that entropy is an expectation suggests that, in certain applications, it may be important to take higher moments of information into account, together with parameters derived from them such as the variance or skewness. In this paper we initiate a study of the higher moments of information for sources without memory and for sources with memory. We derive properties of these moments for information defined in the sense of Shannon and indicate how these considerations can be extended to the concepts of information in the sense of Aczél or Rényi. For memoryless sources, these concepts are immediately supported by the usual definitions of moments; for general stationary sources, let alone arbitrary sources, no comparable framework seems to exist. The special properties of stationary Markov sources, on the other hand, suggest definitions that are both well motivated and mathematically meaningful.
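To fix ideas, here is the standard formulation underlying the abstract (the notation $I$ and $M_k$ is ours, chosen for illustration, and need not match the paper's): for a discrete random variable $X$ with distribution $p$, the self-information of an outcome $x$ is $I(x) = -\log p(x)$, entropy is its expectation, and the $k$-th moment of information generalizes this:
\[
H(X) \;=\; \mathbb{E}\bigl[I(X)\bigr] \;=\; -\sum_{x} p(x)\log p(x),
\qquad
M_k(X) \;=\; \mathbb{E}\bigl[I(X)^k\bigr] \;=\; \sum_{x} p(x)\bigl(-\log p(x)\bigr)^k .
\]
In particular, $M_1(X) = H(X)$, and the variance of information mentioned above is $M_2(X) - H(X)^2$.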