Measuring the uncertainty associated with a random variable, which may represent
the lifetime of an item or a human being, is a task of great and growing interest.
Since the pioneering work of Shannon, in which the concept of Shannon entropy
was defined as the average level of information or uncertainty related to a random
event, several measures of uncertainty with different purposes have been defined
and studied. Among them, the cumulative entropies offer several advantages,
since they can be applied without restrictive assumptions. In this talk, relations between
some kinds of cumulative entropies and moments of order statistics are
established. Then, lower and upper bounds for entropies are obtained and
illustrative examples are given. Using the relations with the moments of order
statistics, a method for computing an estimate of the cumulative entropies is shown,
and an application to testing whether data are exponentially distributed is outlined.
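As an illustration of the kind of estimate the abstract refers to, the following is a minimal sketch of the standard plug-in estimator of the cumulative entropy CE(X) = -∫ F(x) log F(x) dx, obtained by replacing F with the empirical distribution function. This particular estimator and the function name are assumptions for illustration, not necessarily the method presented in the talk.

```python
import numpy as np

def empirical_cumulative_entropy(sample):
    """Plug-in estimate of the cumulative entropy
    CE(X) = -integral of F(x) * log F(x) dx,
    with F replaced by the empirical CDF.

    For the order statistics x_(1) <= ... <= x_(n), the empirical CDF
    equals i/n on [x_(i), x_(i+1)), which yields
        CE_n = -sum_{i=1}^{n-1} (x_(i+1) - x_(i)) * (i/n) * log(i/n).
    """
    x = np.sort(np.asarray(sample, dtype=float))
    n = len(x)
    u = np.arange(1, n) / n   # empirical CDF levels i/n, i = 1..n-1
    gaps = np.diff(x)         # spacings x_(i+1) - x_(i)
    return -np.sum(gaps * u * np.log(u))

# Example: for a standard exponential sample the estimate should
# stabilise near the true cumulative entropy pi^2/6 - 1 ~ 0.645.
rng = np.random.default_rng(0)
print(empirical_cumulative_entropy(rng.exponential(size=10_000)))
```

Comparing such an estimate against the value implied by an exponential model is one natural route to the exponentiality test mentioned above.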