In information theory, the conditional entropy quantifies the amount of information needed to describe the outcome of a random variable Y given that the value of another random variable X is known. It is written H(Y | X), and when the logarithm is taken in base 2 it is measured in bits.
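For concreteness, the quantity can be computed from a joint distribution via H(Y | X) = -Σ p(x, y) log₂( p(x, y) / p(x) ). The sketch below illustrates this under assumed names: `conditional_entropy` and the dict-based joint distribution are hypothetical conventions chosen for the example, not part of any standard API.

```python
import math

def conditional_entropy(joint):
    """H(Y|X) in bits, from a joint distribution given as a
    dict mapping (x, y) pairs to probabilities (hypothetical helper)."""
    # Marginal p(x) obtained by summing the joint over y.
    px = {}
    for (x, _), p in joint.items():
        px[x] = px.get(x, 0.0) + p
    # H(Y|X) = -sum over (x, y) of p(x,y) * log2( p(x,y) / p(x) ).
    h = 0.0
    for (x, _), p in joint.items():
        if p > 0:
            h -= p * math.log2(p / px[x])
    return h

# Two independent fair bits: knowing X tells us nothing about Y,
# so the conditional entropy equals the full entropy of Y, 1 bit.
joint = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
print(conditional_entropy(joint))  # → 1.0
```

If X instead determined Y completely (e.g. Y = X), every term p(x, y) / p(x) would equal 1 and the conditional entropy would be 0, matching the intuition that no extra information is needed.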