This is the final entropy law from the fertile mind and mathematics of Claude Shannon of MIT and Bell Labs, who defined information as unexpected bits. (Predictable bits convey no information and carry no entropy.) Information entropy is measured by surprisal.
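Shannon's idea can be made concrete with a small numerical sketch: the surprisal of an event with probability p is −log₂ p bits, and entropy is the average surprisal over a distribution. The function names below are illustrative, not from any particular library.

```python
import math

def surprisal(p):
    """Surprisal (self-information) of an event with probability p, in bits."""
    return -math.log2(p)

def entropy(probs):
    """Shannon entropy: the expected surprisal over a distribution, in bits."""
    return sum(p * surprisal(p) for p in probs if p > 0)

# A certain, fully predictable event carries no information: surprisal(1.0) is 0 bits.
# A fair coin flip carries exactly one bit: surprisal(0.5) is 1 bit.

print(entropy([0.5, 0.5]))    # fair coin: 1.0 bit
print(entropy([0.99, 0.01]))  # heavily biased coin: about 0.08 bits
```

The biased coin illustrates the point in the text: the more predictable the source, the less information each outcome conveys, and the lower the entropy.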