Information Entropy
Definition of Information Entropy
- The average rate at which information is produced by a stochastic source of data, or
- The average amount of information conveyed by an event, when considering all possible outcomes.
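The definitions above can be made concrete with Shannon's formula H(X) = -Σ p(x) log₂ p(x). A minimal sketch in Python (the function name `entropy` and the example distributions are illustrative, not from the source):

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    # Zero-probability outcomes contribute nothing: lim p->0 of p*log p = 0.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin conveys 1 bit of information per toss on average.
print(entropy([0.5, 0.5]))  # 1.0
# A biased coin conveys less on average, since one outcome is more predictable.
print(entropy([0.9, 0.1]))
```

Note that entropy is maximized by the uniform distribution: the more predictable the source, the less information each event carries on average.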
- Conditional entropy: the average uncertainty remaining in one variable given another, computed with the conditional probability p(y|x): H(Y|X) = -Σ p(x, y) log₂ p(y|x).
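The conditional-entropy definition can likewise be sketched in code. This illustrative function (the dict-of-joint-probabilities representation is an assumption for the example) computes H(Y|X) = -Σ p(x, y) log₂ p(y|x), recovering p(y|x) from the joint and marginal distributions:

```python
import math

def conditional_entropy(joint):
    """H(Y|X) in bits, given joint probabilities as a dict {(x, y): p(x, y)}."""
    # Marginal p(x), summing the joint over y.
    px = {}
    for (x, _), p in joint.items():
        px[x] = px.get(x, 0.0) + p
    # H(Y|X) = -sum over (x, y) of p(x, y) * log2 p(y|x), with p(y|x) = p(x, y) / p(x).
    return -sum(p * math.log2(p / px[x])
                for (x, _), p in joint.items() if p > 0)

# Independent fair bits: knowing X tells us nothing, so H(Y|X) = H(Y) = 1 bit.
joint = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
print(conditional_entropy(joint))  # 1.0
```

In the opposite extreme, if Y is fully determined by X (e.g. Y = X), H(Y|X) = 0: the conditional probabilities are all 1, and no uncertainty remains.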