If P(E) is the probability of an event E, its information content is defined as I(E) = −log2(P(E)). Lower-probability events therefore carry higher information content when they occur. The probability of a raga note, and hence its information content, depends on the raga concerned. The important notes of a raga will naturally have higher probabilities. On the other hand, a weak note in a raga cannot simply be discarded, for it carries more surprise when it does occur. Herein lies the strength of entropy analysis (entropy being the mean information content of a random variable).
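The definitions above can be sketched in a few lines of Python. The note probabilities below are purely illustrative, not taken from any actual raga data; only the formulas I(E) = −log2(P(E)) and H = Σ p·I(p) come from the text.

```python
import math

def information_content(p):
    """I(E) = -log2(P(E)): rarer events carry more information."""
    return -math.log2(p)

def entropy(probs):
    """Mean information content; zero-probability events contribute nothing."""
    return sum(p * information_content(p) for p in probs if p > 0)

# Hypothetical note probabilities for a raga (illustrative values only).
note_probs = {"Sa": 0.30, "Re": 0.05, "Ga": 0.25, "Ma": 0.10, "Pa": 0.20, "Ni": 0.10}

for note, p in note_probs.items():
    print(f"{note}: P = {p:.2f}, I = {information_content(p):.3f} bits")

print(f"Entropy: {entropy(note_probs.values()):.3f} bits")
```

Note that the weak note "Re" (P = 0.05) yields the largest information content, which is why it cannot be thrown away in the analysis.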
Original language: English (US)
Title of host publication: Computational Music Science
Number of pages: 4
State: Published - 2014
Name: Computational Music Science
Bibliographical note: Publisher Copyright © 2014, Springer International Publishing Switzerland.
- Discrete Random Variable
- Information Content
- Negative Information
- Small Positive Number
- Zero Probability