Abstract
If P(E) is the probability of an event E, the information content of E is defined as I(E) = −log2(P(E)). Events with lower probability therefore carry higher information content when they occur. The probability of a raga note, and hence its information content, depends on the raga concerned. The important notes of a raga will naturally have higher probabilities. On the other hand, a weak note in a raga cannot be discarded either, for it carries more surprise! The strength of entropy analysis lies here (entropy is the mean information content of a random variable).
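The definitions above can be sketched in a few lines of code. This is a minimal illustration, not the chapter's method: the note probabilities below are hypothetical, and the use of a small positive number in place of zero probabilities is an assumption suggested only by the chapter's keywords ("Zero Probability", "Small Positive Number").

```python
import math

def information_content(p, eps=1e-12):
    """I(E) = -log2(P(E)). A zero probability is replaced by a small
    positive number eps so the logarithm stays defined (an assumption)."""
    return -math.log2(max(p, eps))

def entropy(probs):
    """Entropy = mean information content = sum of p * I(p) over all notes."""
    return sum(p * information_content(p) for p in probs if p > 0)

# Hypothetical note probabilities for some raga (illustrative only).
note_probs = {"Sa": 0.30, "Re": 0.10, "Ga": 0.25, "Ma": 0.05,
              "Pa": 0.20, "Dha": 0.05, "Ni": 0.05}

for note, p in note_probs.items():
    print(f"{note}: P = {p:.2f}, I = {information_content(p):.3f} bits")
print(f"Entropy: {entropy(note_probs.values()):.3f} bits")
```

Note how the rare notes (here Ma, Dha, Ni at probability 0.05) have the highest information content, while the entropy averages these contributions weighted by probability, which is the quantity the chapter's analysis rests on.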
Original language | English (US) |
---|---|
Title of host publication | Computational Music Science |
Publisher | Springer Nature |
Pages | 65-68 |
Number of pages | 4 |
DOIs | |
State | Published - 2014 |
Publication series
Name | Computational Music Science |
---|---|
ISSN (Print) | 1868-0305 |
ISSN (Electronic) | 1868-0313 |
Bibliographical note
Publisher Copyright: © 2014, Springer International Publishing Switzerland.
Keywords
- Discrete Random Variable
- Information Content
- Negative Information
- Small Positive Number
- Zero Probability