Raga Analysis Using Entropy

Soubhik Chakraborty, Guerino Mazzola, Swarima Tewari, Moujhuri Patra

Research output: Chapter in Book/Report/Conference proceeding › Chapter

Abstract

If P(E) is the probability of an event E, the information content of the event is defined as I(E) = −log2(P(E)). Events with lower probability carry higher information content when they occur. The probability of a raga note, and hence its information content, depends on the raga concerned. The important raga notes will obviously have higher probabilities. On the other hand, a weak note in a raga cannot be discarded either, for it carries more surprise! This is where the strength of entropy analysis lies (entropy is the mean information content of a random variable).
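The definitions above can be sketched in a few lines of code. The note probabilities below are purely illustrative placeholders (actual values depend on the raga concerned); the functions compute the information content I(E) = −log2(P(E)) of each note and the Shannon entropy of the distribution as its mean information content.

```python
import math

# Hypothetical probability distribution over the notes of a raga.
# These values are illustrative only; real probabilities depend on the raga.
note_probs = {"Sa": 0.30, "Re": 0.10, "Ga": 0.25, "Ma": 0.05,
              "Pa": 0.20, "Dha": 0.05, "Ni": 0.05}

def information_content(p):
    """I(E) = -log2(P(E)): rarer events carry more information (in bits)."""
    return -math.log2(p)

def entropy(probs):
    """Shannon entropy: the probability-weighted mean information content.
    Zero-probability notes are skipped, since p*log2(p) -> 0 as p -> 0."""
    return sum(p * information_content(p) for p in probs.values() if p > 0)

for note, p in note_probs.items():
    print(f"{note}: P = {p:.2f}, I = {information_content(p):.3f} bits")
print(f"Entropy H = {entropy(note_probs):.3f} bits")
```

Note how the weak notes (Ma, Dha, Ni, each at probability 0.05) individually carry the most information, while the entropy averages these contributions across the whole distribution.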

Original language: English (US)
Title of host publication: Computational Music Science
Publisher: Springer Nature
Pages: 65-68
Number of pages: 4
DOIs
State: Published - 2014

Publication series

Name: Computational Music Science
ISSN (Print): 1868-0305
ISSN (Electronic): 1868-0313

Bibliographical note

Publisher Copyright:
© 2014, Springer International Publishing Switzerland.

Keywords

  • Discrete Random Variable
  • Information Content
  • Negative Information
  • Small Positive Number
  • Zero Probability
