**Christian Benvenuti**

**Independent Researcher, Greece**

**Abstract**

There are different definitions of complexity, each answering different types of questions. The usefulness of Shannon entropy as a measure of structural complexity in music has been validated by several studies, and entropy is often a good indicator of a composer’s freedom of choice: the higher the entropy, the greater the freedom with which the composer selected his or her materials, and the higher the uncertainty. A diametrically opposed approach, consisting of entirely subjective interpretations of complexity, has also been employed. Its advantage is that it accounts for the listener’s personal background – everything one has been exposed to may influence how one perceives complexity. However, it makes it difficult, if not impossible, to discuss complexity in more objective terms.

A somewhat intermediate approach relies on algorithmic complexity (AC), or Kolmogorov complexity, which in practice measures how compressible a sequence is. A well-known method is to use Lempel–Ziv compression (as used when generating a zipped file, for example): the higher the compressibility, the lower the information content and the more orderly the sequence. Conversely, the lower the compressibility, the higher the information content and the less predictable the sequence. More formally, the AC of a random sequence is approximately equal to the entropy of the sequence. The rationale behind this approach is the idea that compressible information is simpler than non-compressible information. Daniel Müllensiefen et al. have successfully used AC as a predictor of difficulty in a task involving short rhythmic sequences with primary school children. However, the complexity of longer sequences cannot be adequately quantified with this method, since AC does not consider how complexity varies over time.
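The compressibility intuition can be sketched in a few lines. As an illustrative assumption, the sketch below uses Python’s `zlib` (DEFLATE, a Lempel–Ziv-based compressor, rather than LZW) and treats the compressed-to-original size ratio as a rough proxy for AC:

```python
import random
import zlib

def compression_complexity(seq: str) -> float:
    """Compressed size relative to original size: values near 0 mean an
    orderly, highly compressible sequence; values approaching (or above)
    1 mean an unpredictable, incompressible one."""
    data = seq.encode("utf-8")
    return len(zlib.compress(data, 9)) / len(data)

# A periodic sequence compresses far better than a pseudo-random one.
random.seed(0)                      # reproducible "irregular" material
regular = "abab" * 64               # pure regularity
irregular = "".join(random.choice("abcdefgh") for _ in range(256))
```

Under this proxy, `compression_complexity(regular)` comes out much smaller than `compression_complexity(irregular)`, mirroring the orderly/unpredictable distinction described above.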

I propose a model of compositional variability that can be applied to measure complexity in sequences of any length. To that end, complexity is defined as the entropy of the sequence divided by the variance of Shannon information, calculated according to a time-weighted entropy-difference paradigm. In the proposed model, what we might call ‘effective complexity’ is what happens between two extremes: one of pure regularity and another of pure irregularity (randomness). Effective complexity will therefore be low for pure irregularity, which tends to be cognitively simple (as in white noise, for example), but high for rich combinations of regular and irregular sequences. Relying on the flow of expectation in a sequence, the model will be discussed through musical examples in light of its potential as an analytical tool that takes into consideration both structural complexity and the temporal variation of complexity in a cognition-friendly framework.
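The building blocks of the proposed measure (entropy and the variance of Shannon information) can be illustrated with a minimal sketch. The time-weighted entropy-difference paradigm itself is not specified in this abstract, so this hypothetical example simply uses the empirical symbol distribution of the whole sequence:

```python
from collections import Counter
from math import log2

def shannon_information(seq):
    """Per-symbol surprisal -log2 p(x) under the sequence's own
    empirical symbol distribution."""
    counts = Counter(seq)
    n = len(seq)
    return [-log2(counts[s] / n) for s in seq]

def entropy(seq):
    """Average surprisal over the sequence, i.e. empirical entropy."""
    info = shannon_information(seq)
    return sum(info) / len(info)

def information_variance(seq):
    """Variance of the per-symbol surprisal values: zero when every
    symbol is equally surprising, larger when surprisal fluctuates."""
    info = shannon_information(seq)
    mean = sum(info) / len(info)
    return sum((i - mean) ** 2 for i in info) / len(info)
```

For a sequence over four equiprobable symbols such as `"abcd"`, the entropy is 2 bits and the information variance is zero, whereas an uneven sequence like `"aab"` yields a positive variance; the full model would combine such quantities as a ratio, with time weighting omitted here.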

Keywords: Complexity, Entropy, Variability, Music analysis

**Biography**

**Christian Benvenuti** is a Brazilian composer, teacher, and researcher based in Greece. Benvenuti writes acoustic, electroacoustic and multimedia works, exploring processes and methods that form a duality of determinacy and intuition. He holds a doctorate (PhD) in music from the University of Surrey and was a postdoctoral research fellow at the Federal University of Rio Grande do Sul and at the Federal University of Paraná. Benvenuti is also a consultant to the European Commission in the areas of musical composition, musicology, and musical analysis. His research interests include composition, music information dynamics, communication, music and technology, and information theory.