Cognitive musicology is a branch of cognitive science concerned with computationally modeling musical knowledge with the goal of understanding both music and cognition.
Cognitive musicology can be differentiated from other branches of music psychology by its methodological emphasis: computer modeling, with roots in artificial intelligence and cognitive science, is used to study how music-related knowledge is represented. Computer models provide an exacting, interactive medium in which to formulate and test theories.
This interdisciplinary field investigates topics such as the parallels between language and music in the brain. Biologically inspired models of computation are often included in research, such as neural networks and evolutionary programs. This field seeks to model how musical knowledge is represented, stored, perceived, performed, and generated. By using a well-structured computer environment, the systematic structures of these cognitive phenomena can be investigated.
Even listening to a simple melody engages multiple brain processes working in synchrony. After a sound stimulus enters and is transduced by the ear, it reaches the auditory cortex, part of the temporal lobe, where processing begins with the assessment of pitch and loudness. From there, different aspects of the music are analyzed by different brain regions. For instance, rhythm is typically processed and regulated by the left frontal cortex, the left parietal cortex, and the right cerebellum, while tonality, the organization of musical structure around a central tone or chord, is assessed by the prefrontal cortex and cerebellum (Abram, 2015). Music engages many brain functions that also play integral roles in other higher functions such as motor control, memory, language, reading, and emotion. Research has shown that music can serve as an alternative route to these functions when they are inaccessible through non-musical stimuli because of a disorder. Cognitive musicology also explores how music can provide alternative transmission routes for information processing in the brain in conditions such as Parkinson's disease and dyslexia.
The polymath Christopher Longuet-Higgins, who coined the term "cognitive science", is one of the pioneers of cognitive musicology. Among other things, he is noted for the computational implementation of an early key-finding algorithm. Key finding is an essential element of the perception of tonal music, and the key-finding problem has attracted considerable attention in the psychology of music over the past several decades. Carol Krumhansl and Mark Schmuckler proposed an empirically grounded key-finding algorithm that bears their names. Their approach is based on key profiles that were painstakingly determined using what has come to be known as the probe-tone technique. The algorithm has successfully modeled the perception of musical key in short excerpts of music, as well as listeners' changing sense of key as it shifts over the course of an entire piece. David Temperley, whose early work within the field of cognitive musicology applied dynamic programming to aspects of music cognition, has suggested a number of refinements to the Krumhansl-Schmuckler key-finding algorithm.
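A minimal sketch of this correlational approach to key finding, written in Python, is given below. It assumes the commonly cited Krumhansl-Kessler probe-tone profile values and correlates an excerpt's pitch-class duration distribution with all 24 rotated profiles, choosing the key with the highest correlation; the function and variable names are illustrative rather than taken from any published implementation, and the sketch omits the refinements proposed by Temperley and others.

```python
import numpy as np

# Krumhansl-Kessler probe-tone profiles, indexed from the tonic.
# These are the commonly cited ratings and are assumed here.
MAJOR_PROFILE = np.array([6.35, 2.23, 3.48, 2.33, 4.38, 4.09,
                          2.52, 5.19, 2.39, 3.66, 2.29, 2.88])
MINOR_PROFILE = np.array([6.33, 2.68, 3.52, 5.38, 2.60, 3.53,
                          2.54, 4.75, 3.98, 2.69, 3.34, 3.17])

PITCH_NAMES = ["C", "C#", "D", "D#", "E", "F",
               "F#", "G", "G#", "A", "A#", "B"]


def estimate_key(pc_durations):
    """Estimate the key of an excerpt from a 12-element vector of total
    durations (or counts) for each pitch class, C through B."""
    pc = np.asarray(pc_durations, dtype=float)
    best_key, best_r = None, -np.inf
    for tonic in range(12):
        for mode, profile in (("major", MAJOR_PROFILE),
                              ("minor", MINOR_PROFILE)):
            # Rotate the profile so its tonic aligns with the candidate tonic.
            rotated = np.roll(profile, tonic)
            # Pearson correlation between the input distribution and profile.
            r = np.corrcoef(pc, rotated)[0, 1]
            if r > best_r:
                best_key, best_r = f"{PITCH_NAMES[tonic]} {mode}", r
    return best_key, best_r


if __name__ == "__main__":
    # Hypothetical input: pitch-class durations for a short excerpt drawn
    # from the C major scale, with the tonic and dominant emphasized.
    durations = [4, 0, 2, 0, 2, 2, 0, 3, 0, 2, 0, 1]
    print(estimate_key(durations))  # expected to favor C major
```

Tracking a listener's changing sense of key through a piece can then be approximated by applying the same correlation to a moving window of pitch-class durations rather than to the piece as a whole.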