How music affects the perception, segmentation and interpretation of film, music and language and creates meaning.
Speaker at the annual conference of the Society for Cognitive Studies of the Moving Image (SCSMI)
Roanoke, Virginia, USA
2-5 June 2010
This project therefore aims to investigate the textual interaction of picture, language and music. The semantics of music, the effects of language and the semiotics of film combine into a composite of meaning carriers, and their effects depend as much on their individual parameters as on their interconnections. The ability of one mode to shift the meaning attributed to content in another mode is the basis of the phenomenon of anchoring described for the image-text relationship by Roland Barthes. This has not yet been pursued for music, because the kind of anchoring conceived of by Barthes was essentially concerned with the content of image or text; for music-film cross-anchoring, a structural anchoring will be examined that may prove of considerable utility for theoretical accounts of textuality. Music will be considered as an additional structuring device for the meanings constructed in film. By experimental cross-variation of musical parameters, of filmic structure, of the filmically portrayed events, and of the verbal language employed, it will be shown to what extent the meaning of a film as a whole must be seen as a product of the contributing semantic possibilities of all the semiotic modes employed. The outcome of this research is intended to be a significant contribution to our understanding of the interplay between filmic event perception and music on the one hand, and the meaning-creating effects of music on the other. We predict that this interplay will be best described by an explicit account of the semantics of music in terms of segmentation, dependencies, similarities, rhythm and emotional tone.