Table of Contents
List of Tables
List of Figures
Music to Visual Arts
Scope and Limitations
CHAPTER 1: Visual Arts and Music
1.1 Visual arts
1.1.1 Basic Elements of Visual Arts
1.1.2 Space in Visual Arts
1.2 Music
1.2.1 Basic Elements of Music
1.2.2 Time in Music
1.3 Formal to Material Comparisons
CHAPTER 2: Music to Visual Arts
2.1 Sensations, Perceptions, and Conceptions
2.1.1 Music Cognition
2.1.2 Visual Cognition
2.2 Cognitive Correspondences
CHAPTER 3: Visualisation of Musical Sounds in Graphic Images
3.1 Motion and Emotion
3.1.1 Kinetic Cues in Visual Arts
3.1.2 Emotional Cues in Visual Arts
3.2 Visualisation of Musical Sounds in Graphic Images
CHAPTER 4: Conclusion
Table 1. Classification of visual arts according to mediums of works
Table 2. Emotional cues of music according to formal musical elements and five basic emotions
Table 3. Emotional cues of visual arts according to basic visual cues and four basic emotions
Table 4. Correspondences between visual emotional cues and musical emotional cues
Table 5. Visual emotional cues that are expected to induce basic emotions
Figure 1. Visible spectrum of light
Figure 2. Colour wheels
Figure 3. Value of light
Figure 4. Saturation of a colour
Figure 5. A two-dimensional coordinate plot of a melody line
Figure 6. Time-to-amplitude plot of a single-frequency sound wave
Figure 7. Time-to-amplitude plot of a complex-frequency sound wave
Figure 8. Time-to-amplitude plots of FM and AM vibrato
Figure 9. Frequency-to-amplitude plot of a sound wave
Figure 10. Takete and Maluma
Figure 11. Kinetic visual cues based on the four characteristics of neural activities in motion perception
Figure 12. Current, Bridget Riley 1964
Figure 13. Three visual cues for attention enhancement
Figure 14. Examples of graphic images for visualising musical sounds
I would like to express my special thanks and deep appreciation to my supervisor Asst. Prof. Dr. Elif Songür Dağ for her support, encouragement, and insightful advice, not only during the development of this thesis but throughout my entire time at the university. I would also like to thank Asst. Prof. Dr. Ayşe Öztürk and Assoc. Prof. Dr. Lerzan Aras, without whose generous help and mentorship this work would never have seen the light of day. Furthermore, I want to show my appreciation to all the current and former lecturers at the Faculty of Fine Arts, Cyprus International University, with whom I was honoured to work as a research assistant, including our beloved Prof. Dr. Oleg Nikitenko, who now, to our great sorrow, rests in eternal peace. My time working with them has inspired and encouraged me to aspire to more, both academically and in life. My wonderfully enjoyable and encouraging colleagues, with whom I shared my time as a research assistant and whom I cherish as friends, also deserve my profound appreciation for being a strong buttress for me in times of hardship. Moreover, I am deeply thankful to my friends from various parts of the world for being my indispensable and precious supporters, whether they knew it or not.
Last, but most importantly, I would like to express my special thanks and utmost gratitude to all my family members, who have provided me with unconditional love and support throughout my life. My appreciation for them is more than words can say.
The purpose of this study was to explore the possibility of visualising musical sounds in graphic images from an objective viewpoint. In opposition to the dominant methodological approach to human cognition of the last century, the study initially rejected an idealistic approach by focusing mainly on literature from the field of cognitive neuroscience. Moreover, in order to discard a realist viewpoint, the overt dialectical counterpart of idealistic subjectivity, it was shown through an analysis of the formal elements of visual arts and music that there is no legitimate materialist connection between them. Nevertheless, although visual arts and music are materially antagonistic, the human cognitive processing of them elucidated two important connections between visual and musical cognition: motion and emotion. It was suggested that, considering the three main strata of the human cognitive system, motion could connect the cognition of visual arts and music in the bottom-up processing of sensory stimuli, whereas emotion could connect them in top-down processing. In order to elicit a musical cognitive experience in an observer of graphic images, therefore, kinetic as well as emotional visual cues that correspond to their respective musical cues were proposed as important elements of the visualisation of musical sounds. However, due to the paucity of in-depth information on the neurobiological processing of both motion and emotion, of the latter more particularly, previous empirical research had to be utilised. Specifically, it was suggested that the kinetic visual cues in static graphic images that would cognitively elicit temporal perception include the spatial configuration of visual elements in a centrifugal direction, perceptually attentive elements, and the number of attentive locations.
On the other hand, it was suggested that the emotional visual cues corresponding to emotional musical cues include the value and saturation of visual elements and the contour and edge configuration of shapes. Other visual emotional cues could be inferred to be useful by considering the material characteristics of light and sound. Hues were suggested as effective emotional cues for connecting visual and musical perceptual experiences, although they were said to show subjective variances.
Keywords: Cognition, Visual Arts, Music, Objectivity, Graphic Images
The purpose of this study is to explore the possibility of visualising musical sounds in graphic images from an objective viewpoint. In opposition to the dominant methodological approach to human cognition of the last century, this study first rejected the idealistic approach by focusing mainly on the literature of cognitive neuroscience. Furthermore, in order to discard the realist viewpoint, which is the overt dialectical counterpart of idealistic subjectivity, it was shown through an analysis of the formal elements of visual arts and music that there is no legitimate materialist connection between them. Nevertheless, although visual arts and music are opposed to each other, the human cognitive processes reveal important connections between the cognition of visual arts and music: motion and emotion. Considering the three main strata of the human cognitive system, it was proposed that emotion connects the cognition of visual arts and music through top-down processing, while motion connects them through the bottom-up processing of sensory stimuli. In order to elicit a musical cognitive experience in an observer of graphic images, emotional visual cues corresponding to musical cues, together with kinetic ones, were proposed as important elements in the visualisation of musical sounds. However, due to the scarcity of information on the neurobiological processing of both emotion and motion, particularly the latter, previous empirical research had to be utilised. Specifically, it was proposed that kinetic visual cues in static graphic images, comprising the spatial configuration of visual elements in a centrifugal direction, perceptually attentive elements, and the number of attentive locations, could cognitively elicit temporal perception. On the other hand, it was proposed that emotional visual cues corresponding to emotional musical cues include the value and saturation of visual elements and the contour and edge configuration of shapes. Other emotional cues could be inferred to be useful by considering the material characteristics of light and sound.
Hues, although said to show subjective variances, were proposed as effective emotional cues for connecting visual and musical perceptual experiences.
Keywords: Cognition, Visual Arts, Music, Objectivity, Graphic Images
Since ancient times, music has continually influenced visual artists, enticing them into seeking artistic inspiration for their works. Numerous visual artists and theoreticians of Western societies have, in various ways, tried to elucidate the relationship between music and visual arts in a logical manner. Overall, looking at the historical enthusiasm of visual artists in their attempts to liken their art to music, there seem to be two interlocking characteristics of music that have precipitated this interdisciplinary pursuit, namely, its sensuous effectiveness and its abstract orderliness.
One of the significant characteristics of music is the strong emotional effect it can induce in listeners. That is to say, as most people's empirical knowledge would endorse, music possesses a sensuous effectiveness in causing emotional responses in listeners, and can even change their behaviours (Vergo 2005: 17). Plato, in Book 3 of the Republic, warned against the use of music played by untrained men in order to avoid jeopardising the Ideal State, noting the profound emotional effects, both positive and negative, that different harmonic scales, rhythms and instrumental sounds have on a listener's mind. It was perhaps in the natural course of events, then, that visual artists referred to music in order to achieve, in their own visual works, emotional effects as strong as those music could exert on listeners.
Nevertheless, the sensuosity of music was not left untouched in artistic and intellectual communities as a token of its esoteric or inscrutable nature. Rather, in order to be theorised, it was linked in logical manners to the other significantly important characteristic of music, that is, the abstract orderliness of its elements. Boethius, a Latin author who lived around 500 AD, thought that “despite its overt appeal to senses”, the orderliness of music was what made it a divine art, as it is the “manifestation of the same mathematical laws that gave order to cosmos and governed the heavenly bodies” (Vergo, 2005: 99). By dint of this mathematical orderliness, in fact, music had long been counted among the seven liberal arts, which had been regarded as the most sophisticated and theoretical pursuits among Western intellectuals since classical antiquity, whereas visual arts such as painting, sculpture and architecture were not ascribed the same intellectual value. As a consequence, by the Renaissance era, visual artists were frustrated with the debased status of their vocations compared with the unquestioned superiority that music enjoyed as a member of the sublime liberal arts, and tried to overcome this intellectual slight by pointing “to their supposed affinities with music.” An example is the proliferation of perspective drawing based on geometry, another of the liberal arts that was often associated with music.
Being on par with music thus required the elucidation of the mathematical and orderly basis on which the principles of visual arts were founded, a basis that would give rise to sensuous pleasure in the appreciation of works of visual arts, as it was believed to do in music. Renaissance architects and architectural theorists, for instance, often considered music's affective nature a result of its abstract orderliness, claiming there must also be a “visual harmony capable of being appreciated in purely sensuous terms” (Vergo 2005: 158). Architecture, which by its physical nature has to deal with the proportional balance of its elements not only for aesthetic but also for structural purposes, referred to the orderly proportions of musical elements such as the harmonic ratios of the intervals of a fifth, fourth or octave. In painting, the sensuosity of music was often associated with colours, where each colour was compared with different musical modes that in turn were associated with emotional effects. Several artists and scientists have mentioned the correspondence between the seven colours of the light spectrum and the seven tones of the octave, which is counted among the perfect intervals and used in much Western music characterised by its emotional potential. It was under this influence that the concept of the colour organ, a mechanical instrument that produces sounds and corresponding colours at the same time, was conceived and later realised.
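The harmonic ratios of the fifth, fourth and octave mentioned above are simple whole-number proportions, and their arithmetic can be illustrated with a minimal sketch. The 440 Hz reference pitch and the function name are illustrative assumptions, not part of the historical sources discussed here:

```python
# Frequency ratios of the "perfect" intervals referred to by
# Renaissance theorists (just intonation): octave 2:1, fifth 3:2, fourth 4:3.
INTERVALS = {
    "octave": 2 / 1,
    "fifth": 3 / 2,
    "fourth": 4 / 3,
}

def interval_frequency(base_hz, interval):
    """Return the frequency of a tone at the given interval above base_hz."""
    return base_hz * INTERVALS[interval]

# Example: intervals above A4 = 440 Hz (modern concert pitch, chosen
# here purely for illustration).
for name in INTERVALS:
    print(name, round(interval_frequency(440.0, name), 2))
```

The same proportions were prized in architecture precisely because they are medium-independent: a 3:2 ratio can govern string lengths or room dimensions alike.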
The artistic aspiration towards the incorporation of music into visual arts became ever more pronounced towards the 19th century, with the advent of Romanticism. In the Romantic era, the crisscrossing of artistic fields was not limited to visual arts and music, but involved all forms of the arts, including literature and poetry. Accordingly, the attention given to mathematical, or in a sense physical, approaches towards both music and visual arts was gradually abandoned and shifted to more dis/propositional, that is, narrative aspects of both music and visual arts, in the light of subjective empiricism. An impetus behind this was the philosophical reaction of Romantic intellectuals against the dominant rationalism of the preceding era, which led, in the field of music, to the idea that music's affective power could not be explained in the language of reason alone (Vergo, 2010: 8). Under the prevailing Romantic belief in prioritising the irrational aspects of human beings, such as emotions and sensations, visual artists' attempts to incorporate music into their works accordingly tended to proceed through “the translation of emotions from music into visual terms”, as Fantin-Latour did with the music of Wagner (Wyzewa, cited in Vergo, 2010: 39).
It was in the 20th century that the relationship of music and visual arts began to enter a new dimension. This was due in part to the emergence of new genres and fields and the accompanying ramification of movements in both music and visual arts, but also to the turbulent technological, social and philosophical developments behind them. In music, starting from the growing popularity of jazz and blues, technology precipitated the further expansion of genres such as electronic and experimental music. Evolving social norms and philosophical notions encouraged contemporary musicians such as John Cage, Earle Brown and Philip Glass to radically change the concept of music by taking unprecedented auditory elements, such as environmental sounds, noise and silence, into their compositions. In visual arts, not only have miscellaneous expressional forms emerged, such as Dadaism, Futurism, the International Style of modernism and the postmodernism that followed, to name a few, but the mediums of expression they use have expanded astronomically by dint of technological development. The rapid growth of human societies has brought with it, along with unforeseen turmoil, new perspectives on modes of vision and visuality, accompanied by heated philosophical discussions, but also, equally importantly, technological possibilities that have enabled those with formerly infeasible concepts to ardently
experiment on them. Since the 1960s, many artists have started working in the artistic field often called intermedia, an interdisciplinary field that makes use of different fields of the arts. The visualisation of musical sounds has once again become one of the hot topics in contemporary art, and the related literature and works have proliferated. Those works span from electronic music visualisers, examples of which can be seen in Windows Media Player, iTunes and other computer graphics programs that accompany music, to dance performances and installation works that aim to synthesise visual and auditory experiences. Accordingly, more and more artists have spoken about their thoughts on the intersection of visual arts and music.
Nevertheless, despite all these advancements, there seems to be almost no definite knowledge of how it is possible to perceptually visualise musical sounds in visual works, no objective model that artists can consult in order to engage in creative works in the field of visual arts. Looking at the social and philosophical atmosphere of the 20th century, it seems to be the lingering influence of the Romantic prioritisation of irrational sensuousness over mathematical rationality (Brassier 2007: 97) that is hindering the postulation of such an audacious model of the cognitive visualisation of musical sounds. However, the recent development of neuroscience, or, as it is sometimes called, neurobiology, has elucidated crucial clues to understanding, from an objective viewpoint, how human brains work not only in visual and auditory perception but in cognition in a holistic sense, in experiencing the surrounding world. It is in this line of thought that this thesis will once again tackle the issue of the visualisation of musical sounds from an objective viewpoint.
Moreover, it seems highly important for graphic designers as well to seek the possibility of a visualisation of musical sounds that is based on an objective study of human cognition. In the contemporary era, graphic design is a classless commodity; it is omnipresent, taking manifold forms and mediums. For this reason, whatever formal approach and expressive medium a designer chooses, the most valued aspect of a design work is more or less its aesthetic achievement (FitzGerald and VanderLans 2010). However, as graphic design is not just a visual art but also a communication art, its aesthetic achievement is judged in both visual and rhetorical aspects. Nevertheless, as the rhetoric graphic design wields is visual rhetoric, not that of literature in the strict sense, its focus must be directed at its aspect as a visual art. Considering the profound affective power of music that has fascinated a host of artists and connoisseurs since ancient times, and the extensive permeation of various genres of popular music into people's daily lives since the 20th century, incorporating music into design works can be a powerful aid to their aesthetic achievement.
Problematically, most attempts to visualise musical sounds in the contemporary era tend to do so in mobile mediums such as animations, films or computer graphics. However, if a study of the visualisation of musical sounds in terms of cognitive correspondences between the elements of visual arts and music can bring about a consistent result, it will engender, by cutting across the various types of mediums and fields, a new approach for visual artists, graphic designers included, to achieve aesthetic exaltation. It is for this reason that the present thesis will seek a way of visualising musical sounds in static graphic images.
The eventual aim of the present thesis is to propose a design approach for visualising musical sounds in static graphic images. In order to do so, the thesis will elucidate inherent correspondences in the cognition of visual and musical elements from an objective viewpoint. The argument will be developed according to the following assumptions:
1. there are correspondences in visual and auditory cognitions;
2. these correspondences can be shown to exist by induction from neuroscientific studies of visual and auditory cognitions;
3. auditory sensations overlap with visual sensations, but are not themselves perceivable (they occur at an unconscious level);
4. kinetic perceptions are required for them to become perceivable;
5. the resulting perceptions vary according to each perceiver's empirical knowledge, but can be generalised.
As the thesis supposes an objective third person as the observer of visual art and music works, works whose physical conditions would possibly cause empirical differences in their cognition must be avoided in the argument as far as possible. Therefore, the comparison of visual and auditory cognition will begin with a primary focus on the basic formal elements of visual arts and music in general. In this regard, various field-, genre- and medium-specific differences, as well as the physical characteristics of various mediums, are intentionally omitted from the comparison. Moreover, narrative contents, as well as elements that directly deal with narrative contents, such as text, lyrics and the human voice, are largely ignored in order to avoid complication.
The present thesis takes a qualitative approach. Most of the forthcoming argument is based on materials gathered from published and unpublished studies in neuroscience/neurobiology on human cognition in its various forms. Nevertheless, even though this thesis will intentionally set aside the historical arguments on the relationship between visual arts and music, it will occasionally draw on information not only from previous studies and theories in both visual arts and music but also from philosophical discussions of human cognition, due to the paucity of information in the above-mentioned field of science, which is still a relatively new and emerging one.
Grasping visual arts and music in an objective sense requires a categorisation of the arts into a manageable list. However, the list of names of fields and genres in both visual arts and music is extensively vast in the contemporary era, and the borders between them are often ambiguous. A categorisation of miscellaneous forms of the arts into a specific few therefore seems baffling and is bound to entail heated controversy. This categorical complexity is largely due to the existence of multitudinous ways of approaching works of the arts from various interdisciplinary viewpoints, which can be seen as the remaining influence of Romantic idealism.
Here, by and large, for the sake of an adequately objective viewpoint in an era of rampant phenomenological approaches to the study of human cognition, the notions of space and time seem to bear the utmost immediacy (Brassier 2007, Bryant et al. 2011, Meillassoux 2008). Indeed, by looking, from a spatiotemporal viewpoint, at the ways both visual arts and music happen to an observer, establishing a materialist distinction between visual arts and music seems possible.
The term “visual arts” includes various forms of the arts that often intersect with one another, including graphic design, whose mediums of visual expression span, in a broad sense, from those used in painting, sculpture and film to computer graphics. However, one thing must be agreed upon as a prerequisite for an art form to be regarded as a visual art, and it automatically entails a critically important condition: a visual art must be physiologically visible to human beings, and therefore its medium of expression must exist in physical space.
Music itself, on the other hand, is a purely temporal art form. According to John Blacking, music is best understood as sound that is humanly organized (Blacking 1973). That is to say, music does not involve the spatial organisation of physiologically visible elements, since sound itself is merely an oscillating mechanical wave travelling through the air. Although some art forms that can be listed as examples of visual arts, that is, those which have visual elements, such as dance, theatre and opera, are often associated with music as well, the physiologically observable visual elements in them are not sounds themselves but the objects that produce sounds, and must therefore be differentiated from music itself. Hence, music itself can be defined here as a sheerly temporal form of the arts that does not possess a physiologically visible object as its medium.
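The claim that sound is merely an oscillating mechanical wave can be made concrete with a short sketch: a pure tone is fully described as amplitude varying over time, with nothing spatial or visible in the description. The function name, sample rate and duration below are arbitrary illustrative choices, not drawn from the sources cited here:

```python
import math

def sine_wave(freq_hz, duration_s, sample_rate=8000, amplitude=1.0):
    """Sample a single-frequency pressure wave y(t) = A * sin(2*pi*f*t)."""
    n = int(duration_s * sample_rate)
    return [amplitude * math.sin(2 * math.pi * freq_hz * t / sample_rate)
            for t in range(n)]

# A 440 Hz tone sampled for 10 ms: the "work" exists only as values
# ordered in time, which is what makes music a purely temporal medium.
samples = sine_wave(440.0, 0.01)
print(len(samples))  # 80 samples
```

This is also the representation underlying the time-to-amplitude plots listed among the figures (Figures 6 to 8).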
Granted, visual arts can be roughly defined as spatial arts, and music as a temporal art. However, it must be noted here that both “spatial” and “temporal” above must be understood exclusively as physical characteristics of the works, as the distinction is based on the ways the mediums of the various arts happen to an observer in physical space and time. Therefore, they do not derive their spatiality or temporality from the spatial and temporal perceptions that conceptually and empirically arise in the observer's mind with regard to the dis/propositional contents of miscellaneous works of the arts. Indeed, the latter kind of spatiality and temporality, caused by the dualistic correlation of the perceiver and the perceived, is deemed inevitably intersubjective in the dominant contemporary view of human cognition (Bryant et al. 2011, Meillassoux 2008, Scruton 1997). To emancipate itself from this idealist conundrum, the thesis will in this chapter first take a materialist approach in order to look at both visual arts and music objectively. However, objectivity must be understood here not as that of the Cartesian opposition of subject and object, but as that which aims to be devoid of the phenomenological discretion of human cognition, which can more or less be described as the psychological localisation of individual perceptual events. Therefore, works of visual arts and music are considered here as the physical causes of cognition, residing in the physical outer world and separate from one's experiencing of them. For this, mathematical accountability is set as the criterion of objectivity (Meillassoux 2008). Methodologically, this chapter mainly aims to extract the material characteristics of visual arts and music from the conventional formal analysis of both, which may be called the most abstract phenomenological knowledge left over by previous investigations into the two subjects.
As already mentioned, the indispensable prerequisite for a work of art to be regarded as a work of visual art is that it have a physiologically visible element in physical space. It is therefore crucial to define what can possibly be regarded as physiologically visible, or in other words as a physiological vision, for a human being, in order to approach visual arts from an objective viewpoint. In this regard, this subchapter aims to define the physical characteristics of works of visual arts, followed by an enumeration of their basic elements and an explication of the use of space in them. However, these characteristics drastically change when the works are to be kinetically perceived, by dint of the subjective correlation between the works and the observer. Hence, before delving further into the characteristics of works of visual arts in terms of the actual process of their cognition, a categorisation of visual arts from a materialist perspective will be provided in the following, with nominal regard to the general cognition of them.
First and foremost, as can be seen from the colloquial ways it is used in daily conversation, the word “vision” can be grasped in roughly two ways:
1. a visual image that is seen by the eye of the observer;
2. a conceptual image that arises in the mind of the observer.
Importantly, whereas the former vision corresponds to a physical material image in front of the observer's eye, in the latter case the arousal of the vision is catalysed by various narrative or non-material means, including imagining, dreaming and remembering; the resulting vision is thus psychological rather than physiological (Dikovitskaya 2005). Therefore, it can be said that for something to be physiologically visible, the existence of a physical material image is an indispensable requirement.
Indeed, in many works of visual arts, both aspects of “vision” explained above can readily be seen as intertwined. One of the conspicuous traditional factors of visual arts in general is the notion of representation (Kepes 1995). In visual arts, the visual images created in works span from the representational to the non-representational (Zelanski and Fisher 2005). Notably, there seem to be at least two different functional roles of the notion of representation in works of visual arts (Chandler 2002, Crow 2003, Zelanski and Fisher 2005, Wallschlaeger and Busic-Snyder 1992). Namely, representation can serve as:
1. an aesthetic means of visual expression, or;
2. a semiotic means of visual rhetoric.
Here, while the former enables a formalist perspective on works of visual arts, the latter entails the conceptual vision caused by the contents of visual representations, and thus necessitates a literary perspective on the works. That is to say, whereas in the former notion of representation the degree of visual representation is a matter of aesthetic choices in depicting the formal characteristics of the appearance of things in reality, in the latter the degree of representation accords with the conceptual contents of visual images, which are to be interpreted as signified information by the observer (Crow 2003).
Importantly, in works of graphic design, both the aesthetic and the semiotic aspects of visual representation seem highly important. On the one hand, graphic design is normally counted as a form of applied arts, together with industrial design, interior design, architecture and the like, whose first purpose is “to serve some useful function” (Zelanski and Fisher 2005: 25). As is apparent from the fact that graphic design is popularly defined as a commercial activity (FitzGerald and VanderLans 2010), it can be said that the primary function of graphic design is to achieve effective visual communication with the general public in order for graphic designers, or more specifically for their clients, to attract consumers to their products. In this sense, visual rhetoric based on semiotic elements, usually adapted from literary theory, is often utilised for effective visual communication (FitzGerald and VanderLans 2010).
On the other hand, the aesthetics of works of graphic design can also be regarded as an inevitably important aspect of them. That is, apart from the formal aesthetics of graphic design works, their rhetorical contents also rely on visual images, and can thus be approached from a formalist perspective by utterly ignoring those contents. In this sense, the most valued aspect of a design work can be said to be, more or less, its aesthetic achievement (FitzGerald and VanderLans 2010). Consequently, by focusing on the material forms of the works, graphic design can be regarded as a legitimate form of visual “art.” In this line, furthermore, typographic elements, which are frequently incorporated into various graphic design works, can accordingly be distinguished into the literary contents of the words and the formal appearance of the letters as images (Zelanski and Fisher 2005). Considering the discussion provided so far, the thesis will hence focus exclusively on the formal characteristics of physiologically visible images in works of visual arts in general, jettisoning the semiotic approach that inevitably necessitates a literary analysis of the contents of the images.
Now, from a materialist perspective, the physical characteristics of works of the various forms of visual arts, or in other words their mediums, can be classified according to two conspicuous characteristics (see Table 1). First, in terms of production, any work of visual arts can be said to be produced either on a 2-dimensional medium or in/on a 3-dimensional medium. Second, in terms of the temporality of the medium, visual art works can be either static or mobile. Graphic design, whose medium of works differs among graphic designers and their fields of communicational activity, can be classified more or less as a 2-dimensional visual art (Zelanski and Fisher 2005). Furthermore, although the temporality of interactive design works does not correspond to the linearity seen in other temporal visual art works, in that the former often requires interaction between the work and the observer, the thesis excludes them from the group of static visual arts to avoid complication. Notably, as the thesis aims to apply the results of the comparison of visual arts and music to static graphic design images, in what follows, the aspects of static visual arts will mainly be argued.
illustration not visible in this excerpt
Table 1. Classification of visual arts according to mediums of works. Drawn by Matsumoto, D (2014).
It must be noted here that, in order to concisely list the basic elements of visual arts and explain their characteristics, the demarcation between the elements discussed in the following is conceptual rather than physiologically visual. That is to say, while a point, for instance, has no dimensionality in terms of Euclidean geometry, being a purely mathematical concept, when it is represented in the medium of a work of visual arts it must possess shape, tone, and size in order to be seen (Wallschlaeger and Busic-Snyder 1992). Therefore each of the basic elements listed in the following must be thought of as a formalist theoretical concept of visual arts, which in an actual work will be represented, or more particularly, materialistically, visualised, as a visible image that an artist/designer chooses to create, according to the aesthetics of the image, in a physiologically visible medium. Consequently, this subchapter aims to elucidate the physiologically visible formal elements of visual arts with materialist consideration.
Drawing from several literature sources, the thesis defines point, line, plane/shape, volume/form, texture, and colour/light as the basic formal elements of visual arts (Wallschlaeger and Busic-Snyder 1992, Zelanski and Fisher 2005). Considering their characteristics, the elements can be divided into the following three groups:
1. dimensional elements;
2. tactile elements;
3. visible elements.
For this, point, line, plane/shape, and volume/form fall into the first group, texture into the second, and colour/light into the third.
(1) Dimensional Elements
A point can be defined as the most minimal element of visual arts. Although it conceptually has no dimension, upon visualisation it attains dimensional attributes such as shape and size in order to be seen. According to aesthetic representational choices, it can be visualised in various forms. In Euclidean geometry, it is considered a generator of all forms (Wallschlaeger and Busic-Snyder 1992).
A line is conceptually regarded as a point in motion (Wallschlaeger and Busic-Snyder 1992). A line possesses only one dimension, as it can be conceptualised by its length, albeit not by width. Upon visualisation, it must have a certain width in order to be physiologically visible.
A plane is generated by moving a line in a direction different from that in which the point was moved (Wallschlaeger and Busic-Snyder 1992). A plane has length and width, and can thus be thought of as a two-dimensional conceptual element. In order for a plane as a conceptual element to be visualised in a work, shape must be taken into account. Shape primarily refers to the contour or planar configuration of a plane, figure, or object, and is usually used in terms of two-dimensional visual elements.
When a plane is moved in yet another direction, volume is generated (Wallschlaeger and Busic-Snyder 1992). A volume has length, width, and depth, and is therefore a three-dimensional conceptual element. When a volume as a conceptual element is visualised in a work, form is to be considered. Form primarily refers to the surface contour or volumetric configuration of a volume or a three-dimensional object. In a two-dimensional work of visual arts, volume can be represented by manipulating other basic elements to form an illusory three-dimensional visual image. That is to say, a three-dimensional form in a two-dimensional medium is in reality visualised, or represented, with two-dimensional shapes. In a three-dimensional work of visual arts, volume is usually visualised by an actual material object that is physically three-dimensional.
To sum up, it can be said that, depending on the aesthetic representational choices, the dimensional elements of visual arts require illusory representation when they are visualised in a two-dimensional medium. Therefore, materialistically, that is, irrespective of the notion of representation, all those formalist elements that pertain to dimensionality must be visualised as two-dimensional images in an actual work of visual arts, in order to be physiologically seen by a human eye on the two-dimensional medium of the work.
(2) Tactile Elements
Texture refers to the surface quality of a two-dimensional plane or of the contour plane of a three-dimensional form. Texture can be either actual or visual. Actual texture is the physical surface quality of a physical object that is tactilely but not visually perceived, such as that of fine-grained canvas, coarse wood, smooth stone, or fluffy rag. Visual texture, also called simulated texture in visual arts, refers to physiologically visible surface quality. When visualised in a work of visual arts, visual texture can be representational of actual texture, or can be an abstract configuration created with other visual elements, depending on an aesthetic choice. In either case, the real texture of an image in a work is in reality the actual texture of the surface of the medium, irrespective of the visual texture represented on it.
Thus, it can be said that the texture represented in a work must be regarded as consisting of two-dimensionally visualised images of various shapes. Moreover, although some works may utilise the actual texture of their mediums as a part of the intended visual expression, such texture is supposed to be physiologically seen but not touched. Therefore, in this case, only the visual characteristics of actual texture are to be considered as the visual element.
(3) Visible Elements
The last element, colour, bears conspicuous aesthetic immediacy in many works of visual arts. Importantly, colour is closely related to light, as it is in fact conceived of as light of certain wavelengths. In order to be physiologically visible as a colour, light must either be reflected off the surface of an object or refracted through the surface of a medium from a light source. The range of wavelengths of light that a human eye can physiologically see is only a fraction of the spectrum of electromagnetic radiation, and is therefore called the visible spectrum (Zelanski and Fisher 2005). Notably, there are three primary properties of colour/light: hue, value, and saturation.
Hue refers to the wavelength properties of light, which, according to their visible characteristics, are given names such as red, blue, or yellow. In the visible spectrum, the hue of the shortest wavelength is usually named violet, while that of the longest is red (Figure 1).
illustration not visible in this excerpt
Figure 1. Visible spectrum of light. Drawn by Matsumoto, D (2014).
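The conventional naming of hues by wavelength range can be sketched numerically. The following is a minimal illustration; the band boundaries (in nanometres) are approximate conventions commonly given for the visible spectrum, not values taken from the cited sources.

```python
# Map a wavelength in nanometres to a common hue name.
# The band boundaries are approximate conventions for illustration only.
def hue_name(wavelength_nm):
    bands = [
        (380, 450, "violet"),
        (450, 495, "blue"),
        (495, 570, "green"),
        (570, 590, "yellow"),
        (590, 620, "orange"),
        (620, 750, "red"),
    ]
    for low, high, name in bands:
        if low <= wavelength_nm < high:
            return name
    return None  # outside the visible spectrum

print(hue_name(400))  # violet (short wavelength)
print(hue_name(700))  # red (long wavelength)
```

As the sketch shows, only a narrow range of wavelengths maps to a named hue at all; anything outside roughly 380–750 nm falls outside the visible spectrum.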
In visual arts, the use of different colours in a work is often facilitated by a theoretical colour notation system. For this, the colour wheel, originally theorised by Johannes Itten, is widely used today. In a colour wheel of refracted light, red, green, and blue are defined as the primary hues (Figure 2, right). When two primary hues are mixed together, secondary hues are created. A primary hue and a juxtaposed secondary hue are mixed to create a tertiary hue. In a colour wheel of reflected light, which is the most popularly used among visual artists, the primary hues are defined as red, blue, and yellow (Figure 2, left). Secondary and tertiary hues can be accordingly created with these three hues. In visual arts, whereas colours of refracted light can be applied in a work by using artificial light sources or transparent coloured materials, such as in a stained glass work, those of reflected light necessitate pigment substances on the surface of the medium of a work, off which light is reflected.
illustration not visible in this excerpt
Figure 2. RYB and RGB colour wheels. Drawn by Matsumoto, D (2014).
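The mixing of primary hues of refracted light into secondary hues can be sketched as follows. This is an illustrative simplification in which light intensities, given as (R, G, B) triples in the 0–255 range, are simply summed and clipped; it is not drawn from the cited sources.

```python
# Additive (refracted-light) colour mixing: combining two primary light
# colours yields a secondary hue. Intensities are summed per channel and
# clipped to the displayable 0-255 range.
def mix_light(c1, c2):
    return tuple(min(255, a + b) for a, b in zip(c1, c2))

RED, GREEN, BLUE = (255, 0, 0), (0, 255, 0), (0, 0, 255)

print(mix_light(RED, GREEN))   # (255, 255, 0) -> yellow
print(mix_light(GREEN, BLUE))  # (0, 255, 255) -> cyan
print(mix_light(RED, BLUE))    # (255, 0, 255) -> magenta
```

Note that mixing pigments (reflected light) is subtractive and behaves quite differently, which is why the RYB wheel has different primaries.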
The quality of the relative lightness or darkness of an area is described as value. Although value is conceptually a matter of quality to be applied in a work, physically it is a matter of the amount of light in the environment. The value of an area can span the range from very light to very dark, the lightest being white and the darkest being black in actual works. The values that in reality result from the variation of lights and shadows in the physical environment are called local values. Conceptually, values are usually grasped on a value scale (Figure 3). Differences of value can be seen most conspicuously when colours/hues are subtracted from an image, such as in black and white, or achromatic, photographs. Theoretically, when lights of all hues of the same value are mixed, the resulting colour is achromatic white, whereas the absence of any light results in achromatic black.
illustration not visible in this excerpt
Figure 3. Value of light. Drawn by Matsumoto, D (2014).
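The reduction of a colour to its achromatic value can be illustrated with a small calculation. The channel weighting below is one common luma convention (that of ITU-R BT.601), assumed here purely for illustration; other weightings exist.

```python
# Reduce an (R, G, B) colour in 0-255 to a single achromatic value,
# using the BT.601 luma weights as one common convention.
def value_of(rgb):
    r, g, b = rgb
    return round(0.299 * r + 0.587 * g + 0.114 * b)

print(value_of((255, 255, 255)))  # 255 -> white (full light)
print(value_of((0, 0, 0)))        # 0   -> black (absence of light)
```

This mirrors the point above: full light of all hues yields white, and the absence of light yields black, with all other values falling between.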
Lastly, the saturation of a colour is considered as the relative amount of hue in relation to the value of the colour (Figure 4). When the hue is subtracted from a colour, the result is an achromatic gray of the same value as the original colour. Therefore, saturation and value can be said to be closely related. Notably, the most saturated hue of each colour in a colour wheel has a different value. In order to overcome this inability of a colour wheel to visualise hues of the same value and saturation, Albert Munsell created another colour notation system, called the colour sphere, which is as widely used today as the colour wheel.
illustration not visible in this excerpt
Figure 4. Saturation of a colour, i.e. red. Drawn by Matsumoto, D (2014).
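The claim that subtracting the hue from a colour yields an achromatic gray of the same value can be sketched with Python's standard colorsys module, which implements the HSV colour model (channels in the 0.0–1.0 range): setting saturation to zero leaves the value channel unchanged.

```python
import colorsys

# Desaturate a colour in the HSV model: drop the hue content (s = 0)
# while keeping the value (V) channel unchanged.
def desaturate(rgb):
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    return colorsys.hsv_to_rgb(h, 0.0, v)

# a dull red (V = 0.8) becomes a gray of the same value
print(desaturate((0.8, 0.2, 0.2)))  # (0.8, 0.8, 0.8)
```

The HSV model used here is only one formalisation of the hue/value/saturation triad; Munsell's colour sphere arranges the same three properties differently to keep perceived value consistent across hues.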
Nonetheless, importantly, hues are in fact psychological products of eyes and brains, and actual objects in physical reality are achromatic (Itten 1970). That is to say, a hue is produced not on the actual surface of the material object that intuitively looks to be tinted with it, but in the cognitive process of perceiving the light of a specific frequency reflected from the surface of the object. In this sense, it must be said that it is the frequency of light in the visible spectrum, rather than the perceived hue, that must be defined as the material object to be physiologically seen.
Summing up the discussion so far, it can be said that when works of visual arts are analysed as materials to be physiologically seen, all of the basic formal elements of visual arts can be grasped in terms either of shape or of light. Moreover, while points and lines can be visualised as shapes that are not representational of anything other than the shapes themselves (which may in fact nullify the representational relation), form and texture must be visualised as illusory images created by the visual organisation of shapes and light, and therefore can be visualised only as two-dimensional representations of three-dimensionally visualisable concepts. On the other hand, the three formal properties of light can be physically distinguished into two: hue concerns the quality of light, whereas the other two pertain to the quantity of light. Moreover, importantly, hue, which seems to be affected by the cognitive functions of the observer, merely refers to the variable frequency of light in the visible spectrum when it is grasped in a purely physical sense. Noting that in the total absence of light a work cannot be physiologically seen as anything other than a black mass, light, and especially the value of light, therefore seems to have more ontological immediacy than the shapes that are visualised in any work of visual arts.
There seem to be at least two ways to talk about space in terms of visual arts. First, space can be seen as pertaining to the works of visual arts themselves. Second, there is the matter of space that concerns the correlation of the works of visual arts and the observer in the physical environment. In the following, space will be discussed in regard to the former notion, that is, how space is used in works of two-dimensional visual arts, upon the condition that they are to be physiologically seen. Indeed, this is tantamount to seeing how the elements of visual arts are used in visual art works in terms of space. However, in order to explicate only the objectively spatial characteristics of the elements, design principles of the spatial organisation of elements and the like are not taken into account here. Therefore, in line with the discussion so far, a formalist approach will first be taken in order to explicate the characteristics of space in visual arts materialistically. For this, the mathematical accountability of the spatial characteristics is occasionally taken into account in the following argument.
In two-dimensional works of visual arts, visual images are produced on a flat, two-dimensional planar medium of various shapes and sizes. Although the medium of some works can in reality be three-dimensional in a strict sense, depending on the actual texture of the surface, such works are to be visually but not tactilely perceived, and can therefore be said to pose no immediate conceptual hindrance as two-dimensional materials to be physiologically seen. Irrespective of the actual material of the medium, that is, formalistically, the surface of the medium on which the visual images of a work are produced is called the picture plane. In actuality, a picture plane entails a certain surface quality as well as a definite boundary as a medium.
The basic formal elements of visual arts listed in the previous section can be deployed on a picture plane in multitudinous ways. In general, they can be used to give location, direction, depth, and figure-ground effects to the space of a picture plane (Wallschlaeger and Busic-Snyder 1992). Importantly, these spatial characteristics can assign both two-dimensional and three-dimensional spatial cues to the elements when they are represented on a picture plane.
(1) Two-dimensional Spatial Cues
When a point is created on a picture plane, it signifies its location in relation to the rest of the space on the picture plane. If there are other elements on the same picture plane, a point can signify its location relative to them as well. When a point is moved, in a conceptual way, to form a line, it possesses a directional cue. In representation, the direction of a line depends on the angle, location, and surface quality of the line as a shape. Nevertheless, the directional attribute of a line seems to be merely the result of the psychological ways an observer perceives the line (Arnheim 1974). Therefore, a represented line in an actual work can be said to possess no material direction. Importantly, remembering that either a point or a line must be represented as a shape in an actual work, they take up a certain definite area of space on a picture plane. However, depending on the contour and the relative size and colour of the shape, it can create various psychological effects or perceptual illusions.
Although both location and direction can be used as two-dimensional spatial cues in the above senses, direction seems to be more ambiguous from an objective perspective than location, which can be mathematically defined on a two-dimensional plane by its coordinates. Moreover, the contour of a shape, and the area inside it that the shape takes up on a picture plane, can also be said to be mathematically veritable.
(2) Three-dimensional Spatial Cues
Notably, in actual works of visual arts, both location and direction can be represented as three-dimensional spatial cues as well. That is, when a point is represented in a work as possessing a certain shape, the size of the shape in relation to other elements on a picture plane can create perceptual effects of depth in the image. An effect of depth can also be created with a line, by manipulation of its represented shape, such as an inconstancy of width or of surface quality. Indeed, depth can be created by adequately employing points, lines, shapes, texture, and colour/light, even in a highly abstract work that is not aesthetically representational of a three-dimensional image (Wallschlaeger and Busic-Snyder 1992). For instance, the overlapping of one circle over another can signify that the former circle is spatially in front of the latter on the picture plane. The image thus created by three-dimensional depth cues in a two-dimensional format is called monocular vision. However, this perceived depth is the result of the psychological tendency of an observer to interpret images in their simplest possible context, and can therefore materialistically be said to be an illusion (Arnheim 1974). Accordingly, the figure-ground relationship of elements on a picture plane, which refers to the psychological grouping of elements in relation to the rest of the image, including the picture plane itself, and which can thus be said to be a three-dimensional spatial cue, must be said to be an illusory spatial cue.
When a work is aesthetically representational of a three-dimensional physical image or object, the resulting image can be said to represent binocular vision. Binocular vision refers to the perceptual vision that an observer sees with both eyes in reality, and can be represented in a work by using orthographic, oblique, or perspective projection methods (Wallschlaeger and Busic-Snyder 1992). Each of these projection methods is used to create three-dimensional illusory images in a two-dimensional work by projecting three-dimensional spatial cues onto the image.
In conclusion, it can be said that most of the spatial cues used in two-dimensional visual arts are, by and large, illusory effects created by the manipulation of visual elements on a picture plane. They must therefore be discarded as objective spatial cues, in that illusions can indeed be said to be inherently subjective experiences (Gregory 1987: 337). Taking into account the discussion provided so far, then, only the location, size, area, and contour configuration of shapes on a picture plane seem to be the objectively veritable spatial cues in actual works of visual arts.
It was already stated earlier that music is a sheerly temporal art, possessing no physiologically visible element among its constituents. Nonetheless, discourse on the material instruments that produce musical sounds, or on the physical environment in which the sounds are altered, is often taken into account in discussions of music, and is treated extensively in the various ramified fields of study related to music. Moreover, as much as visual arts, music is often regarded in terms of its cultural aspects, which show multitudinous diversity in expressive characteristics among different socio-cultural groups, and which inevitably give rise to inclusive discussions of specific types of music and their cognition in relation to individual humanistic background analyses. Given the diverse, and often subjectively intertwined, perspectives in the study of music, this subchapter, in order eventually to elucidate the materialist characteristics of music as a whole for comparison with those of visual arts, starts its argument by looking at formal characteristics of music that are not bound by semiotic, and thus conceptually literal, discreteness. For this, it will essentially focus on the sounds of music alone, for the sake of a materialist discussion of the elements and temporal organisations of music. Therefore, in accordance with the analysis of visual arts in the previous section, it is first required to show how phenomenological constructs can be jettisoned from music by examining its representational accounts.
Remarkably, music has often been regarded as a purely non-representational art form. That is to say, in contrast to visual arts, music lacks the “two-tiered structure of reference characteristic of words and images” (Cox 2011: 148). Nevertheless, this stereotypical conception of music as a non-representational art form seems to be the result of the prioritisation of vision over the other senses, as the representational characteristics of music are often accounted for in direct comparison with those of visual semiotics, which results in the illogical subordination of music to the semiotic field of vision. Notwithstanding the general conception of its representational accounts, however, it is said that music has its own world of representation (Cox 2011). Considering previous studies on the cognition of music (Gregory 1987, Nattiez 1990), the ways representational concepts arise in the world of music can be mainly stated as the following three:
1. auditory representations of concepts in musical elements and organisations;
2. visual representations of organizational information of musical pieces;
3. kinaesthetic associations of sounds and their physical sources.
In the first way can be seen the representations of various literal or poetic concepts, which require a literal perspective for the analysis of the cognition of musical works that incorporate words, onomatopoeias, or auditory codes whose meanings are empirically acquired and may show a multiplicity of interpretations among individual listeners or socio-cultural groups. Furthermore, some organisational systems of music, such as melodic phrasing or intonation, thematic organisation, or tonal modulation, can also function as a musical grammar analogous to that of language (Gregory 1987: 502-4).
Secondly, the mathematical abstractness of the elements of music has prompted successive generations of musicians to systematically visualise sounds, both for the composition of musical works and because of the inherent necessity for the works to be played after their creation in order for listeners to appreciate them. Problematically, the visualisations of the information of musical sounds, such as those used in western traditional formalist notation systems, or more recently in various computer programs for digital audio workstations (DAW), involve spatial visual representations of musical sounds that cannot necessarily be said to accord with their cognitive uniqueness. Summing up, the first two representational elements must be said to be irrelevant to the objective of the thesis.
Nevertheless, the third representational way of music seems to be the most controversial of all, in that it holds the very key to maintaining the materialist standpoint towards music. To restate the definition of music laid down earlier: music, as it physically consists only of audible sounds, does not possess a physiologically visible element, and is thus often regarded as an immaterial, purely formal and abstract art form. For this reason, the sounds of music, in conventional music analyses and also in the study of acoustics, are often accounted for by the sources of sounds, such as musical instruments or various phenomena in nature, or, collaterally, by the physical environment, such as concert halls or the air quality of a space, that affects the resulting sounds. In other words, traditionally, sounds have long been regarded as secondary qualities of the visible objects that have caused them. However, the association of sounds and their sources, which is in fact influenced by the kinaesthetic perception of sounds in a given physical environment, consequently results in semiotic relationships between them, which are usually empirically created, and will thus compromise the objective analysis of musical sounds by introducing subjectively discrete conceptual or spatial information into the cognition of the works. Indeed, apart from the Romantic influence, it seems to be the traditional prioritisation of vision over the other senses that has hindered one from regarding as material something that cannot be perceived visually (Cox 2011, Pallasmaa 2005). In order to exclude any semiotic, and therefore conceptually literal, attachments from music, it must be the sounds of music per se that are exclusively focused on as the objective elements of music.
Indeed, this very notion of sounds separated from their origins has attracted numbers of musicians and musicologists since the mid-20th century, under the name of “acousmatics.” Importantly, although the theoretical approach of acousmatics has in fact been influenced by the idealist language of phenomenology (Cox 2011, Schaeffer 2004), acousmatic sounds, which are regarded as materials distinct from their sources, allow one to conceive of sounds from a materialist perspective, by treating each sound as a sonorous object (Cox 2011). Moreover, the notion of acousmatics, by filling the gap between the traditional formal analysis of musical sounds in terms of listening subjects and the physical analysis of sounds in acoustics, which pays high regard to the sources of sounds but barely any to the listening subjects, seems best suited to the objective analysis of musical sounds in terms of their cognition. In line with the idea of acousmatics, therefore, the following discussion on the elements of music will be conducted on those formal elements that seem to be physiologically veritable and, at the same time, not empirically diverse.
Here, before proceeding to the next part, it must be made clear that what this thesis defines as musical sounds may differ from the common notion of them that one may encounter in traditional musical works. That is to say, as the sounds utilised in works of music in the contemporary era are enormously diverse, as can be seen exemplarily in John Cage’s famous work 4′33″, which exploits the environmental noises that are created by chance each time the piece is “performed,” the matter of what can be called “musical” sounds seems highly ambiguous in this era. For this, making use of the notion of acousmatic sounds, and especially that of musical value (Schaeffer, cited in Hodgkinson 1987), the thesis defines any sound tout court that is intended to be part of a musical work as a musical sound. In this line, any extra-contextual auditory information must be ignored in the following argument.
Among the miscellaneous literature on music, be it an introductory book for novice musicians or a musicological book on music analysis, a variety of elements of music can be found. Nevertheless, some of the oft-mentioned elements of music, for instance the chromatic tonal system that is said to have historical importance in western classical music, are in fact the products of culturally shaped systems of music that are specific only to limited circles of ethnomusicological groups or music genres. Moreover, even within the world of western music, along with the historical development of musical theories and the influential philosophies behind them, once indispensable elements of music, such as consonance and dissonance, have lost their unambiguous candidacy as adequately assured elements of music as such. Therefore, in keeping with the discussion in the last section, the following argument focuses only on a few of the conventional formal elements of musical sounds that seem to be physically the fundamental components of any musical work. In doing so, the thesis aims to explicate the materialistically veritable characteristics of the sounds of music by analysing each element in terms of its acousmatic aspect. Importantly, here, the consideration of temporality must be set aside in order to explicate the fundamental materialist elements of musical sounds.
Firstly, by referring to several works on the related topics, the thesis divides the commonly discussed fundamental formal elements of music into two categories: qualitative elements and a magnitudinal element (Justus and Bharucha 2002, Kerman 1996, Levitin 2007, Martineau 2008, Nudds 2007, Schaeffer 1986, Schmidt-Jones 2008). In the first category will be discussed pitch and timbre; in the second, amplitude.
(1) Qualitative elements
It is a widely known fact, at least among physicists and acousticians, that sound is a mechanical wave travelling through the air. Irrespective of the causal source of the wave, which is usually a vibrating material object, this mechanical wave is what is perceived as a sound by the human ear. The rate of the vibration of the wave, which is measured in cycles per second, is called frequency in scientific terms and pitch in musical terms. However, noises, whose vibrations are complex and exhibit no observable periodicity, are said to have no pitch (Martineau 2008). The range of frequency audible to the human ear is said to be approximately 20 to 20,000 cycles per second. A normal sound, such as can be found in the natural environment, vibrates at so-called sinusoidal component frequencies, and consists of a fundamental frequency and several constituent frequencies (Kerman 1996: 10). Constituent frequencies, or partials in scientific terms, that are integer multiples of the fundamental frequency are called harmonics (including the fundamental), and the complex of these frequencies is called the harmonic spectrum (Justus and Bharucha 2002). Constituent frequencies that are higher (in terms of vibration rate) than the fundamental frequency are called overtones, which may or may not be harmonics (Hyper Physics n.d.). The degree of deviation of overtones from integer multiples of the fundamental frequency is described as inharmonicity.
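The relation between a fundamental, its harmonics, and inharmonicity can be sketched numerically. The helpers below are an illustrative simplification assuming frequencies in whole hertz; they are not taken from the cited sources.

```python
# The harmonics of a fundamental are its integer multiples
# (the fundamental itself counts as the first harmonic).
def harmonics(fundamental_hz, n=5):
    return [fundamental_hz * k for k in range(1, n + 1)]

# Inharmonicity, in this simplified sketch, is the deviation of an
# overtone from the nearest integer multiple of the fundamental.
def inharmonicity(overtone_hz, fundamental_hz):
    nearest = round(overtone_hz / fundamental_hz) * fundamental_hz
    return overtone_hz - nearest

print(harmonics(110))           # [110, 220, 330, 440, 550]
print(inharmonicity(445, 110))  # 5 -> slightly sharp of the 4th harmonic
```

An overtone at exactly an integer multiple has zero inharmonicity and is itself a harmonic; non-zero deviations, as in stiff piano strings, make a partial inharmonic.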
As the incorporation of noises in musical compositions in the contemporary era shows, whether to regard a sound with a certain periodic vibration rate as a musical sound or not is in fact a matter of aesthetic choice. Indeed, in contrast to frequency, pitch must be said to be a psychological construct, and its perception is thus affected by any other accompanying pitches (Levitin 2007). Perceptually, pitch is detected through the relationships between the fundamental frequency and the harmonic frequencies. Therefore, the pitch of a sound that consists of only a single frequency is difficult to perceive. Moreover, a study shows that the perception of a specific pitch can be experienced even when the fundamental frequency is removed from its harmonic spectrum (Justus and Bharucha 2002). Therefore, the existence of a variation of discrete pitches in a given musical work can be said to be partly the result of psychoacoustic reactions of the listener, and hence must be regarded as a subjective experience to some extent.
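The “missing fundamental” finding cited above can be given a simplified numeric sketch: when the remaining partials are exact harmonics, their greatest common divisor recovers the absent fundamental. This is an arithmetic illustration only, not a model of auditory perception, and it assumes integer frequencies in hertz.

```python
from math import gcd
from functools import reduce

# The implied fundamental of a set of exact harmonics is their
# greatest common divisor (integer Hz assumed).
def implied_fundamental(partials_hz):
    return reduce(gcd, partials_hz)

# harmonics 2-5 of a 110 Hz tone, with 110 Hz itself absent
print(implied_fundamental([220, 330, 440, 550]))  # 110
```

The common spacing of 110 Hz between adjacent partials carries the same information, which is one reason listeners can still perceive the removed fundamental's pitch.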
The characteristic organisation of the harmonic spectrum is called timbre, a term often used in psychoacoustic studies to describe the quality of sounds produced from different sources, possessing various psychological effects (Levitin 2007). Two sounds with the same pitch may sound different depending on their respective timbres, whose differences are usually ascribed to the musical instruments that produced them. However, whereas timbre as a perceptual phenomenon is the result of psychoacoustic reactions that necessitate a reference to the sources of sounds, as a physical phenomenon it refers merely to variations in the organisation of the harmonic spectra of sounds (see 1.2.2 Time in Music). Moreover, notwithstanding the common view on timbre perception that the determining factor of timbre is the harmonic partials of sounds, some studies suggest effects of inharmonic partials on timbre perception (Jarvelainen, et al. 2001, Galembo, et al. 2004).
Apart from the characteristics of a single pitch, the organisation of several different pitches into systematised collections is also an important meta-element in music, and in western traditional music more particularly. The difference between any two pitches is called an interval, which can be sounded either simultaneously or sequentially in time. Among the various intervals in the world of music, the octave, or variations of it, seems to be widely used among different cultures. The most conspicuous aspect of the octave is that, in a series of pitches whose fundamental frequencies gradually rise, there is a point where a pitch sounds as if it duplicates, although in a different sound quality, another pitch whose fundamental frequency is half of its own. Dividing the interval between the two analogous pitches into seven serial pitches creates a collection of pitches called the diatonic scale, which is used in many musical compositions. Indeed, all the other scale systems are created on the basis of this musical octave, by further splitting the intervals between pitches. Importantly, in this musical octave can be seen the reason for music’s firmly alleged affinity with mathematical abstractness, in that the ratios between two pitches that are supposed to sound consonant when played together, in western classical music and non-western music alike, perfectly accord with mathematical language (Martineau 2008). This mathematical accountability of pitch classes is applied in several visualised geometric models of pitch relationships, such as the circle of fifths or the pitch helix (Justus and Bharucha 2002).
Nevertheless, these interval ratios are adequate only among pitches, or in other words notes, that are tuned according to a certain tuning system, in that the ratio of the actually compared frequencies may vary among different tuning systems. That is to say, the concept of musical interval relationships described in mathematically accountable ratios is veritable only in visually ascertainable physical conditions, such as the lengths of the strings of a musical instrument, or in a visualised geometric configuration of pitches within a unified tuning system. Moreover, from an acoustic perspective, the simultaneous existence of multiple perceived pitches at a given moment does not create multiple sound waves, as what they make up together is a single combined sound wave composed of various frequencies. Therefore, although mathematical accountability seems to ascertain the objectivity of musical intervals, they are in fact relative phenomena among arbitrary physical situations, and are still bound by the prioritisation of visually accountable reality.
Overall, the phenomena of different audible sounds in a musical work seem to be accounted for by the organisation of the frequency complexes of a combined sound wave, which forms the psychoacoustic effects of different pitches and timbres. That is to say, the various audible sounds in a musical work seem to be the result of the psychoacoustic perceptual demarcation of the sound of the work as a whole into discrete sounds of miscellaneous pitches and timbres. In this sense, it is the “sound” of music as a whole, consisting of complexes of miscellaneous frequencies, in contrast to the relational characteristics between “sounds,” that must be considered the materialistically veritable element of musical sound.
(2) Magnitudinal element
The dynamics of musical sound affect not only its aesthetics but also its cognition to a large extent, and thus deserve a legitimate discourse. Notably, musical dynamics can be explained in two functional ways: first as the magnitudinal exponent of the loudness of sound, and second as its timing organisation, which indeed bears more importance than the former. As the characteristics of time in music will be discussed in the coming section, the following briefly examines the first functional aspect of musical dynamics.
The scientific term closely related to musical dynamics is amplitude, which refers to the level of strength of sound vibrations (Kerman 1996). That is to say, in the language of science, or more specifically that of acoustics, amplitude refers to how much energy the source of a sound creates, that is, how much air the sound displaces by its vibrations (Levitin 2007). Nevertheless, as the consideration of the causal influences of a sound is supposed to be irrelevant to the argument here, it is the physiologically audible amplitude of sound, independent of the spatiality that can be caused by the affected area of the air, that must be examined. In this sense, for one thing, the amplitude of a sound becomes a relative physical condition of sound organisation in a musical work, in that its magnitude can only be defined in terms of the specific air pressure through which a sound wave travels, or of a specific loudness settled in advance as a reference point (Levitin 2007). Physically, the amplitude of a sound is measured in decibels, and the range of amplitude physiologically audible to the human ear is called the dynamic range. Secondly, the amplitude of a sound as the object to be heard, or in other words its loudness, must be regarded as a mere psychoacoustic phenomenon, in that its perceived loudness depends on the concrete empirical situations in which the cognition of the sound takes place (Levitin 2007, Schmidt-Jones 2007). Consequently, amplitude as a physical phenomenon, in terms of the magnitudinal exponent of sound, must be regarded as the hierarchical organisation of the frequency configuration of a sound within the dynamic range. In this sense, amplitude can in fact be considered part of the qualitative elements of music.
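The relativity of the decibel measure mentioned above can be made concrete. The sketch below assumes the conventional 20 µPa reference pressure used for sound in air; the point is that the decibel figure is meaningful only relative to such a reference point.

```python
import math

# Conventional threshold-of-hearing reference pressure in air (20 micropascals).
# The dB value is defined only relative to this chosen reference.
P_REF = 20e-6  # pascals

def spl_db(pressure_pa: float) -> float:
    """Sound pressure level (dB SPL) of an RMS pressure given in pascals."""
    return 20 * math.log10(pressure_pa / P_REF)

# At the reference pressure itself the level is 0 dB;
# a tenfold increase in pressure adds 20 dB.
quiet = spl_db(20e-6)    # 0.0 dB
louder = spl_db(200e-6)  # 20.0 dB
```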
Summing up, musical sound, that is, a sound that can be used in any musical work, seems to be veritably analysable only in terms of the holistic frequency complexes of the sound as a whole in a musical work. That is to say, whereas all the formal elements of music introduced above are indispensable considerations when one is to analyse musical works within the peculiar domain of music in a traditional sense, or to compose musical works, the materialistically veritable element of music must be said to be only the frequency complexes of the sound wave that constructs the “sound” of music.
In other words, as for the material object of music cognition, the sounds of music in a given work must be considered as a singular sound instead of plural sounds. Pitch and timbre can be understood as configurational elements of sound waves, and other elements such as intervals and amplitude can be considered relative organisational elements within the sound wave of a musical work with regard to a certain span of time.
In this section, time is discussed in regard to how the elements of music are organised in the temporal construct of musical works. Indeed, the concept of time has been posing difficult problems not only to musicians but also to intellectuals of various fields since ancient times. The reason behind the conundrum of time is the fact that time itself does not possess a physical nature that can explicitly be perceived by any physical sensory organ in any tangible manner. Therefore, the perception of time is often defined as a subjective experience, most peculiarly in Kantian post-critical philosophy. For this, although the ontological and epistemological discussion of time as such is not the direct subject here, in order to hold a materialist perspective, it must at least be said that the temporal organisation of musical works will be examined by discarding any spatial and kinetic physical considerations that inevitably arise from the dominant Einsteinian view of space-time relativity. Hence this section will examine time in music in line with the concept of physical time, that is, time as an objective, one-directional, asymmetrical flow of scalar quantity (Halliwell et al. 1996). Specifically, through an analysis of time in music in terms of the temporal construct of traditional formal elements in musical works, it is aimed to show how a piece of music as a whole can be grasped from a materialist perspective.
Roughly speaking, there seem to be two ways time can be discussed in regard to the sound of music. Firstly, from a macroscopic viewpoint, “sounds” can be organised in a musical work over time as discrete entities. Here, melody and rhythm will be discussed as conspicuous elements in the musical organisation of sounds. Secondly, from a microscopic viewpoint, time is considered as the domain where the transformation of the quality of “sound” occurs. Here, dynamics will be defined as a decisive factor in the qualitative transformation of a sound. In what follows, these two characteristics of time in music are termed (1) sounds in time and (2) sound in time.
(1) Sounds in time
In formal music analysis or conventional music composition, musical elements mentioned in the last subchapter, together with other qualitative elements, are usually treated as discrete entities that organise a musical work in which different sounds are arranged in a complex manner. In what follows, through the examination of the basic ways sounds are organised in musical time, materialistically veritable characteristics of sounds in the temporal domain of music will be explicated.
Firstly, melody can be defined as a series of pitches that forms a melodic line in musical time (Kerman 1996). It is often described visually in a two-dimensional coordinate system, where the horizontal axis signifies time and the vertical axis signifies pitch (Figure 5). Conceptually, a melodic line can be drawn on the coordinates in such a manner that the “height” of a pitch accords with the vertical axis, and time with the horizontal axis. Indeed, melody affects the psychological perception of musical works to a great extent, in that it helps listeners register in their minds “simple qualities of feeling instantly and strongly” (Kerman 1996: 25). Moreover, there can be more than a single melodic line, and combinations of multiple melodic lines are used to form consonant harmony. For this, sets of different pitches that are said to form harmonic perceptual effects are often utilised as chords, especially in western folk music.
Figure 5. A two dimensional coordinate of a melody line. Drawn by Matsumoto, D (2014).
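The time/pitch coordinate system of Figure 5 can be sketched in a minimal illustration (the MIDI-style note numbering below, with A4 = note 69 = 440 Hz, is a common convention assumed here, not part of the thesis):

```python
def note_to_frequency(midi_note: int) -> float:
    """Fundamental frequency, in 12-tone equal temperament,
    of a MIDI-style note number (A4 = note 69 = 440 Hz)."""
    return 440.0 * 2 ** ((midi_note - 69) / 12)

# A melody as (onset time in beats, note number) pairs: the horizontal
# axis is time and the vertical axis pitch "height", as in Figure 5.
melody = [(0, 60), (1, 62), (2, 64), (3, 65), (4, 67)]  # C4 D4 E4 F4 G4

# The melodic line as (time, frequency) coordinates.
melodic_line = [(t, note_to_frequency(n)) for t, n in melody]
```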
However, although in traditional formal musical theories the organisation of harmonic pitches is systematised owing to the mathematical accountability of pitch combinations that are to be perceived as harmonic (Martineau 2008), the perceived phenomenon of “harmony” must be said to be a secondary, and thus empirical, quality of pitches. The use of harmonic pitches is therefore merely an aesthetic choice, not a physical necessity. In truth, there are musical works that incorporate dissonant pitches, or that have no melody at all (Schmidt-Jones 2008). In this sense, melody can in fact be said to be a subjective experience.
The temporal organisational element of sounds that has the most immanent relation to time in musical works is rhythm. Indeed, the word “rhythm” is used not only in music but in many other fields of art as well, usually in a relatively intuitive manner, most peculiarly in visual arts and poetry. In music, a rhythm refers to the arrangement of sounds based on their durations in a particular melody or some other musical passage (Kerman 1996). That is, as every sound is perceptually supposed to have a certain duration, an arrangement of sounds of different durations into certain patterns creates characteristic rhythms in a musical work.
Rhythms are created on the basis of meter. Meter refers to a pattern of sound organisation in a musical work, as in the case of a rhythm. However, whereas a rhythm consists of a combination of sounds of different durations, meter is based on the repetition of a periodic strong/weak pattern of beats, the beat being the basic unit for measuring time in music (Kerman 1996). In major western music theories there are several kinds of meters, such as duple meter or triple meter, which are called simple meters, and compound meters, which are subdivisions of simple meters (Schmidt-Jones 2008). For instance, a duple meter is made of the repetition of a ONE/TWO, ONE/TWO beat pattern, where the ONEs sound stronger than the TWOs, or in other words, the ONEs are accented against the TWOs. Importantly, meter does not require every beat that constitutes its pattern to have an audible sound, as a meter in music is supposed to be felt by the listeners, with the help of the rhythms that override it, but not necessarily to be actually heard. Nevertheless, meter is not a prerequisite element when sounds are organised to form a musical work, as some musical works have no meter, meaning there are no recognisable periodic strong/weak patterns or pulses to be felt (Schmidt-Jones 2008). Moreover, a rhythm can be displaced from the underlying strong/weak pattern of the meter, which is called syncopation in musical terms. Syncopation is a key factor in giving a musical work unique rhythmic patterns, and a variety of peculiar forms of syncopation can be found in the music of various ethnomusicological groups and different musical genres.
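The periodic strong/weak patterns described above can be sketched as follows; this is a minimal illustration of accent repetition, not a notation or analysis tool:

```python
def meter_pattern(beats_per_measure: int, measures: int) -> list:
    """Strong/weak accent pattern of a simple meter: the first beat of
    each measure is accented (ONE), the remaining beats are weak."""
    pattern = []
    for _ in range(measures):
        pattern.append("ONE")                                # accented beat
        pattern.extend("two" for _ in range(beats_per_measure - 1))  # weak beats
    return pattern

duple = meter_pattern(2, 2)   # ONE/two, ONE/two
triple = meter_pattern(3, 1)  # ONE/two/two
```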
Lastly, the tempo of a musical work refers to the speed at which the sounds of the work are organised. Notably, whereas rhythm and meter refer to the relative durations of sounds and of the beats on which different magnitudinal classes of sounds are placed inside musical works, tempo is used as an absolute measurement of the duration of sounds in relation to physical time (Kerman 1996). Tempo is usually measured quantitatively in beats per minute, where a beat is counted by a quarter note, which has a quarter of the duration of the sound termed a whole note in commonly used western notation systems. Tempo can be specified using either a metronome marking or other verbal markings, the latter of which are rather conceptual and show empirical variation compared with the former. For the former, a device called a metronome, which counts beats per minute, is generally used, while for the latter verbal descriptions are common, mostly for the sake of the performance of musical works, in order to signify a specific tempo. Eminently, tempo need not be constant throughout a musical work. That is to say, some musical works have an irregular tempo, gradually getting faster or slower, or even holding sounds for a certain duration or inserting a rest in the middle of the temporal flow of the work.
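The relation between tempo and physical time described above can be sketched as a small calculation, assuming the convention stated in the text that the beat is counted by a quarter note:

```python
def beat_duration(bpm: float) -> float:
    """Duration in seconds of one beat (a quarter note) at a metronome tempo."""
    return 60.0 / bpm

def note_duration(bpm: float, fraction: float) -> float:
    """Duration of a note given as a fraction of a whole note
    (a quarter note is fraction = 1/4)."""
    whole = 4 * beat_duration(bpm)  # a whole note spans four quarter-note beats
    return whole * fraction

# At 120 beats per minute a quarter note lasts half a second
# and a whole note lasts two seconds.
quarter = note_duration(120, 1 / 4)
whole = note_duration(120, 1.0)
```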
Overall, the temporal organisation of sounds in a musical work can be said to be based on relative demarcations of time into which sounds of certain durations are placed. Melody must be regarded as the organisation of sounds into a perceptually recognisable flow of sounds, which is materialistically not separated from any other sounds that exist along with it. As for the durational elements of sounds in musical works, rhythm and meter seem to be mere perceptual phenomena that are recognised by listeners only by the dint of the placement and quality of various sounds in the temporal flow of a musical work. Furthermore, although tempo seems to be a useful measurement tool of sound organisation, behaving like a temporal grid in a musical work, it is in fact also a relative element, in that its pace can change within a musical work only in relation to physical time, and therefore it cannot be regarded as a definite element of musical time.
However, it must be remembered that, from a materialist perspective, the different sounds existing in a musical work at a given moment are in fact perceptual phenomena extracted from a single sound wave, which is a compound of multitudinous frequencies. Indeed, this perspective can be applied in the temporal domain of music as well, in that the various durations of discrete sounds are in fact perceptual constructs made out of an actual sound, which does not possess any separate durations as a physical property (Kerman 1996). That is to say, for instance, even when all the sounds in a musical work are held at rest by a silence inserted in the middle, the perceived silence must be considered as a sound of no discernible frequency (as practically there will usually be background noise and the like) placed in the temporal organisation of the musical work. In this sense, in terms of the sounds placed into a musical work, musical sounds must be regarded as the changing configuration of a singular sound of the musical work that continuously changes over time.
(2) Sound in time
Dynamics, briefly introduced in the last section, is related to the timing organisation of sounds in a musical work as well, as has already been mentioned. However, changes of dynamics are also grasped in the sound of music itself, as it gradually becomes louder or softer over certain spans of time in various manners. As musical notational signs, the former is described as crescendo, the latter as decrescendo or diminuendo. This dynamic change of loudness can also be applied to sounds as discrete entities, which must be regarded as the configuration of frequencies of the sound wave of a musical work as a whole.
The dynamics of a sound over a certain duration have germane relationships with its timbre and amplitude, in that the temporal pattern of its sound wave is a decisive aspect of both elements. For a sound whose wave consists of more than one frequency, the amplitude of the sound changes over time according to the changing air pressure caused by the complex of frequencies of the vibrating sound wave. That is to say, because of the different frequencies that construct the vibration of a sound wave, a sound travels from its source by constantly changing the pressure of the surrounding air by the dint of complex vibrations, and that consequently gives rise to the characteristic timbre that is to be perceived by a listener. For this reason, sound waves are sometimes called pressure waves, where amplitude is described as atmospheric pressure, as is commonly shown in visualised frequency analysis (Figure 6).
Figure 6. Time to amplitude plot of single frequency sound wave. Drawn by Matsumoto, D (2014).
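The compound wave described above can be sketched numerically: a minimal illustration (the particular partials are arbitrary assumptions) of how several frequency components sum into one wave, rather than existing as separate waves.

```python
import math

def complex_wave(t: float, partials) -> float:
    """Instantaneous amplitude of a sound wave built by summing
    (frequency in Hz, amplitude) partials at time t in seconds."""
    return sum(a * math.sin(2 * math.pi * f * t) for f, a in partials)

# A 220 Hz fundamental with two weaker harmonics: the three components
# combine into a single compound wave, not three separate ones.
partials = [(220.0, 1.0), (440.0, 0.5), (660.0, 0.25)]

# One second of the compound wave sampled at 8 kHz.
samples = [complex_wave(n / 8000, partials) for n in range(8000)]
```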
Here, importantly, the perception of the timbre of a sound is ascribed to the characteristic shape of the envelope of its sound wave, meaning the outlining contour of the extremes of the wave's fluctuating amplitudes (Figure 7). In the envelope of a periodic sound wave, four conspicuous changes of amplitude can be found that characterise the timbre of the sound, namely attack, decay, sustain, and release, as can be seen in Figure 7.
Figure 7. Time to amplitude plot of complex frequencies sound wave. Drawn by Matsumoto, D (2014).
Finally, the periodic change of the pitch of a sound over a certain duration is called vibrato, and is a decisive factor in characterising the timbre of the sound. Vibrato results from the modulation of either the frequency or the amplitude of a sound wave, which are respectively called FM (frequency modulation) and AM (amplitude modulation) (Hyper Physics) (Figure 8).
Figure 8. Time to amplitude plots of FM and AM vibrato. Drawn by Matsumoto, D (2014).
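The two modulation types of Figure 8 can be sketched as sample formulas; the carrier frequency, vibrato rate, and depth below are arbitrary illustrative values.

```python
import math

TWO_PI = 2 * math.pi

def fm_sample(t: float, carrier=440.0, rate=6.0, depth=8.0) -> float:
    """FM vibrato: the carrier's instantaneous frequency oscillates
    around 440 Hz at `rate` Hz by `depth` Hz (phase-modulation form)."""
    return math.sin(TWO_PI * carrier * t
                    + (depth / rate) * math.sin(TWO_PI * rate * t))

def am_sample(t: float, carrier=440.0, rate=6.0, depth=0.3) -> float:
    """AM vibrato (tremolo): the carrier's amplitude oscillates
    between 1 - depth and 1 + depth at `rate` Hz."""
    return (1.0 + depth * math.sin(TWO_PI * rate * t)) * math.sin(TWO_PI * carrier * t)
```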
Importantly, in spectrum analysis, although it is useful to analyse the dynamics of sounds using visualised forms of sound waves, the visualised shapes of sounds must not be mistaken for the nature of sounds as such, in that sound waves are longitudinal waves that materialistically do not possess any dimensional aspect in spatial terms. In this sense, it is only when the physical environment, or more specifically atmospheric pressure, is taken into consideration that sounds can be grasped as waves in a physically explicable sense.
On the other hand, another commonly used way to analyse the dynamics of sounds in scientific studies is Fourier analysis. From a technical viewpoint, whereas the time-domain waveform shown in spectrum analysis corresponds to what is stored directly in .wav files, Fourier-related transforms underlie compressed formats such as .mp3, which are more widely used today than .wav files. In a graph that uses Fourier analysis, a sound wave is decomposed into each single frequency versus its individual amplitude at a given moment (Figure 9).
Figure 9. Frequency to amplitude plot of a complex frequencies sound wave. Drawn by Matsumoto, D (2014).
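The frequency-versus-amplitude decomposition of Figure 9 can be sketched with a naive discrete Fourier transform. This is an O(N²) textbook form, chosen for clarity rather than efficiency; the two-tone test signal is an arbitrary assumption.

```python
import cmath
import math

def dft(samples):
    """Naive discrete Fourier transform: decomposes a sampled wave into
    one complex amplitude per frequency bin, as in a Fourier analysis plot."""
    n = len(samples)
    return [sum(x * cmath.exp(-2j * cmath.pi * k * i / n)
                for i, x in enumerate(samples))
            for k in range(n)]

# A two-tone test signal over a 64-sample window: bins 3 and 5 carry the
# two component frequencies; every other bin's magnitude is near zero.
N = 64
signal = [math.sin(2 * math.pi * 3 * i / N) + 0.5 * math.sin(2 * math.pi * 5 * i / N)
          for i in range(N)]
spectrum = [abs(c) / (N / 2) for c in dft(signal)]  # normalised magnitudes
```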
As the auditory information that one ultimately receives as sound is the constituent spectra that make up a sound, Fourier analysis seems better suited to the analysis of the sound of music as a material object to be physiologically heard (Huggins 1999). Moreover, the spectrogram, a visualised analysis tool that can be said to combine the two analysis methods already mentioned by charting the amplitudes of decomposed frequencies over time, is widely used today in acoustic studies. Nonetheless, like spectrum analysis, both Fourier analysis and the spectrogram rely on physically relative indices of frequency-versus-amplitude charting, and therefore risk jeopardising the objectivity of regarding the sound of music per se as the material object, free of consideration of any causal influence that affects the sound.
To sum up, while a formal musical analysis sees music as consisting of plural sounds constructing the sounds in music, the materialist standpoint regards it as made of a single sound wave whose configuration constantly changes over time. However, to regard sound as a sound wave that travels through the air to reach the human ear seems to presuppose that atmospheric pressure be taken into account, and its focus will therefore still coincide with that of conventional acoustic studies. Therefore, recalling the discussion in the last subchapter, it can be said that if sounds as perceptual phenomena are devoid of any accompanying consideration of the physical environment, which gives rise to perceptually relative loudness as well as to the psychoacoustic perception of timbre and the demarcation of different sounds, sounds become a continuous stimulus that resides neither externally in the physical world nor internally in the mind of a listener.
Finally, at the end of this chapter, an examination of the correspondences that can be observed between visual arts and music is provided. Indeed, the extraction of materialist characteristics from formalist ones, which was the main concern of the previous discussions, is an inevitable precondition for visual arts and music, the former more particularly, to part from the contemporary dominance of content/form dualism, or what can be named representationalism. In fact, when many artists and musicians started to experiment with the crisscrossing of visual arts and music, what they aspired to most earnestly was to rid their works of representational aspects by means of formalist abstraction. Nevertheless, in the light of the previous argument, whereas the materialist characteristics of visual arts and music can be said to bear the utmost importance in a cognitive discussion, formal characteristics seem indispensable only for categorical analyses that are to be laid down verbally, or as an expedient language for the creation of works. In the following, through the disposal of the formalist perspective, the lack of legitimate correspondences between the materialist characteristics of visual arts and music is suggested.
(1) Formalist Comparison
Indeed, apart from materialist ones, correspondences of formal elements between visual arts and music have been rigorously analysed and theorised by quite a few artists, musicians, intellectuals, and philosophers in western history, particularly in the modernist era of the 20th century. In the literature, frequently proposed comparisons tend to be colour/pitch, colour/timbre, or shape/sound configuration (Cristia 2011, Kandinsky 1977, 1979, Vergo 2005, 2010). Nevertheless, as broad variations can in fact be seen in how different artists and/or theoreticians recognise correspondences between the elements of the two art forms, the criteria for recognising the correspondences seem ambiguous.
Moreover, as can be observed in the works of those artists or musicians who strived to unite both art forms in the modernist era, most of the elements tend to be rendered in highly abstract shapes (for visual arts) or utterly instrumental, or formal, sounds (for music). In other words, for visual arts to be compared to music, the elements of both have tended to be incorporated in their most abstractly conceived forms (Cristia 2011). What lies behind this penchant toward abstraction is the very notion of representation. That is to say, as a reaction against the era of Romantic relationships between visual arts and music, the visual artists and musicians of the succeeding era, painters more particularly, tried to rid their works of representational concepts by using entirely abstract, semantically non-representational elements.
Problematically, this predilection of modern artists for jettisoning the contents of works as against their forms, by exploiting abstract shapes or forms in an attempt to discard the idea of art as the representation of an outer reality, is in fact still bound by subjective concepts of aesthetic representation. That is to say, the aesthetic representation of formalist concepts as such seems to remain as subjectively variable as the semantic representation of outer reality, in that the formalist elements are themselves based on empirical categorisations, or in other words interpretations (Sontag 1961), of the elements to be used in the works.
Furthermore, the very word “formal,” which binds together art forms of different spatiotemporal domains, intuitively arouses a spatial concept, and is quite baffling when one is to juxtapose visual arts and music. In this sense, the formalist perspective toward music can be said to be magnetised toward the spatial domain of visual arts, by the dint of the psychological perception and semiotic categorisation of discrete elements in works of music, which allow them to be spatially conceptualised and systematised, or to be interpreted by musicologists or performers. Therefore, the formalist perspective, although it remains a dominant way of analysing and creating works of both visual arts and music, seems in no way suited for an objective comparison of the two art forms.
(2) Materialist Comparison
To part from the dominance of vision over the other senses, the materialist perspective must be said to be best suited. That is to say, whereas the formalist perspective cannot be set free from the original/representation dualism, the materialist perspective enables one, by focusing only on the physiologically veritable material object, to see works of visual arts and music as the “presentation” of material objects, rather than the representation of formal conceptual elements (Cox 2011, Deleuze 2003). In this sense, drawing on the previous discussions, works of visual art must be grasped as the presentation of light, and those of music as the presentation of sound. However, this rather Deleuzian view of art works, or of paintings more particularly (Lotz 2009), may attract some criticisms in that, for instance, to accept a painting as a “presence” inevitably implies the materiality of the work “in” the painting, and is therefore still representational “of the painting” as against other objects which are not “the” painting (Lotz 2009). Nevertheless, as already mentioned, this line of criticism is reasonable only when the process of the creation of works is taken into account, during which discrete formal elements and the specific space (or time) of a work are arbitrarily determined by the creator. Thus, for the observer, works can still be seen as the presentation of light (or sound) whose boundary is determined a priori.
This may lead, in regard to the objective of the thesis, to an examination of how it is possible to cognitively visualise sound by creating a presentation of light. Indeed, light and sound have historically been associated with each other in attempts to visualise musical sound (Vergo 2010). According to the material characteristics of visual arts and music explicated in the last sections, it seems that both light and sound are defined by the qualitative and the quantitative, or magnitudinal, properties of their waveforms. First, the qualitative property of light gives rise to the perception of hues, whereas that of sound germinates pitches and timbres. Second, the quantitative property of light is tied to value and saturation, whereas that of sound affects the perceived loudness of the sound, and its dynamics when a certain temporal span is taken into account. In this sense, it seems that physical correspondences between light and sound can be more or less recognised.
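The disparity underlying this apparent correspondence can be quantified in a minimal sketch. The constants below are approximate textbook values (assumptions, not figures from the thesis): even where both phenomena are treated as waves, their speeds, frequencies, and wavelengths lie many orders of magnitude apart.

```python
# Approximate textbook values (assumptions for illustration only).
SPEED_OF_SOUND = 343.0   # m/s, in air at roughly 20 degrees Celsius
SPEED_OF_LIGHT = 3.0e8   # m/s, in vacuum

AUDIBLE_HZ = (20.0, 20_000.0)   # rough range of human hearing
VISIBLE_HZ = (4.3e14, 7.5e14)   # rough range of visible light

def wavelength(speed: float, frequency: float) -> float:
    """Wavelength = propagation speed / frequency."""
    return speed / frequency

# A 440 Hz tone in air has a wavelength of roughly 0.78 m, while green
# light near 5.6e14 Hz has a wavelength of roughly 540 nm.
sound_wl = wavelength(SPEED_OF_SOUND, 440.0)
light_wl = wavelength(SPEED_OF_LIGHT, 5.6e14)
```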
However, considering the material examination of visual arts and music in terms of space and time, the association of sound and light seems to face an unsurpassable impasse. Indeed, there is a fundamental difference between light and sound as material objects to be physiologically seen or heard by a human being, when quotidian space-time experience is considered. Following the common notion of their nature, a conceptual correspondence can still be recognised in the material characteristics of light and sound. That is to say, by regarding light as electromagnetic waves and sound as mechanical waves, a physical correspondence is usually drawn in terms of their nature as being transmitted, or travelling, through the atmosphere as waveforms. Nonetheless, it has already been mentioned that, although sound is conceived as a mechanical wave in studies of physics or acoustics, it is not a transverse but a longitudinal wave. Hence, it does not materialistically accord with a visually and statically describable two-dimensional wave. Moreover, it can be conceived as a wave only when atmospheric pressure is taken into account. On the other hand, light is traditionally regarded as electromagnetic radiation that “travels” through the air as waves. In this sense, it can be grasped as waves only when the temporal dimension is taken into account, and whether light consists of waves or not is still a highly controversial topic in the field of physics (Lamb 1995, Loudon 2003, Zajonc 2003). Besides, even if one accepts light as a wave, it travels from the light source as a transverse, in contrast to a longitudinal, wave (Sultzbaugh 2009).
Ergo, one of the most frequent “objective,” or scientific, approaches to visualising sound, which is based on physical frequency correspondences between light waves and sound waves (Sultzbaugh 2009), must be said to be inadequate for the role of uniting light and sound, in that it must presume the nature of light and sound as waves in order to compare them adequately.
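The kind of frequency correspondence criticised here can be made concrete. A sketch of the octave-doubling mapping sometimes invoked by such approaches: an audible frequency is doubled octave by octave until it lands in the visible band. The band limits are the approximate values assumed above, and the mapping is shown only as an illustration of the approach, not an endorsement of it.

```python
def octaves_to_visible(audio_hz: float,
                       visible_low=4.3e14, visible_high=7.5e14):
    """Double an audible frequency octave by octave until it reaches the
    visible band; returns (octaves, resulting frequency), or None if the
    doubling jumps over the band entirely."""
    f, octaves = audio_hz, 0
    while f < visible_low:
        f *= 2
        octaves += 1
    return (octaves, f) if f <= visible_high else None

# A4 (440 Hz) doubled 40 times lands inside the rough visible range,
# which is the sense in which colour-organ-style mappings pair a tone
# with a colour "some forty octaves up".
result = octaves_to_visible(440.0)
```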
If one is to grasp sound as a sheer material object to be physiologically heard, then sound must be regarded as the trembling of the eardrum of a perceiver. That is to say, sound as a material object, if atmospheric pressure is not taken into consideration, must be said to occur at the very point where the listener's ear receives the succession of trembles of miscellaneous configurations as a continuous sound. Therefore, it seems, the cognitive visualisation of sound in a purely materialistic sense requires light to cause trembles in one's eardrum (Morris and Maisto 2004), in order to be physiologically heard. Nevertheless, to cause trembles of the eardrum, or in a physical sense to change the pressure of air, by means of light seems highly impractical. Indeed, although on the macroscopic level some influence of magnetic fields on atmospheric pressure seems to be recognised (Lam et al. 2013), on the microscopic, or physiological, level it must be said that sound and light are incompatible with each other. In other words, no palpable correspondence between visual arts and music can be seen on the material level.
In historical attempts to relate music to visual arts, or in other words to visualise musical sound as such, there seem to have been two notable approaches (Alves 2005). Firstly, one may try to visualise musical sounds by translating one's own concepts of specific musical works or sounds into visual forms (Kandinsky 1977). However, this approach requires a phenomenological rather than an objective perspective toward both visual arts and music, and is therefore not the primary concern of the argument here. Secondly, one may visualise musical sounds by using objectively drawn comparisons of the physical characteristics of light and sound, as exemplified by numerous devices ranging from traditional colour organs to contemporary sound visualisers (Sultzbaugh 2009). Nevertheless, as the last chapter suggests, this way of visualising sound, which is based on the supposition of a physical correspondence of light and sound as possessing waveforms, is in fact a conceptually re-presented formal correspondence, as sound and light are materialistically incompatible with each other.
It seems that the problem arising in a physical comparison of light and sound is that, owing to the material difference between the two, the latter is required, especially in acoustic or phonetic analyses, to be visualised by having its signals converted to visual frequency signals via a transducer, as in the case of a spectrogram. As a consequence, conversions of audible signals to visible signals, which are usually achieved by versions of the Fourier transform, may indeed fall into the correlational circle of the intersubjective conceptualisation of space and time. In other words, a visualisation of sound by the help of an external transducer is inevitably affected by the conceptual correlation of sound and light, in that a conversion of signals from the temporal to the spatial domain necessitates a conceptual translation of physical time into physical space that is epithetic of the Einsteinian concept of space-time. Therefore, methods of visualising sound that are often used in scientific studies, however objective they seem, are not as objectively created as this thesis aspires to be.
In this regard, to avoid an empirical conceptualisation of time with respect to space, a conversion of sound to light must be assigned internally, on the part of the observer, assuming that this is where a “cognitive” conversion of light and sound takes place. Traditionally, the transducers of sense data, or external stimulation, which in the case above is light or sound, into cognitive information of reality have been located at physiological sensory organs such as the eyes or ears (Gregory 1987: 600, Morris and Maisto 2004). This may necessitate an examination of the physiological functions of sensory organs with regard to visual and auditory cognition respectively. Nevertheless, considering that sensory organs themselves show large variances in their physiological characteristics among individuals (Purves et al. 2001, University of Rochester 2005), focusing on physiological organs alone does not seem to suffice.
Here, by referring to the accounts of a few contemporary philosophers (Brassier 2007: 31, Churchland 1989), the thesis supposes that human brains, or more specifically neurobiological processes in the cerebral cortex, hold the key to approaching a cognitive visualisation of musical sounds as such from an objective viewpoint. In truth, recent developments in neuroscience have started to provide hints for unravelling the enigma of human cognition that scientists and intellectuals of other fields alike have been earnestly tackling. Furthermore, it seems already to have been alluded to in traditional biological studies of human cognition that the neural activities of human brains, apart from more empirical psychological influences, possess intrinsic, and significant, control over the complicated functions of human cognition (Gregory 1987, Hecht 2014, Morris and Maisto 2004). It therefore seems important to focus on the conversion of sensory signals between different sense modalities that can possibly occur in the neural activities of human brains, as a transducer, in contrast to dominant psychological theories of human cognition.
Notably, as the eventual aim of the thesis is to seek the plausibility of visualising musical sounds in graphic images that are to be cognitively “heard” by viewers, irrespective of the creating processes, the direction of the conversion, in a metaphoric comparison to that of physical ones, must be from visual to auditory. Hence, attention must now be cast upon how visual stimulation can cause auditory cognitive experiences in the observer. Specifically, the argument will focus on the correspondences between human cognitions of visual arts and music respectively, by considering the possibility of ascertaining an intersection of visual and auditory cognitions that can be observed in the neural activities of human brains. In order to do so, the subchapter that immediately follows will first examine at which stage of the cognitive process an objective connection between visual arts and music can adequately be said to reside. The chapter eventually aims to suggest possible ways of cognitively visualising musical sounds, or in other words, of hearing sounds by looking at visual images, in opposition to conceptual or physical representations of them.
The way one appreciates an artwork in a quotidian sense, whether it be a work of visual arts or of music, involves various intricate corporeal and mental phenomena that give rise to subjective differences. Therefore, it is crucial to unravel the complication of subjective/objective accounts as far as possible. Indeed, the word “cognitive” can be largely misleading when one attempts to liberate a cognitive event from empirical intricacies. The primary way for a human being to know the world as reality has traditionally been boiled down to sensation, perception, and conception (Gregory 1987: 598-601). Inevitably, these epistemological strata have often been the subjects of rigorous philosophical and also scientific inquiry since ancient times. In the appreciation of artworks as well, these three modes of knowing are closely tied to each other to provide miscellaneous experiences of appreciating even the same piece of artwork, depending on both physical and mental situational contexts. Controversially, in questioning how one comes to know the world he/she lives in, the dominant view on human cognition in the contemporary western world has been overtaken by idealistic or anti-realistic perspectives (Brassier 2007, Briant et al 2011, Meillassoux 2008). Being no exception, the notion of objectivity in the art world seems to be largely demolished or ignored in the contemporary era. In an eventual attempt to oppose this, in what follows, brief explanations of sensation, perception, and conception will first be provided, in order to define the cognitive realm in which the thesis will try to locate an objective connection between visual arts and music.
Among the three strata of the human cognitive process, sensation can be said to be the most rudimentary of all. Normally, sensation is examined with reference to sensory organs, and for this reason studies on sensation are often accounted for from a biological perspective. Having been predominantly conducted in the field of sensory psychology until recently, studies of sensation have started to appear in the field of neuroscience as well, with due attention to how neural activities work in terms of sensory experiences. Sensation can be said to be highly responsible for giving rise to pleasure in the appreciation of artworks, not to mention the plethora of other sensory experiences that one encounters in his/her daily life, consciously or unconsciously.
It is said that sensation happens in more or less the same way to different people, in that the neural systems of most human beings (with the exception of a few people with certain physiological conditions) are said to show high similarity (Morris and Maisto 2004). In a certain sense, therefore, sensation can be said to be the level of pure non-intentional presence, without any cultural or organic discretion (Deleuze 2003).
Biologically, as an external stimulus such as light or sound reaches a sensory organ, receptor cells in the organ convert the stimulus into neural signals that eventually reach a specific area of the brain to cause a sensory experience. This process is called transduction. Converted signals from the receptors of each sensory organ activate different areas of the brain, resulting in various kinds of sensory experiences such as vision, hearing, or taste. Notably, although sensation is without doubt a physiological phenomenon, it is often regarded as a psychological phenomenon at the same time. For example, the threshold at which a sensory organ detects the weakest external stimuli is affected not only by the physiological characteristics of the organ, which vary individually, but also by the observer's mental situation (Gardner and Martin 2000, Swets 1961). However, this does not mean that sensation is simply a psychophysical phenomenon; it must instead be understood as an ineffable neurological phenomenon. In other words, sensation must be regarded as a neurological phenomenon in which an observer's reaction towards it, or more specifically his/her perception of it, is affected by the various physical as well as mental influences under which it takes place (Mesulam 1998).
The reason lies in the neurological fact that an external stimulus that consequently gives rise to a sensory experience can be said to be a physical stimulation only until it reaches the receptor of the sensory organ. The stimulus is then converted into neural information at the receptor of the organ and travels through neural pathways, until it reaches the brain to activate a region of the cerebral cortex and cause a sensory experience. This neural information, whichever kind it is, consists of electrochemical signals (Morris and Maisto 2004). Therefore, it must be noted that, in a strictly neurological sense, it is not the external stimuli themselves, such as light or sound, that are “sensed” by the brain, but mere neural information that only the brain can decode. For this reason, sensation must be understood as an utterly ineffable, non-linguistic, and noncognitive (Rockwell 2001) phenomenon, which in reality will sometimes be specified with words, feelings, or emotions, such as sensory experiences of heat, pain, or joy, to name a few, only a posteriori. Hence, it can be said that while a sensation itself is an actual neurological phenomenon, explicitly sensed sensations, or sensory “experiences” that arise in the brain, cannot be defined exactly as sensations themselves. In other words, it can be suggested that pure sensations themselves arise only in the earlier processing of sensory input, until it ultimately reaches the brain to cause cognitive experiences of sensations that are affected by the later processing of the neural information.
Moreover, as sensations are processed not only in sensory receptors but also in the brain, external stimuli are in fact not the only cause of arising sensory experiences. Indeed, the brain alone can cause actual sensory experiences under certain circumstances (Nevid 2009). What this fact shows is that a sensory experience is not necessarily a one-way linear process that runs from sensory organ to cerebral cortex, but is largely influenced by the neural activities of the human brain. Therefore, the demarcation between sensation and perception, the latter often regarded as the adjacent cognitive process of sensory stimuli, seems to be innately blurred. For this reason, it is necessary for sensation to be examined together with perception, which naturally interacts with the former.
Although sensation happens in more or less the same way to different people, perception occurs to people in as many different ways as there are individuals. The primary reason for this is that perception is a result of the interpretation and elaboration of sensation by the individual person, or more specifically, by brains that store miscellaneous experiences, memories, and knowledge of both bodily and mental levels that can differ largely among people (Morris and Maisto 2004). Like sensation, perception had mainly been studied in the field of psychology, especially that of cognitive psychology, until the recent development of neuroscience paved a new path to approaching the topic from a neurobiological perspective. As was said earlier, perception is closely interconnected with sensation, giving rise to the difficulty of utterly separating each from the other.
There have been long-standing controversial discussions as to whether perception is a passive or an active occurrence (Gregory 1987: 599). By regarding perception as an act of giving meaning to sensation, it can be said to passively pick up a specific sensory stimulus in order to interpret it. On the other hand, perception can occur without an actual external stimulus, or sense data, by dint of neural activities in the brain. In this case, perception can be said to actively alter the interpretation of the sensation. In either case, it must be noted that, in opposition to the naïve realist claim (Gibson 1972, Richards 1976), a perceived physical reality does not necessarily accord with the actual physical reality, in that a perception is more or less a mental representation of the actual physical reality that results from the interpretation and alteration of sensory inputs, or, as will be discussed later, from the intervention of conceptual knowledge.
The two conspicuous ways in which interactions between perception and sensation occur are respectively called bottom-up processing and top-down processing by psychologists (Morris and Maisto 2004). Notably, while the former seems to comply with the stimulus-based neurological cognitive process of stimulus-receptor-cortex flow of sensory information, the latter process is said to flow in the opposite direction, and is therefore inevitably affected by some concepts stored in the cerebral cortex such as experience, knowledge, or memory (Morris and Maisto 2004).
In neuroscientific terms, bottom-up processing of perception is often synonymously replaced with the term low-level perception, and top-down processing of perception with high-level perception. The terminological demarcation between low- and high-level perceptions does not mean that they are simply yes-or-no experiences, as the terminology is based on the degree of how low or high a process can be understood to stand in the perceptual hierarchy. The lowest-level perceptions involve the early processing of raw sensory information from the various sense organs; the higher the level of perception, the more it involves the attaching of meaning to sensory information by accessing concepts stored in the brain in order to make sense of the situation; the highest level operates on the processing of complex conceptual representations (Chalmers et al. 1991). Importantly, it is said that lowest-level perceptions can indeed be noncognitive (Raftopoulos 2009). It is for this reason that the border between sensation and perception must be said to be inherently ambiguous, since sensations, or more specifically sensory “experiences,” can in a certain sense be regarded as the lowest-level perceptions of the causal sensory stimuli. Considering the aforementioned epistemic importance of the neural activities of the cerebral cortex in human cognition, lowest-level perceptions seem to be a possible subject of an objective approach toward the examination of sensory as well as perceptual experiences, if their process can be neurologically ascertained as being somewhat fixed. In truth, computational models of perception are often proposed as a somewhat predetermined processing of sensory information into perceptual experiences (Haazebroek and Hommel 2009).
Nonetheless, it is as a matter of fact difficult to expect a purely bottom-up processing of perception to occur in a quotidian situation without any intervention from the “top” of the highly complicated cognitive system. Indeed, it is said that human perception of any given situation is affected by constant top-down influence from the conceptual level (Chalmers et al. 1991). In this regard, it seems necessary to examine the role conception plays in human cognition as well.
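As a purely illustrative toy model (in Python; all names, features, and thresholds here are invented for this sketch and drawn from none of the sources cited), the bottom-up/top-down distinction can be caricatured as a fixed low-level feature extractor whose output is then categorised under the bias of stored conceptual expectations, so that one and the same stimulus can yield different percepts under different stored "concepts":

```python
def low_level(stimulus):
    # Fixed, subject-independent feature extraction (bottom-up):
    # identical stimuli always yield identical features.
    return {"intensity": sum(stimulus) / len(stimulus),
            "contrast": max(stimulus) - min(stimulus)}

def high_level(features, prior):
    # Top-down stage: stored conceptual expectations (the prior)
    # bias how the very same features are categorised.
    score = features["contrast"] + prior.get("expect_edge", 0.0)
    return "edge" if score > 1.0 else "uniform"

stim = [0.2, 0.3, 0.9, 0.8]
feats = low_level(stim)
print(high_level(feats, {}))                    # bottom-up alone: "uniform"
print(high_level(feats, {"expect_edge": 0.5}))  # concept penetrates: "edge"
```

The sketch only dramatises the structural point of the passage above: the low-level stage is deterministic and shared, while the high-level stage is where conceptual "cognitive penetration" enters.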
Apart from sensation and perception, conception takes a largely important part in the human cognitive process. A conception is generally formed over a long span of time, as conceptions are considered to be, in a certain sense, timeless existences (Gregory 1987: 599). In this regard, conception can be said to be a highly subjective and individual experience, and conceptions have thus been the primary subjects of phenomenological research in various fields of study.
A conception, or a concept, is traditionally grasped as an idea mentally abstracted from one's experience, which is accumulated as memory, knowledge, belief, and the like. It is said that top-down perceptions are processed by one's perceptual system rapidly drawing information from only a part of conceptual knowledge (Gregory 1987: 599). The way conceptions affect a perceptual experience in the top-down process of perception is often called “cognitive penetration” in the field of cognitive science (Macpherson 2012). In discussions of cognitive penetration a controversial dichotomy is seen in the content of the resulting perceptual experiences, that is, between conceptual content and non-conceptual content (Byrne 2004, Macpherson 2012, Bermudez 2011). Although the latter term seems somewhat oxymoronic in being non-conceptual, this is because conceptions have usually been treated, illegitimately, as forms of linguistic propositions (Byrne 2004). Therefore it must be said that the opposition of conceptual and non-conceptual content is based not on whether or not there is a concept in the content of one's perceptual experience; the gist of the question is instead whether or not the content of one's perceptual experience can be specified by linguistically describable concepts.
Studies suggest that perceptions can be cognitively penetrated by linguistically ineffable nonconceptual mental states to cause nonconceptual perceptual experiences (Macpherson 2012, Tye 1995). The most notable account of nonconceptual content is the claim that it originates locally at the pre-personal level of one's mental states, which has been determined as functions of organic mechanism along the evolutionary process of a species in its physical environment (Stanford Encyclopaedia of Philosophy, Tye 1995). In this sense, it seems that non-linguistic conceptions that derive from pre-personal concepts, or in other words, from the prior experience of the species (Turvey 1975), have an indispensable influence on high-level perceptual experiences, being largely free of first-person empirical variations (Bermudez 1998, Peacocke 2002).
Also, apart from “what” the contents of formed concepts themselves are, the process of “how” concepts are formed and interact with perceptions in high-level perception has started to be adequately explained in objective terms of the neurological activities of the human cognitive system. For instance, the neural activities of several areas in the cerebral cortex that operate the multimodal convergence of various modality-specific neural information are now neurologically explained in terms of computational models (Mesulam 1998). This may suggest a possible objective approach towards the top-down perceptual process. In this regard, the high-level influence of conception must be taken into account with attention as legitimate as that which can be ascribed to low-level perceptions, which are theoretically largely subject-independent.
In sum, it must be said that an objective approach towards human cognition must be based on all three primary strata of the classically defined cognitive system. Specifically, taking into account the objective of the thesis, that is, a cognitive visualisation of musical sounds in graphic images that one can cognitively hear by looking at them, the perceptual level seems most suitable for locating the intersection of vision and audition, in that it is where looking at works of visual arts as a quotidian event will be cognitively experienced by an observer. For this, considering their innate influence on perception and also the availability of neurological and computational data, the account of the perceptual level must be grounded in the sensory and conceptual levels as well. In this vein, in what follows, both bottom-up and top-down processes of visual and auditory perception will be examined with reference to neurological accounts.
Since this chapter, in the large picture, aims to argue the possibility of cognitively hearing visual arts, the specific neurological process of music cognition, especially that of music perception, will be discussed before that of visual arts. It was elucidated earlier that the basic elements and sound organisations that were introduced in the previous chapter were indeed perceptual phenomena. By attending to these musical elements and the like, the following will attempt to clarify the neurological system of their perception from both bottom-up and top-down directions.
The biological process of the conversion of sound to neural information occurs at several stages (Morris and Maisto 2004). Firstly, the sound wave that enters the ear passes through the ear canal to reach the eardrum, causing it to vibrate. The vibrations are then transmitted to three small bones in the middle ear. These bones, called the hammer, anvil, and stirrup, pass the vibrations to a fluid-filled organ called the cochlea, which stimulates the receptor of sound, the basilar membrane. Finally, the vibrations are converted to neural information at the basilar membrane and travel through the auditory nerves to reach the brain stem, where the neural information from the two ears meets and is then passed to the auditory cortex. To summarise, it can be said that auditory information is linearly transmitted from the basilar membrane to the brain via the auditory nerves (Tramo et al. 2003).
Several studies show the existence of tonotopic mapping, which shows the topological distribution of areas that correspond to specific frequency configurations of sound waves, along almost all major stages of biological neural processing (Tillmann et al. 2003). Notably, tonotopic mappings are found at the basilar membrane (Rhode and Recio 2001), the auditory nerve (Delgutte et al. 2007), the cochlear nucleus in the brain stem (Delgutte et al. 2007, Tramo et al. 2003), and the auditory cortex (Tillmann et al. 2003). This fact suggests the indispensable influence that bottom-up perceptual processing has on the neurobiological extraction of pitches from the frequency configurations of sound waves throughout the entire human cognitive system.
It is shown that different locations of the basilar membrane are activated by different frequencies of sound waves (Tillmann et al. 2003). Moreover, neurons, which can be said to be the units of neural information, respond to different frequencies of sound waves at the basilar membrane, and correspondingly fire neural signals at different rates. In this sense, it can be said that the perceived characteristics of the quality of a sound wave are indeed neurobiologically differentiated already at the receptor of the human ear.
Conspicuously, the phenomenon of missing fundamentals (see 1.2.1) can be accounted for by the neural processing of auditory input at as early a stage as the basilar membrane. In an experiment on the perception of missing fundamentals, the perceptual extraction of a specific pitch from a sound wave whose harmonic spectrum lacks its fundamental frequency was shown to be largely dependent on the other component frequencies of the harmonic spectrum that stand in a harmonic relationship to the frequency of the missing fundamental (Rhode and Recio 2001). Moreover, the same experiment, which was conducted specifically using AM modulation of a sound wave, suggests that at the basilar membrane the envelope of sound waves tends to be automatically corrected to the ideal sinusoidal modulations (Rhode and Recio 2001). These results indicate the role that the basilar membrane plays in the discrimination of the pitch, dynamics, and timbre of sound waves at almost the lowest level of neural processing.
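The missing-fundamental phenomenon itself can be illustrated numerically. The following sketch (Python with NumPy; the autocorrelation approach is a generic periodicity model, not the specific method of Rhode and Recio) synthesises the harmonics of a 200 Hz fundamental while omitting the fundamental itself; a simple autocorrelation nevertheless recovers the 200 Hz periodicity, the "virtual pitch" described above:

```python
import numpy as np

fs = 8000
f0 = 200
t = np.arange(fs) / fs  # one second of samples

# Harmonics 2..4 of a 200 Hz fundamental -- the fundamental itself is absent.
signal = sum(np.sin(2 * np.pi * k * f0 * t) for k in range(2, 5))

# Autocorrelation acts as a periodicity detector: it peaks at the common
# period 1/f0 shared by the harmonics, even though no 200 Hz component exists.
ac = np.correlate(signal, signal, mode='full')[len(signal) - 1:]

lo, hi = fs // 500, fs // 100        # search lags for pitches 100-500 Hz
lag = lo + int(np.argmax(ac[lo:hi]))
estimated_f0 = fs / lag
print(estimated_f0)  # -> 200.0
```

The common period of the 400, 600, and 800 Hz components is 5 ms, so the strongest autocorrelation peak falls at a lag of 40 samples, corresponding to the absent 200 Hz fundamental.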
A similar kind of virtual pitch extraction from the frequency configurations of sound waves occurs at later stages of neurobiological processing as well (Delgutte et al. 2007). Indeed, the bottom-up processing of auditory stimuli passes through extensive mechanisms of neural processing before the signals finally reach the cerebral cortex (Griffiths 2003). These neural mechanisms are effective especially for the processing of complex sound waves. In the neurobiological processing of complex sound waves that include several simultaneously sounded tones, for instance, the neural activities of the auditory system show an automatic cognitive process that determines the perception of the interval relationship between pitches as being consonant or dissonant (Tramo et al. 2003). Adding the aforementioned case of missing fundamentals, then, it can be argued that the qualitative formal elements of music (see 1.2.1) are by and large neurobiologically determined at relatively early stages of the cognitive process.
One of the most significant aspects to be considered next is that auditory signals are processed in temporal as well as spatial domains at many stages of the neural process (Delgutte et al. 2007, Griffiths 2003, Krumhansl and Toiviainen 2003, Tillmann et al. 2003). A study that was conducted on monkey auditory cortex suggests that there are possibly “what” and “where” pathways in the neurobiological cognitive processing of auditory signals in the human auditory cortex as well (Rauschecker and Tian 2000). Although the result of the study is still only suggestive of the existence of such pathways, which are similar to those of the visual cortex, it seems to highlight the existence of separate processing of auditory signals not only in the temporal but also in the spatial domain. It is said that the perception of spatio-temporal auditory cues is processed at earlier stages of the auditory cognitive system, such as at the auditory nerve or the cochlear nucleus in the brain stem (Delgutte et al. 2007).
Nevertheless, it seems to be from the brainstem onward that eminent activity in the spatiotemporal perception of auditory information is shown (Skoe and Kraus 2010). Especially in the cerebral cortex, hemispheric differences are observed regarding the neural processing of auditory information in both spatial and temporal domains (Liegeois-Chauvel et al. 2003). For this, it must be considered here that, as psychologists in the 19th century claimed, all sensory systems convey by and large four types of sensory information: modality, location, intensity, and timing (Rosen and Rosen 2006). Auditory sensation is no exception in this regard, and this claim can explain the spatial perception of auditory signals, such as pitches having a spatial dimension as in musical scales (Liegeois-Chauvel et al. 2003).
In terms of hemispheric differences, neurobiological studies of the cerebral cortex, or specifically of the auditory cortex, show that spatial auditory information is primarily processed in the right hemisphere of the auditory cortex, and temporal auditory information in the left hemisphere (Liegeois-Chauvel et al. 1999, Liegeois-Chauvel et al. 2003). That is to say, the right cerebral hemisphere is more sharply tuned to the spectral information of the frequencies of sound waves, whereas the left cerebral hemisphere shows more sensitivity to the temporal cues of auditory information (Griffiths 2003, Liegeois-Chauvel et al. 1999, Liegeois-Chauvel et al. 2003). Interestingly, it is said that the perception of pitch is primarily specialised in the right hemisphere of the auditory cortex (Zatorre 2003). This may suggest that pitches are neurobiologically processed as spatial information. However, it still seems largely presumptuous to draw a categorisation of the musical elements based on spatial and temporal neurological processes here, considering the paucity of currently available data that show a clear-cut topology of cortical and subcortical areas and their neurological functions. Indeed, other studies show that although pitches seem to be processed as spatial features of auditory stimuli, dynamics, or the temporal structure of each pitch or sound, is neurobiologically processed as a temporal feature (Griffiths 2003).
Nevertheless, it must be noted that the asymmetrical functions of the auditory cortex strongly suggest the separate neural processing of auditory information of spatial and temporal features. Indeed, among the formal elements of music that were mentioned in the last chapter, pitch, timbre, melody, harmony, rhythm, tempo, meter (Parsons 2003), and dynamics (Griffiths 2003) are all differentiated neurobiologically in the brain. However, although the perception of each of them activates different areas of the auditory cortex and its subareas, the areas that are dominantly activated at a given time depend on the focus or attention of the listener. Therefore, it is implausible to utterly separate the musical elements into either spatial or temporal domains in terms of neurobiological processes. For instance, it is said that the temporal coding of pitch relationships can influence the perception of consonance or dissonance in the vertical, or in other words spatial, dimension (Tramo et al. 2003). Moreover, the cortical and subcortical areas that are activated when a listener perceives personally familiar or unfamiliar rhythmic patterns are shown to be different (Parsons 2003). In these terms, two conspicuous cognitive influences on the spatiotemporal perception of auditory stimuli can be suggested. Firstly, the spatial as well as temporal auditory cues that the brain extracts from auditory stimuli are constantly and interchangeably affected by the neural activities of both the localising and the temporal coding of auditory stimuli, resulting in a proprioceptive perception of auditory stimuli. Secondly, the effect of a listener's familiarity with the stimuli can be accounted for by cognitive penetration from the top of the cognitive system.
As for the spatiotemporal perception of auditory stimuli, it is said that although the processing of spatial and temporal features seems to be independently engaged at earlier stages, both are integrated at later stages (Peretz and Kolinsky 1993). A fact to be considered here is that the neural processing of auditory stimuli at the brain stem is not hardwired (Skoe and Kraus 2010) and that proprioceptive stimuli, such as those used for motion detection and the like, pass through the brain stem and also the primary sensory area, where most sensory information first arrives. In this sense, it can be surmised that at later stages of the auditory process, spatial and temporal perceptions of auditory stimuli are flexibly activated under the influence of the more integrated sensory system as a whole.
On the other hand, the top-down influence of perception cannot be neglected at later stages of auditory processing. A study suggests that all auditory perception involves working, or short-term, memory mechanisms (Zatorre 2003). For instance, the spectral features of auditory stimuli seem to engage the right rather than the left hemisphere of the secondary auditory cortical areas not only in the perception of actual physical stimuli but also in the imaginary task of playing well-known sounds in one's mind's ear (Zatorre 2003). Hence, it must be said that the high-level process of mentally recollecting a certain melody actually activates a corresponding cortical area that normally responds to actual sensory stimuli.
It has been shown that lesions in the auditory association cortex of both hemispheres, the right more particularly, cause deficits in the perception of pitches, and also of consonance (Tramo et al. 2003, Zatorre 2003). Importantly, the perception of the roughness of stimuli was shown to be unaffected by lesions of the association area (Tramo et al. 2003). Additionally, the processing of the temporal features of auditory stimuli was shown to be impaired as well as a result of association cortex lesions. Indeed, the association cortex not only of audition but of every other sense modality seems to possess a close connection with memory mechanisms. In the larger picture, it has been shown that lesions in the prefrontal association cortex result in an impairment of motor tasks (Halpern 2003, Newsome 1988), which strongly suggests the influence of auditory short-term memory (STM) on even the unconsciously executed events of one's quotidian life.
Drawing from the above discussion, it can be said that while formal musical elements can be neurologically differentiated at early stages of bottom-up neural processes, deficits in the auditory association area can severely impair their perception. Later stages of the bottom-up neural processing of auditory stimuli seem to be responsible for the spatio-temporal perception of auditory cues. For this, taking into account the fact that perceptual impairments can be neurophysiologically caused by lesions of some cortical areas, the influence of short-term memory on high-level perception can be veritably explained in neurological terms. Notably, the roughness of sound seems to be unaffected by lesions of cortical areas. Therefore, considering that the roughness of sound, which is acoustically explained by the dynamic configuration of the sound wave, entails the perception of sound over a certain temporal duration, it can be surmised that the spatial processing of the basic elements of music (see 1.2.1) is more likely to be affected by higher neural processing in association areas than the temporal processing of the temporal cues of music (see 1.2.2) is.
In line with the last section, the following will focus on the neurological system of visual cognition. In contrast to the auditory cognitive system, human cognition, and perception more particularly, of the visual world has been extensively discussed in the past centuries. In particular, the literature on biological examinations of human eyes as the loci of the visual system is too abundant to be covered here in minute detail. Therefore, in order to draw a comparison of the visual system to the auditory system that was discussed earlier, the focus of the discussion here will mainly be on the neural activities of the human visual system in general.
From a traditional biological perspective, the earliest stages of the visual system are usually explained as follows. When light enters a human eye, it passes serially through the cornea, aqueous humour, pupil, lens, and then the vitreous humour, before it finally reaches the retina (Tombran-Tink and Barnstable 2008). In the retina are situated the receptors of visual stimuli, called photoreceptors, which are divided into rod photoreceptors and cone photoreceptors. Importantly, cone photoreceptors are sensitive to the differences in the wavelengths of light that give rise to chromatic information, while rod photoreceptors show acuity to the luminance level of light. The visual stimuli received by these two types of photoreceptors are transduced into neural information, which travels through the optic nerve to finally reach the brain.
On the other hand, neuroscientific studies of the human visual system claim that there are three major types of neural visual streams starting from the retina that decode a number of attributes of visual information such as colour, size, or edge orientation. They are respectively called the M (magnocellular), P (parvocellular), and K (koniocellular) streams. Evidence supports the view that neural information converted from visual stimuli diverges into the M and P streams at the earliest stage of the bottom-up visual system (Kaplan 2003). Steady evidence as to where the K stream originates in the retina is seemingly yet to be discovered. In general, cells of the M stream are said to be more sensitive to luminance contrast than those of the P stream, while cells of the latter are tuned more to spatial sampling (Kaplan 2003). Therefore, it can be said that luminance and spatial information of the visual world are neurologically processed separately.
Chromatic information of light is primarily sent via the P stream and passed through the lateral geniculate nucleus (LGN) until it reaches the visual cortex (Kaplan 2003). Hence, spatial and chromatic features of visual stimuli are said to travel through the same stream at earlier stages of the visual system. Indeed, many cells in the P stream receive both spatial and chromatic information, although other cells in the P stream receive only one or the other (Kaplan 2003). Moreover, chromatic information is neurologically processed in two opponent groups, a red/green dimension and a blue/yellow dimension, colours that correspond to contrasting hues in their formal description in a colour wheel. Notably, cells that receive chromatic information are not distributed to each stream in an all-or-nothing manner; rather, all the streams seem to have some spectrum-sensitive cells whose sensitivity changes according to the overall luminance level. Indeed, in the K stream can be found some cells that receive spectral information of light (Kaplan 2003), though limited only to light of short wavelengths (blue) (Szmajda et al. 2008). It can therefore be surmised that the K stream is sensitive to the blue/yellow chromatic dimension, and that this dimension is somehow processed independently from red/green in the neural activities of earlier cognitive stages.
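The opponent organisation of the red/green and blue/yellow dimensions described above is often modelled computationally as simple differences of receptor responses. The following minimal Python sketch illustrates the idea only; the function name and the exact weights are illustrative assumptions for the sake of exposition, not values taken from the cited studies.

```python
def opponent_channels(r, g, b):
    """Toy opponent-colour model: map normalised RGB values (0-1)
    onto a red/green value, a blue/yellow value, and an overall
    luminance value. Weights are illustrative, not from the literature."""
    luminance = (r + g + b) / 3.0        # achromatic (luminance) channel
    red_green = r - g                    # positive = reddish, negative = greenish
    blue_yellow = b - (r + g) / 2.0      # positive = bluish, negative = yellowish
    return red_green, blue_yellow, luminance

# Pure yellow (red + green) drives the blue/yellow channel fully negative,
# while leaving the red/green channel balanced at zero.
rg, by, lum = opponent_channels(1.0, 1.0, 0.0)
print(rg, by)  # 0.0 -1.0
```

In such a scheme the two chromatic dimensions are computed independently of each other and of luminance, which loosely mirrors the separate neural handling of the red/green and blue/yellow dimensions discussed above.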
The “what” and “where” pathways mentioned in the last section have been more extensively studied in the visual system. It is said that the above-mentioned three types of streams provide the initial input to the what and where pathways (Ungerleider and Pasternak 2003, Zeki and Shipp 1988). Conspicuously, the P stream is said to feed the what pathway and the M stream the where pathway, corresponding to the suggestion that cells of the P stream are primarily dedicated to the perception of form and colour, and those of the M stream to the detection of motion and luminance (Kaplan 2003). In this sense, it can be said that the materialistic elements of visual arts (see 1.1), as well as the chromatic formal description of light, are by and large processed separately at earlier stages of the bottom-up visual system.
Nevertheless, separating the functions of the three pathways is adequate only if the stimuli can be said to directly create the cognised image of the visual world. In fact, the information sent through the streams is ultimately received by subcortical and cortical areas, and is hence inevitably processed further in a complex and nonlinear manner. Specifically, all the neural information sent from the retina is relayed through the LGN in the thalamus and then through an area of the visual cortex conventionally called V1 (Bullier 2003). A study supports the view that all the streams that start from the retina pass through the LGN in parallel (Casagrande and Xu 2003). The three streams travelling from the retina are projected to the layers of V1 in a variable manner through at least, and possibly more than, four different channels, as a result of some function of the interposed LGN. As a consequence, various attributes of visual information such as shape, form, colour, and motion are neurologically segregated at V1.
Concerning the visual cortex V1, important findings have been made on neural activities related to the what and where pathways of the visual system. Namely, spatial visual information sent through the P stream is connected to the temporal pathway via V1, while the M stream, which carries neural information concerning the location or movement of visual objects, is connected to the parietal pathway via V1 (Casagrande and Xu 2003). The K stream is connected to a morphologically different area of V1 (Casagrande and Xu 2003). Accordingly, it has been suggested that cells in the K stream from the LGN onward carry scalar variable information such as brightness, colour contrast, and texture, in contrast to the geometrical variables carried in the other streams (Allman and Zucker 1990).
Summing up the discussions so far, it is highly probable that all the formal visual cues, together with temporal movement, are neurologically segregated until neural information reaches the primary visual cortex. Moreover, spatial and temporal perceptions of visual stimuli seem to be processed in an interrelated manner in the higher cognitive system from V1 onward.
The information processed at V1 is ramified and fed forward through nearby areas and ultimately passed to additional cortical areas such as V2, V3, and so on. In the feedforward connections between V1 and other cortical areas, the information is processed in either vertical or horizontal ways (Bullier 2003, Tucker and Fitzpatrick 2003). Regardless of the specifications of the functions of each cortical area in terms of the visual system, it must be noted that connections are observed not only between adjacent areas but also among long-distance, intrahemispheric areas (Bullier 2003). Moreover, apart from feedforward connections, feedback connections are found especially between intrahemispheric cortical areas, and these operate in a different way from the feedforward processes. Hence, it must be said that the ways the cortical areas are connected when feeding information vary largely from area to area. These facts exemplify the complexity of neural activities in the human visual system.
As the specifications of the functions of the cortical areas seem to have already been discussed extensively in the field, no rehearsal of them is needed here. Indeed, the specific functions of the respective cerebral areas seem to be so complex that a clear-cut segregation of them according to visual elements would be misleading (Zeki 1993). However, some cerebral regions deserve mention in terms of the top-down process of visual cognition. One of the most significant topics related to high-level visual perception is visual imagery (Bertolo 2005, Cattaneo et al. 2009, Klein 2000, Knauff et al. 2000). The question concerns whether an imagining task of visual mental imagery that is not caused by actual retinal input of visual sensory stimuli activates the same area of the primary visual cortex as that which is activated by the perception of actual visual sensory stimuli. Indeed, experiments suggest that having visual imagery is possible even for those who are congenitally blind (Bertolo 2005). That is to say, imagining mental visual imagery requires neither the perception of actual visual stimuli nor prior visual perceptual experience, and the neural activities of cortical areas alone can create visual imagery in one's mind's eye (Bertolo 2005). Notably, experiments conducted on sighted and congenitally blind participants show that the latter group seems to have spatial and metric visual knowledge of two-dimensional matrices, and to some extent of three-dimensional matrices as well (Cornoldi et al. 1991).
Although some studies say that the task of imagining visual imagery does not necessitate the activation of the early visual cortical area V1 (Bertolo 2005), others claim that V1 seems to be activated during visual imagery tasks (Cattaneo et al. 2009). Thus, it must be said that the involvement of the early visual cortex in visual imagery tasks is still unclear. Nonetheless, it seems at least certain that associative areas of the cerebral cortex are involved in visual imagery tasks. Specifically, it is said that the parieto-occipital and temporo-occipital associative areas are activated in visual imagery tasks (Bertolo 2005). In other words, it seems that neural activities during visual imagery tasks are tied to the spatiotemporal functional areas of the cerebral cortex.
Notably, it is claimed that the neural processes of visual imagery overlap with those of visual short-term memory (STM), and that this is indeed in disagreement with previous psychological standpoints towards the two phenomena (Cattaneo et al. 2009). This neuroscientific finding suggests that visual imagery is neurologically connected to visual STM. Moreover, as visual STM is subserved by visuospatial working memory (VSWM) (Cattaneo et al. 2009), which contains spatiotemporal working memory of visual cues (Logie 1995), visual STM seems to be consistent with visual imagery tasks in regard to the spatiotemporality of the neural information to be processed in associative areas. Drawing from the above argument, it can be said that neural activities in high-level perceptual processes that involve mental processes indicate the important deployment of the visual association areas, which are tied to visual STM.
To sum up, it was shown that the formal elements of visual arts introduced in the last chapter (see 1.1.1) seem to be neurobiologically segregated at earlier stages of visual cognition. Moreover, chromatic perception too can be said to be processed at earlier stages of the visual cognitive system in a way that somehow accords with the formal description of hues often employed in formal analysis. Notably, spatial and chromatic information seems to be integrated at later stages of visual cognition. Moreover, spatial and temporal visual information seem to be interrelated at later stages of visual cognition. This indicates the spatiotemporality of visual information in the bottom-up perception of the visual world. Considering the fact that blue/yellow chromatic information passes through the same neural stream as spatial information does, the blue/yellow chromatic dimension can be said to be perceived more spatially than the red/green chromatic dimension is, as studies in fact suggest (Kaplan 2003). On the other hand, the key to understanding the influence of top-down perception seems to reside in the associative areas of the visual cortex. The similarity of neural processes in visual imagery tasks and visual STM seems to be suggestive of the influence of memory on visual perception. Taking into account the seemingly intrinsic spatial as well as metric visual knowledge of congenitally blind individuals, especially of two-dimensional rather than three-dimensional visual images, it can be said that somewhat hard-wired visual knowledge might be in effect in the top-down perception of the visual cognitive system.
By comparing the neurological characteristics of human cognition in auditory and visual perception from both bottom-up and top-down directions, hints for achieving a cognitive visualisation of musical sounds will be explored in the following. For this purpose, as an example of actual cases of cognitive integration of different senses including vision and audition, the literature on synaesthetic perception will be introduced first. In doing so, the following will claim that there are two important cognitive events to be considered for the sake of the objective of the thesis.
(1) Synaesthetic Perceptions
When one experiences an integrated unitary perception that consists of sensations from different sensory modalities, the experience is often called multimodal perception. Multimodal perceptions can occur between auditory and visual perception as well, usually together with other sensory modalities, and the issue has lately been attracting more and more attention in the field of cognitive science along with the development of neuroscience. One of the related topics that concern multimodal perceptions is intermodal perception, an example of which is usually called synaesthesia. Synaesthesia can be described as a neurological phenomenon that occurs when one perceives a sensation from one sensory modality that is caused by sensory stimuli of a different sensory modality (Campen 2008). For instance, when one perceives scent by looking at an object, the person can be said to be having a synaesthetic experience. The number of people who are said to have synaesthesia varies among studies (Day 2007).
Synaesthesia has long been the subject not only of scientific but also of artistic inquiry, especially among visual artists and musicians. The reason lies in the fact that synaesthetic experiences could allow some people to experience a cognitive intersection of vision and hearing. Notably, synaesthesia is an involuntary, consistent, and mostly genetic phenomenon (Cytowic and Eagleman 2009). That is to say, synaesthesia occurs to some people in an automatic manner without their intention, and the characteristics of the perceptually associated senses, such as certain sounds and colours, are consistent for them at any given time. Moreover, synaesthesia tends to run in families. Among the many types of synaesthesia (Campen 2008), there can be found types that involve the auditory and visual sensory modalities (Day 2007). There have in fact been reported cases of synaesthesia in which auditory perception is induced by visual sensory stimuli. People with this type of synaesthesia report the perception of sound while looking at visual images.
Although synaesthesia has often been treated as an illusory phenomenon allegedly recounted by artists and the like to attract the public's attention, or by people under the influence of certain psychological or medical conditions, it is now strongly believed to be a neurologically veritable phenomenon that happens to certain people whose cognitive systems show otherwise normal neurobiological functions. That is to say, studies of the neural activities of synaesthetes' brains during synaesthetic experiences suggest that people who claim to have perceptual experiences of hearing visual images actually perceive sound from images as if the auditory stimuli were present in reality, without any neurobiological eccentricities compared to people without synaesthesia (Campen 2008, Cytowic and Eagleman 2009). In this sense, it could be said that synaesthetic perceptions involve bottom-up perceptual processing of sensory stimuli to some extent, in that synaesthetic experiences are always caused by the presence of actual sensory stimuli, however different the sensory modality of the stimuli is from that of the perceived sensations. Synaesthesia thus reveals a significant hint for approaching the current topic of the thesis. In fact, it is one of the topics often mentioned in the literature on the visualisation of sound (Vergo 2010).
There are multiple theories as to why synaesthesia happens to some people and not to others (Campen 2008, Hubbard et al. 2011, Kolb et al. 2003, Santrock 2007, Sultzbaugh 2009). Regardless of their different theoretical approaches to the cause of the phenomenon, it is often said that synaesthesia seems to result from an exaggeration of normal brain functions, and therefore can potentially happen to everyone. Linguistic metaphors across different senses, such as “loud shirts” or “smooth sounds,” are often claimed as examples of perceptual phenomena that indicate the possible occurrence of synaesthetic perceptions in non-synaesthetes (Day 1996). In this line, following the claims of the alleged existence of synaesthetic perceptions in normal human perception, the contents of synaesthetic experiences deserve further exploration.
Nonetheless, although the types of synaesthesia can be generalised into a limited number (Day 2007), the contents of synaesthetic experiences vary among people with the same type of synaesthesia (Campen 2008, Cytowic and Eagleman 2009). For instance, it has been shown that two kin synaesthetes who both possess grapheme-colour synaesthesia associate the same letter with different colours (Campen 2008). In other words, irrespective of the causal function of synaesthetic experiences, their contents can be said to be largely empirical, and therefore seem to be under the effect of the conceptual knowledge of individuals. Moreover, the contents of synaesthetic experiences can change depending on the context as well, meaning that the content varies according to whether one focuses on global or local elements (Hubbard et al. 2011). In this sense, it can be said that synaesthetic perceptions are highly probably affected by top-down processing of sensory stimuli as well. Nonetheless, considering that two synaesthetes who have had a similar empirical background in the same family can have different contents of synaesthetic perception for the same stimuli, generalising the kinds of conceptual knowledge that characterise the contents of synaesthetic perception seems futile.
However, a relatively old perceptual experiment concerning the psychological association of sounds with visual images seems to provide a hint for finding a somewhat universal cognitive association of visual images and auditory sounds. In an experiment conducted by the German psychologist Wolfgang Köhler in 1929, participants were asked to associate the sounds of two arbitrarily chosen non-semantic words, “takete” and “mauma”, with two different visual shapes, examples of which are shown below (Figure 10) (Nielsen and Rendall 2011). The result showed a conspicuous coincidence, in that the majority of participants associated “takete” with the edgy shape and “mauma” with the round shape. In this sense, although it might be limited to certain cases, some previous psychological studies of human cognition can be said to indicate a somewhat universal or intrinsic mental condition that affects the perception of visual images in terms of sounds.
Figure 10. Takete and Mauma. Replicated by Matsumoto, D. (2014).
Given this, it can be suggested that the perception of auditory sensations from visual sensory stimuli, even if not as conspicuous as what synaesthetes actually experience, can be expected in normal people as well. Drawing from the information provided above, it must be noted that synaesthetic perceptions seem to involve both bottom-up and top-down processing of sensory stimuli in a somewhat integrated manner.
(2) Cognitive Correspondences
As explained earlier, sensations, or more specifically the earliest stages of bottom-up perception, are basically modality-specific for each sensory input. Therefore, it seems implausible to assume the possibility of visual stimuli causing auditory bottom-up perception. Nevertheless, as cases of synaesthetic experience show, there seem to be some cognitive connections between the two sensory domains. Indeed, considering the neurobiological processes of both auditory and visual bottom-up perception, there seems to be a coincidental event to be considered. That is to say, for both auditory and visual perception in the bottom-up direction, the interrelation of spatial and temporal neural information is necessarily involved, especially at earlier stages of perception in the cortical areas.
Reviewing the neurobiological cognitive processes, the basic elements of both music and visual arts seem to be to some extent neurobiologically segregated at early stages of bottom-up perception. These elements include the spatial and temporal elements that were respectively mentioned in the last chapter. However, as suggested earlier, at later stages of auditory cognition, especially from the brainstem onward, auditory neural information of both spatial and temporal domains seems to be integrated to cause spatiotemporal perceptions of auditory stimuli. In the same line, visual cognition also involves the interrelation of spatial and temporal neural visual information at later stages of visual perception. Indeed, apart from the stark physiological difference of sensory modalities between hearing and vision, the neural system of segregating sensory stimuli into perceived separate elements seems to be relatively similar for auditory and visual bottom-up perception (Visscher et al. 2007). Hence, it can be said that spatiotemporal processing of sensory input is analogously required at later stages of neural processing for both auditory and visual cognition. In this sense, the indispensability of spatiotemporal processing of sensory input seems to be coincidental for both music and visual arts cognition.
On the other hand, in the top-down direction of neurobiological cognitive processes, both auditory and visual perception seem to be affected by the neural activities of the respective associative areas. For both auditory and visual perception, several studies have suggested that neural activities of associative areas activate cortical areas that normally correspond to actual sensory stimuli. In this sense, short-term memory can be said to play the important role of a somewhat conceptual knowledge that cognitively penetrates the perception of actual sensory stimuli in auditory as well as visual cognition.
In fact, the previous discussions on music and visual cognition suggest that visual and auditory short-term memory work in a similar way in terms of their neurological mechanisms. Notably, experiments that used substantially different auditory and visual stimuli showed that although the initial processing of the two kinds of stimuli was tremendously different, later processes showed some shared semantic processes that were possibly caused by the effect of short-term memory (Visscher et al. 2007). This fact corroborates the claim that human memory has a fundamental influence on the auditory and visual perception of sensory inputs, and possibly on their interactions (Wheeler et al. 2000). Therefore, taking into account the case of visual imagery tasks for congenitally blind people, which suggests some intrinsic visual knowledge, and also the possible cognitive penetration of conceptual knowledge accumulated as a result of the evolution of human beings as a species (see 2.1), the influence of stored mental concepts on both auditory and visual perception from the top-down direction can be said to provide a way of approaching the intersection of music and visual arts cognition.
Here, several studies provide conspicuous clues for furthering the argument. For the bottom-up direction, the perceptual processing of spatiotemporal information from sensory stimuli, or, more specifically, the proprioceptive perception of movements, seems to play an important role in connecting auditory and visual cognition. The reason lies in the claim that motion is perceived in much the same way by the majority of people (Moody et al. 2006). Several perceptual experiments, conducted to see how visual stimuli affect auditory perception and vice versa, used kinaesthetic sensory cues to probe a perceptual connection of vision and hearing, and showed some mutual influences between visual or auditory stimuli and the perception of the other modality (Alais and Burr 2004, Rosenthal and Shimojo 2009, Takeshima and Gyoba 2013, Thoret et al. 2014). Notably, a neurological study reveals that auditory movement perception activates an area of the visual cortex that corresponds to the perception of visual motion (Poirier et al. 2005). Moreover, a new type of synaesthesia that allows one to hear sounds by looking at moving images has recently been discovered, suggesting a possible link between the visual and auditory cognitive domains through motion perception (Saenz and Koch 2008). In this sense, the thesis suggests that motion can possibly provide a way of inducing auditory perception, or in a more obscure sense, auditory sensations, in visual perception.
On the other hand, in terms of the conceptual knowledge that affects the content of perceptual experiences, emotion seems to deserve attention here. A study on children's emotional perception of both facial and vocal expressions of familiar and unfamiliar adults found that emotional expressions tend to be perceived in a multimodal manner, with variable preference for one or the other modality (Shackman and Pollak 2005). Notably, it has been suggested from an evolutionary perspective that, irrespective of empirical knowledge, certain emotional perceptions are possibly universally consistent (Juslin and Laukka 2003). Moreover, in terms of the emotional affectivity of music, it is said that vocal expression and musical expression of emotions share perceptual similarities (Juslin and Laukka 2003, Juslin and Vastfjall 2008). Finally, a recent empirical study shows some consistent emotional connections between the perception of music and colours (University of California - Berkeley 2013). In this line, the thesis posits that emotion can be approached as a meeting point of the contents of the cognition of visual arts and music from the top-down direction. Considering that motion can be postulated as the other key factor relating vision and hearing, but from the bottom-up direction, the following chapter will therefore focus on these two perceptual events, namely motion and emotion, in approaching the visualisation of musical sounds in graphic images.
Finally, this chapter will discuss a possible way of cognitively visualising musical sounds. Taking into account the arguments given so far, the aim of this chapter is to figure out how to create a visual configuration that might possibly induce auditory cognitive experiences in observers. Therefore, it will not aim to see how to conceptually compose specific musical works in visual images, nor to scientifically translate auditory stimuli into visual stimuli. For this purpose, the thesis will consider the subject of visualisation as musical “sounds” in a plural sense, regarding the auditory stimuli of music as consisting of neurobiologically perceivable musical elements, rather than focusing on the material characteristics of sound as a singular sound wave. By considering both spatiotemporal bottom-up processing of sensory stimuli and the top-down influence of conceptual knowledge, the chapter intends to find some connections between the cognitive elements of visual arts and music.
In order to achieve this goal, specifically, kinetic and emotional visual cues will be explored in relation to their musical counterparts. For the part of the kinetic cues of visual arts, there is in fact an abundant literature on the incorporation of movement in static graphic images. That is to say, since the word “movement” became somewhat of a maxim in the art world of the modernist era of the early 20th century, as exemplified by Futurism or Dadaism, many artists and authors have indeed already explored ways of incorporating kinetic cues in static visual mediums. Nevertheless, although it is still highly useful in examining the current topic, this information was usually based on psychological studies of visual perception, and is therefore worth a new perspective that refers to the neurophysiological knowledge that has become available relatively recently.
On the other hand, the examination of emotional concepts, this time in regard to visual arts and music, seems necessarily to involve an empirical perspective, in that emotion is largely a subjective experience that depends on the physical as well as mental conditions in which one is subjectively situated. Nevertheless, studies on the emotional responses of children suggest that some emotional concepts can be said to be most probably genetically universal to most human observers (Juslin and Laukka 2008). Moreover, emotion seems to effectively influence multimodal perception in quotidian events that involve visual as well as auditory stimuli (Shackman and Pollak 2005). Indeed, the underlying mechanism of emotional perception has recently begun to be unravelled from a neurological perspective as well (Juslin and Laukka 2008). In this regard, not to mention the emotional affectivity that has characterised music since antiquity, the emotional cues of visual arts will be considered as a cognitive element that evokes a cognitively musical experience in the observer.
Consequently, this chapter will enumerate several points to be considered in attempting to visualise musical sounds in static graphic images. For this purpose, some graphic examples will be introduced.
As mentioned earlier, the following parts will focus on motion and emotion as conspicuous cognitive events to be considered in a cognitive visualisation of musical sounds. Specifically, motion will be explored in terms of kinetic cues that one can use in static graphic images in reference to those of music, whereas emotion will be addressed in regard to emotional visual cues that will connect the contents of the perceptual experience of visual arts and music. Therefore, for the former, it is required to see how the temporal domain can be integrated into the spatial domain of static visual arts. On the other hand, emotional cues must be approached by considering the conceptual, or in a sense empirical, knowledge that observers may possess. In this regard, the literature not only on neurophysiological studies of the human cognition of movement and emotion in visual arts and music but also on previous phenomenological studies of them will be utilised to some extent. However, it must be noted that the epistemological standpoint of the following argument will be consistent throughout this thesis. In other words, the thesis approaches the topic of this section by focusing on how motion and emotion can affect perceptual experiences of static graphic images and on what the contents of those experiences will be.
Notably, as the thesis aims to show a way of visualising musical sounds in static graphic images, kinetic visual cues must not involve actual kinetic movements. For this reason, the spatial cues of visual arts bear the utmost importance in examining the kinetic cues of static visual images in relation to those of music. That is to say, in order to cognitively relate sensory stimuli of the spatial domain to those of the temporal domain, the spatial information that is materialistically presented as a visual image to an observer must be expected to entail perceptually temporal cues as a result of the spatiotemporal cognitive processing of visual sensory stimuli. To do so, the following will attend to the spatiotemporal processing of sensory stimuli in their bottom-up perception.
On the other hand, emotional cues, or in other words, elements that are tied to emotional conceptual attachments, must be examined together with the configurational elements of both visual arts and music. That is to say, since emotional cues are posited here as the influential perceptual event that affects the “contents” of intermodal perceptual experiences of visual images and music, they must be dealt with in parallel with the qualitative as well as quantitative elements of visual arts that will make up the visual configuration, or the content, of an actual visual artwork in relation to music. Hence, the emotional cues of visual arts will be examined by comparing the elements of visual arts with those of music in terms of their emotional connections. For this reason, top-down perception of the basic elements of visual arts and music is to be considered, albeit from a different perspective than that which is usually taken in a formal analysis of them.
In a static medium of visual arts, kinetic cues must be regarded as perceptual elements that give rise to the perceptual experience of kinetic relations in static graphic images. Therefore, the characteristics of kinetic cues in visual arts must be examined here as a cognitive device that may induce musical cognitive experience in the bottom-up perceptual process. The most daunting problem here is the fact that visual arts of a static medium and music reside in materially antagonistic spatiotemporal domains. Indeed, a study of the intermodal perception of vision and hearing often necessitates a discussion of the spatiotemporal processing of sensory stimuli, in that the sensory stimuli of the two sense modalities belong to different material domains (Lewkowicz 1999). In the same vein, the following section will explore how the spatiotemporal bottom-up perception of visual stimuli can be characteristically compared to the temporal domain of music. In doing so, kinetic visual cues that will be useful in a cognitive visualisation of musical sounds are to be postulated.
According to a developmental study of the intermodal perception of vision and audition, the perceptual integration of visual and auditory sensory information in the spatiotemporal perception of motion shows that there are at least four temporal cues that an observer can match between separate visual and auditory sensory stimuli (Lewkowicz 1999). Irrespective of the developmental order in which these perceptual abilities are acquired, they are synchrony, duration, rate, and rhythm. Although the exact cognitive mechanisms behind the intermodal perception of the four temporal cues do not yet seem to be clarified, psychological studies of visual perception seem to empirically corroborate the validity of these perceptual phenomena, which coincide with the temporal domain of music (Arnheim 1974). Notably, one study suggests that temporal sensory information that is visually presented is indeed automatically registered and remembered by an observer using auditory codes (Guttman et al. 2005).
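As an illustration of how the four temporal cues could in principle be compared across modalities, the following sketch checks two event-onset sequences, one visual and one auditory, for matches in synchrony, duration, rate, and rhythm. It is entirely hypothetical and not drawn from the cited study; the onset times are in seconds and the tolerance value is an arbitrary assumption.

```python
def matches(visual_onsets, auditory_onsets, tol=0.1):
    """Return which of the four temporal cues the two event streams share."""
    cues = {}
    # Synchrony: corresponding events begin at (nearly) the same time.
    cues["synchrony"] = all(abs(v - a) <= tol
                            for v, a in zip(visual_onsets, auditory_onsets))
    # Duration: the two streams span (nearly) the same total time.
    dur_v = visual_onsets[-1] - visual_onsets[0]
    dur_a = auditory_onsets[-1] - auditory_onsets[0]
    cues["duration"] = abs(dur_v - dur_a) <= tol
    # Rate: (nearly) equal mean inter-onset intervals.
    iois_v = [b - a for a, b in zip(visual_onsets, visual_onsets[1:])]
    iois_a = [b - a for a, b in zip(auditory_onsets, auditory_onsets[1:])]
    cues["rate"] = abs(sum(iois_v) / len(iois_v)
                       - sum(iois_a) / len(iois_a)) <= tol
    # Rhythm: the same *pattern* of intervals, regardless of absolute rate.
    norm = lambda iois: [i / iois[0] for i in iois]
    cues["rhythm"] = all(abs(x - y) <= tol
                         for x, y in zip(norm(iois_v), norm(iois_a)))
    return cues
```

For instance, a visual stream with onsets [0.0, 0.5, 1.0, 2.0] matches an auditory stream at twice the tempo, [0.0, 1.0, 2.0, 4.0], on rhythm only, since the interval pattern is preserved while synchrony, duration, and rate differ.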
Nevertheless, studies on the intermodal perception of vision and audition, such as those referred to above, usually use both visual and auditory sensory stimuli in kinetic situations (Brooks et al. 2006, Rosenthal et al. 2009), and hence cannot be said to be necessarily useful in a situational context in which only visual stimuli are present while kinetic perception of them is expected. For this reason, temporal visual cues must be examined through information on the neurobiological mechanisms of the kinetic perception of visual stimuli.
It is said that there are three separate systems in human visual motion perception, named the first-order, second-order, and third-order systems respectively (Lu and Sperling 2001). Notably, each system perceives visual motion by detecting different characteristics of visual stimuli: the first-order system responds to changes of luminance patterns; the second-order system detects motion from moving stimuli whose luminance level is constant everywhere but whose other features, such as contrast levels or directions, are not; and the third-order system responds to changes of the “salience map,” or in other words, changes of the areas that are perceptually attended the most (Dumoulin et al. 2001, Lu and Sperling 2001). Therefore it can be said, for one thing, that the luminance levels of visual elements and their configurational features in a moving visual image are computed separately in the perceptual processing of kinetic cues. Moreover, and secondly, attention can strongly and independently influence the perception of kinetic cues in a moving visual image.
Characteristic differences between the three systems can be found in neuroscientific studies of visual motion perception. It is shown that whereas the first-order system can detect motion at multiple locations in the field of vision, the second-order system is not suited to detecting motion at more than two locations without failing to detect motion at unattended locations (Lu et al. 2000). That is to say, especially when the level of external noise is high, the second-order system tends to fail to respond to motion at locations that are not attended by the observer. Moreover, motion detection by the second-order system increases in sensitivity at the attended location (Lu et al. 2000).
Importantly, in terms of the spatiotemporal perception of kinetic cues, studies suggest that the second-order system is biased toward the detection of centrifugal motion (Dumoulin et al. 2001). That is to say, in detecting motion through changes of the configurational features of visual elements, an observer can more easily detect optical motion that moves from the centre to the edges of the field of vision than motion that flows in centripetal directions. This may in fact be empirically explained by the fact that the ecological situations humans face in daily life require them to be accustomed to forward movement, and hence to the centrifugal flow of visual information. In sum, whereas changes of luminance level do not seem to show any spatiotemporal bias, the configuration of moving visual elements does seem to suggest important spatiotemporal characteristics in terms of cognitive kinetic cues.
It must be noted again that attention seems to affect the perception of kinetic cues to a large extent in the second-order system. However, it is not only in the second-order system but also in the third-order system that attention plays an important cognitive role. That is to say, since the third-order system is said to detect motion by responding to changes of the area most attended by the observer, attention is indeed the primary origin of the third-order motion detection system. Notably, as a result of motion detection based on the “salience map,” the location that is marked as salient gives rise to the perception of “figure” in opposition to “ground” in a kinetic visual image (Lu and Sperling 1995 cited in Lu and Sperling 2001). For this reason, the third-order system can be described as a binocular mechanism that requires the perception of depth using binocular vision, in contrast to the first- and second-order systems, which are largely monocular (Lu and Sperling 2001). It can therefore be said that the third-order system is strongly influenced by the attention of the observer, and also, inevitably, by physiological movements of the eyes or the head. Indeed, a developmental study of the kinetic perception of children seems to support this claim (Johnson and Mason 2002).
The neural mechanisms of cortical areas in the spatiotemporal perception of visual motion also offer some significant hints for tackling the current topic. It is said that visual motion processing is primarily conducted in visual cortical areas V3 and V5 (Lewis et al. 2000, Zeki and Lamb 1994). The functional difference between V3 and V5 in visual motion perception is that while V5 is mainly activated to detect the directional movement of visual stimuli, V3 is more attuned to the configurational patterns, or in other words the form, of moving visual stimuli (Zeki and Lamb 1994). For this reason, V5 is by and large specialised for the perception of visual motion, and responds only very weakly to static visual stimuli (Zeki and Lamb 1994). Hence it can be surmised that it is because of this temporality-directed feature of V5 that it can be activated by auditory motion processing as well (Poirier et al. 2005). On the other hand, the form-oriented feature of V3 in visual motion perception suggests the strong perceptual effect that oriented lines and edges have on the activation of V3 in visual motion processing (Zeki and Lamb 1994).
Moreover, the involvement of a form-discrimination task in visual motion processing coincides with the claim that in the neural mechanisms of not only visual but also auditory motion perception, a rather generalised neural system is utilised (Lewis et al. 2000). Indeed, it was already explained earlier that neural visual information on luminance levels and spatial features travels separately through different pathways in general visual perception (see 2.1.2). In the same vein, in auditory motion perception, it is shown that the neural pathways activated by an auditory motion task and a pitch-discrimination task closely resemble each other (Lewis et al. 2000). In spite of the lack of knowledge of the exact underlying mechanisms, and of the possible topological overlaps of cortical areas in visual and auditory motion perception, it can thus be said that motion perception, both visual and auditory, is not a unique neurological phenomenon separated from more general, modality-specific perception.
The most conspicuous neurobiological fact in visual motion perception is that changes of colour do not affect it (Zeki and Lamb 1994). That is to say, in the visual motion perceptual mechanism of the cortical areas, information on colours is processed primarily in cortical area V4, in contrast to other visual information such as luminance levels or configurational features. Considering the fact that colours, or hues, are closely tied to luminance levels in their physical nature, this may suggest an important point. Indeed, V5 is neurobiologically connected to the M stream, which is responsible for sending visual information concerning luminance levels but not colours (Zeki and Lamb 1994). In this sense, it seems that visual motion and colours can be considered separately in visual kinetic cues.
Overall, several important points can be made in enumerating cognitively valid kinetic visual cues. First, the perception of the directional features of visual elements shows a centrifugal spatiotemporal bias whose sensitivity is increased by the attention of the observer. Second, the perception of shapes in motion is not affected by any unique spatial bias. Third, the perception of motion is influenced by changes of luminance levels, but not by the hues of visual elements. Considering these neural characteristics of kinetic perception, it can be suggested that (1) a single centrifugal directional configuration of visual elements at the macroscopic level will create a fundamental temporal direction, (2) visual shapes with various contour lines and edge orientations can create various spatiotemporal cues at the microscopic level, (3) the use of different luminance levels will increase the temporality of an attended directional configuration, and (4) figure-ground effects in an attended centrifugal direction will increase the kinetic perception of the attended area (Figure 11).
illustration not visible in this excerpt
Figure 11. Kinetic visual cues based on the four characteristics of neural activities in motion perception. Drawn by Matsumoto, D (2014).
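To make the interplay of a single attended centre, luminance variation, and figure-ground contrast more concrete, the following is a minimal computational sketch, an assumption made for illustration rather than a prescribed method, which generates a grayscale grid whose luminance falls off centrifugally from one attended centre, so that lighter values read as "figure" against a darker "ground". The grid size, centre position, and linear fall-off are arbitrary choices.

```python
import math

def centrifugal_field(size=9, cx=4, cy=4):
    """Return a size x size grid of luminance values in [0, 255],
    brightest at the attended centre (cx, cy) and darkening outward."""
    max_d = math.hypot(size - 1, size - 1)  # largest possible distance
    grid = []
    for y in range(size):
        row = []
        for x in range(size):
            d = math.hypot(x - cx, y - cy)            # distance from centre
            row.append(round(255 * (1 - d / max_d)))  # centrifugal fall-off
        grid.append(row)
    return grid

field = centrifugal_field()  # the centre cell holds the brightest value
```

Rendered as tones, such a field places the brightest, most attended location at the centre, with luminance decreasing in every centrifugal direction, combining a global directional cue with luminance-based figure-ground contrast.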
Nevertheless, the cognitive characteristics of kinetic visual cues mentioned so far are basically relevant to mobile visual stimuli, in that they are generally gained from studies of the visual motion perception of actually kinetic visual stimuli. To solve this dilemma, the important role that the observer's attention plays in visual motion perception must be remembered. As the third-order system of visual motion perception is an attention-based system that yields a figure-ground perceptual effect through binocular vision, the thesis claims that the movement of the eyes can create kinetic perceptual experience in the perception of static visual images. That is to say, as in realistic situations the human eyes are constantly moving, the changing attention of the observer can give rise to a temporal, and thus kinetic, perception of static visual images. Notably, this is tantamount to saying that the attention of the observer can create depth perception, and therefore a three-dimensional perceptual effect, in static two-dimensional visual images. Indeed, some recent studies of human perception seem to corroborate this idea (Egoyan 2011), and psychological studies of visual perception more or less seem to support this claim as well (Arnheim 1974). Moreover, it is reported that the neural processing of perceptually implied dynamic, kinetic information from static images activates cerebral regions that are usually involved in motion perception (Kourtzi and Kanwisher 2000). In this sense, it can be supposed that inducing kinetic perception in an observer by using static visual stimuli is possible.
This being said, it is stipulated here that the above-mentioned temporal cues can legitimately, though probably less effectively, be used in static graphic images. Summing up the characteristics of kinetic visual cues pointed out so far: considering that the observer's attention is limited to no more than two locations, beyond which perceptual loss occurs in the perception of kinetic cues created by visual configurational elements, the use of noise in a graphic image can enhance the perception of attended visual elements set in a centrifugal direction. Also, figure-ground perceptual effects created by the various luminance levels of shapes can help to emphasise kinetic cues in the attended area of a static graphic image. As for temporal visual configurational elements, the four temporal rhythmic cues mentioned earlier in this section can be said to be useful. The use of colours bears no relevant importance for kinetic cues here.
Empirical knowledge seems to show that emotion is a largely subjective, context-dependent experience. Therefore, understanding emotional cues in visual arts may necessitate a phenomenological perspective to classify the contents of emotional concepts that are based on individual subjective experiences. On the other hand, it is only relatively recently that the underlying mechanisms of emotional experience or perception have begun to be explained through neurobiological findings. Accordingly, this section will first introduce general knowledge on emotion perception available from studies of the neurobiological mechanisms by which emotions are processed in human brains. Next, in order eventually to ascertain visual emotional cues that conceptually coincide with musical emotional cues, the emotional cues characteristically observed in music will mainly be dealt with in what follows; for this purpose, previous phenomenological research on emotional cues in music will be utilised. Moreover, an implication of how to translate auditory emotional cues into those of graphic images will be provided, by considering the emotional contents of the perception of them.
The number of studies of the neural mechanisms behind emotion perception seems to have increased in the past decades. One common scientific view of the neurobiological characteristics of emotion perception is that the processing of emotions involves the functions of some fundamental cerebral structures that overlap with the general cognitive system, namely the limbic and paralimbic areas (Davidson 2000, Davidson and Sutton 1995, Koelsch 2005, Kreutz and Lotze 2007). Notably, studies on synaesthesia often point out the involvement of the limbic and paralimbic areas in synaesthetic perceptions as well (Campen 2008). These cerebral structures are regarded as essential regions of the human brain that underwent conspicuous enlargement in the evolutionary process of mammalian species, and they therefore show an ecological importance for survival in various physical as well as mental situations. Irrespective of the exact neural system of emotion perception, which is indeed said to be separable into multiple mechanisms (Juslin and Vastfjall 2007), it is said that emotions arise as a result of the activation of the responsible cerebral areas, which then affect one's physiological states such as the rate of heartbeat, skin conductance response (sweat), or eye blinks (Kreutz and Lotze 2007). Nevertheless, the interaction between emotion and physiological response is bi-directional, as emotions can also result from hard-wired cognitive interpretations of one's physiological states (Kreutz and Lotze 2007).
Notably, it is often claimed that the emotions induced by music, or in other words musical emotions, are basically the same types of emotions that humans perceive in their daily lives from various sensory stimuli, including auditory stimuli such as the vocal expressions of people (Juslin and Vastfjall 2008, Koelsch 2005, Kreutz and Lotze 2007, Panksepp and Bernatzky 2002). Therefore, neurobiological studies of emotion in general can be inferred to be applicable to the emotional cues of music. In terms of emotion in general, the thesis supports the claim of “basic emotions,” which are said to be innate and universal emotions from which all other emotions are derived (Juslin and Laukka 2003: 771). The basic emotions consist of happiness, anger, sadness, fear, and love. Studies show that each of these kinds of emotions, or in the more general categorisation of positive (happiness, love) and negative (anger, fear, sadness) emotions, seems to be processed by the brain in a more or less discrete manner (Juslin and Laukka 2003, Kreutz and Lotze 2007). It is therefore necessary to examine the emotional cues that induce these basic emotions in the listener.
However, it is said that the perception of emotion caused by the same sensory stimuli can inevitably be affected by one's empirical knowledge as well, such as one's familiarity with the stimuli, cultural codes, and the like (Juslin and Vastfjall 2007, Panksepp and Bernatzky 2002). In this sense, and especially in consideration of the paucity of exact knowledge of the neural mechanisms behind emotion perception in specific situational contexts, phenomenological studies of emotions and music must be utilised. By reviewing a phenomenological study that gleaned information on emotional cues and formalist musical elements from a large number of miscellaneous sources (Juslin and Laukka 2003), the following table (Table 2) shows some important correspondences between some musical elements and the five basic emotions mentioned above. Note that only basic, important elements of music are chosen for the table. Since the above-mentioned study lists the emotional cues of both vocal and musical expressions, the thesis excluded some elements in order to focus only on musically relevant elements, in consideration of formal analyses of them (Justus and Bharucha 2002, Kerman 1996, Levitin 2007, Martineau 2008, Nudds 2007, Schmidt-Jones 2008).
illustration not visible in this excerpt
Table 2. Emotional cues of music according to formal musical elements and five basic emotions, taken from a study (Juslin and Laukka 2003) and revised by Matsumoto, D (2014). The cues are separated here into (1) frequency organisation in time and (2) time organisation.
According to the table, one is more or less able to identify the characteristics of the musical elements usually used in the types of music that evoke the five basic emotions and, by inference, other derivative emotions. Nonetheless, it must be borne in mind that all the characteristics listed above were quantitatively obtained and are broadly categorised, and therefore may not apply in strict or particular instances. The necessity of further research on the emotional cues of music in neurobiological studies must therefore be suggested here.
Next, emotional cues that can be used in visual arts, or specifically in graphic images, must be explored in relation to those of music. It is said that visual imagery can in fact be counted as one possible emotion-processing mechanism, in that the emotions a listener perceives from music can result from visual imagery conjured up in his or her mind in perceptual response to music listening (Juslin and Vastfjall 2008). Moreover, several studies verify the inherent connections between mental visual imagery and emotion perception (Holmes and Mathews 2005). In this sense, visual imagery, which was shown to be able to activate the visual cortical areas that respond to actual visual sensory stimuli (see 2.1.2.), seems to be of speculative importance in the cognitive translation of elements of music into those of visual arts via emotion perception.
One of the most familiar ways of studying the emotion perception of visual stimuli deals with the perception of emotion from human facial expressions (Johnson 2005, Shackman and Pollak 2005). In comparisons of emotion perception between facial and vocal expressions, it seems that one's sensitivity to either visual or auditory stimuli in emotion perception is affected by empirical knowledge as well as developmental state (Shackman and Pollak 2005). In fact, setting aside developmental influences, which bear no immediate importance for the discussion here, the effect that one's experience has on the emotion perception of music via visual imagery seems to be high (Juslin and Vastfjall 2008). On the other hand, it is reported that there are some recurrent themes in the visual imagery associated with music, such as nature scenes or unworldly experiences (Osborne, cited in Juslin and Vastfjall 2008). This suggests a somewhat innate tendency in the association of visual imagery and music. Nevertheless, in the absence of neuroscientific knowledge of the neural associative mechanisms between visual and auditory stimuli, phenomenological literature again seems to be the most relevant source to be utilised here.
Drawing from previous studies on the psychological perception of a broad range of visual images, there are various visual elements that are usually associated with certain emotional characteristics (Arnheim 1974, Choi et al. 2007, Itten 1970, Lu et al. 2012). In order to avoid a semantic perspective, and in consideration of the neurobiological adequacy of perceiving geometrical figures in the bottom-up perception of visual stimuli (see 2.1.2.), the thesis lists only a few basic characteristics of visual shapes, together with colours, in the following table. Specifically, taking into account the importance of directional and edge elements in motion perception, only some spatial configurations of visual images, such as contour configuration and surface configuration, in addition to aspects of light, are listed. Since the literature on colours in terms of emotion is significantly more extensive than that on visual shapes or forms as such, the emotional cues of colours can be regarded as more reliable. Notably, only four of the five basic emotions that are in accordance with the emotional cues of music are listed, owing to the limitations of the literature resources.
illustration not visible in this excerpt
Table 3. Emotional cues of visual arts according to basic visual cues and four basic emotions, taken from a study (Choi et al. 2007) and revised by Matsumoto, D (2014) in reference to other literature (Arnheim 1974, Itten 1970, Lu et al. 2012). The cues are separated here into (1) spatial configuration and (2) colour (light). “???” marks sections for which no available data was found.
As the topic has already been extensively discussed in preceding psychological studies, no detailed information on the psychological mechanisms behind them will be provided here. In terms of emotional cues with regard to spatial configuration, only those characteristics that concern general two-dimensional visual elements are provided. This is not to say that only abstract shapes are to be regarded as emotional visual cues, but rather to simplify the aspects of visual images that would innately matter for their emotional characteristics. Namely, considering the material characteristics of the visual elements of two-dimensional mediums (see 1.1.1.), the contour lines and edges of shapes are listed in the table. Nevertheless, as there is not yet enough evidence to corroborate the idea of a perceptual correspondence between musical elements and the spatial configurations of visual images, these emotional connections must still be regarded as speculation.
On the other hand, no specific hues and the like are suggested as emotional cues with regard to colours (light). Indeed, the attached concepts or emotional codes of hues can differ substantially among cultures as well as contexts (Itten 1970, Gage 1993). Therefore, the emotional cues of hues are listed here in a rather general manner. However, it may be suggested that by concentrating on specific socio-cultural situations, one can acquire more concrete and probably more effective emotional cues for visual arts.
Nevertheless, one important and seemingly highly common connection between some musical emotional cues and a property of colour (light) can be found. That is to say, there seems to be a highly probable perceptual matching between the frequency configuration of sound and the quantitative configuration of light. As the emotional categorisations in the tables above show, pitch height and melody line (pitch contour) seem to correspond with the value of light. Indeed, some psychological studies show that high pitches tend to perceptually correspond with bright light, and low pitches with dark light (Collier and Hubbard 2001, Cousineau et al. 2014). Moreover, it is said that pitch contour as well as intervals also affect the association of sounds with value: ascending pitches tend to be associated with brighter light and descending pitches with darker light, while small intervals are associated with brighter light and large intervals with darker light. Taking into account the correspondence between the perceptual tendency to associate the frequency configuration of sounds with the quantitative configuration of light and their emotional categories, value can be regarded as a useful tool for visualising musical sounds.
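The pitch-to-value correspondence described here can be sketched computationally. The following is an illustrative mapping, not taken from the cited studies: the frequency range (roughly that of a piano) and the logarithmic scaling, chosen to match the roughly logarithmic nature of pitch perception, are assumptions made for this example.

```python
import math

def pitch_to_value(freq_hz, lo=27.5, hi=4186.0):
    """Map a frequency in Hz to a lightness value in [0, 100]:
    low pitches yield dark values, high pitches bright values."""
    f = min(max(freq_hz, lo), hi)  # clamp to the assumed frequency range
    return 100 * math.log(f / lo) / math.log(hi / lo)

# An ascending C-E-G arpeggio maps to ascending (brighter) values.
melody = [261.6, 329.6, 392.0]
values = [pitch_to_value(f) for f in melody]
```

Under this mapping, an ascending melody line is rendered as a progression toward brighter values and a descending line toward darker ones, in keeping with the perceptual tendency reported above.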
In sum, although no specific visual emotional cues corresponding to discrete musical emotional cues were elucidated, the general types of emotional cues that can be used in graphic images with respect to music can be speculated upon by consulting the information provided in the tables above. Namely, the perceptual correspondence between the frequency configurations of sounds and the quantitative lighting configurations of visual images, or roughly speaking between pitch and value, seems to corroborate the emotional correspondences between them listed in the tables. No available evidence was found to back up the correspondences of other elements between music and visual arts. The elucidation of more detailed and neuroscientifically adequate emotional cues of visual arts corresponding to those of music must wait until further development of the study of emotion perception in visual arts and music from a neurological perspective brings about more consistent and detailed data.
Taking into consideration all the information given in the discussion so far, this section will note some recommendations regarding a cognitive visualisation of musical sounds in static graphic images, using some graphic examples. Since the focus is now on the process of visualising musical sounds in graphic images, rather than on the possibility of cognising musical sounds in them, the recommendations will be stated in the language of formal analysis of visual arts. In other words, as the visualisation of musical sounds concerns the act of creation rather than that of reception, how to visually “represent” coincidental auditory cues in a graphic image must be explored using a formalist, abstract language. However, the formal visual elements are treated in epistemic respect to the kinetic as well as emotional cues of visual arts and music mentioned in the preceding sections. Methodologically, kinetic cues and emotional cues are treated separately in the following.
(1) Kinetic Cues
The incorporation of kinetic cues into graphic images has been sought by a myriad of visual artists who have tried to imply motion in physically static graphic images. At the extreme end of this attempt are the visual images created to rather explicitly induce perceptually illusory kinetic effects in perceivers, often called Op Art or Optic Art. Nevertheless, works of Op Art traditionally draw on the psychological effects of various visual elements such as geometrical shapes or colour theory (Inglis 2012). Therefore, the approach usually used in Op Art does not seem applicable to finding exact kinetic cues that will readily be useful in inducing auditory experiences in observers of visual images through kinetic perceptual experiences. Nevertheless, a computational study of the neural activity of human brains in the perception of illusory visual images shows that the perception of works of Op Art gives rise to certain involuntary eye movements and can therefore activate motion-detecting regions of the early visual system (Zanker 2004). In this sense, it seems that the geometrical compositions of spatial elements used in works of Op Art can, to some extent, be utilised in seeking kinetic cues for static graphic images.
Nonetheless, the perceived kinetic movement of most works of Op Art does not readily accord with the findings of the preceding sections. Indeed, the perceived movement of visual elements in Op Art usually possesses no specific directional movement at the microscopic level (see Figure 12). Therefore, the kinetic cues used in works of Op Art are not clear enough to show accordance with the neural penchant of the human visual system for attentively perceiving centrifugal motion at no more than two locations. In truth, it is said that the geometrical compositions of visual elements do not necessarily overlap with the directional movement of the eyes, as the latter varies considerably at any given time (Zanker 2004). Therefore it must be said that, although a useful guide, the spatial configuration of visual elements must not be based simply on the psychological and neurobiological effects of kinetic perception; a specific global directional cue is needed to grab the attention of the observer and thereby induce effectively attended kinetic perceptual experiences.
illustration not visible in this excerpt
Figure 12. Current, Bridget Riley 1964
In order to attract one's attention to a directional location in the picture plane of a graphic image, there must be some engaging element that induces directional eye movement. Therefore, considering the close bond between kinetic perception and the perceptual effect of the figure-ground relationship (see 3.1.1.), creating a conspicuous location that engages the movement of the eye must be suggested.
From a neurological perspective, attention can be explained in terms of both pre-attentive and post-attentive processing of visual sensory stimuli, which correspond to bottom-up and top-down processing of sensory stimuli in human cognition respectively. Notably, the former concerns the assumption that some elementary visual features, such as orientation, colour, and movement, are processed involuntarily in bottom-up perception, resulting in automatic attention towards those features (Meur et al. 2004). The visual cues that grab one's attention in pre-attentive processing can indeed be said to be more or less in accordance with some principles of a psychological theory of human perception that is often exploited by visual artists (Arnheim 1974). That is to say, following the suggestions distilled from a computational study of attention modelling, (1) visibility, (2) emphasis, and (3) contour enhancement in an elongated configuration can be useful in creating an attentive area in a picture plane (Meur et al. 2004) (Figure 13). Note that these three visual cues are already redundantly mentioned in psychology-based studies of visual perception, and the demarcations between them are rather unclear (Arnheim 1974). Combining these three visual cues with the spatiotemporal characteristics of kinetic cues (see 3.1.1.), one will be able to create a global direction in a picture plane in which the kinetic perception of visual elements is expected.
Figure 13. Three visual cues for attention enhancement. Drawn by Matsumoto, D (2014).
Nevertheless, in contrast to the pre-attentive process, the attentive process influenced by top-down processing of sensory stimuli can also affect a perceiver's attention. Notably, it is said to have a stronger effect on the attraction of one's attention than the pre-attentive process does (Meur et al. 2004), although a recent study shows that the relative effectiveness of one process over the other changes depending on the content of the visual images (Massaro 2012). It can nonetheless be suggested that by utilising the emotional cues of visual arts in relation to those of music, which are effective conceptual elements in the top-down perceptual process of cognitive penetration, one can attempt to strengthen the kinetic cues while at the same time creating the content of musical cognitive experiences in a graphic image.
(2) Emotional Cues
In order to enhance a perceiver's attention towards a certain area of a picture plane, and also to create the cognitive content of the graphic image, emotional cues must be employed. It was suggested in the last section that cognitive correspondences in perceiving the emotional cues of visual arts and music could be found between frequency configuration and the quantitative organisation of light (see 3.1.2.). Spatial configurations, on the other hand, together with the hues of visual elements, did not show any significant consistency with regard to the basic elements of music. Drawing on these findings, the emotional cues of visual arts, whether or not they correspond to those of music from a neurobiological perspective, are listed separately in the following tables (Tables 4, 5). The first table shows visual emotional cues that correlate with musical emotional cues, whereas the second shows visual cues that might induce some basic emotions in a perceiver. Notably, since visual cues that would induce the emotion of love were not provided in the last section, the tables deal only with the other four types of basic emotions.
Table 4. Correspondences between visual emotional cues and musical emotional cues. Quantitative characteristics of light (colour) that correspond to basic formal elements of music are shown (see 3.1.2.). Anger (A), fear (F), sadness (S), and happiness (H) are listed. Drawn by Matsumoto, D (2014).
Table 5. Visual emotional cues that are expected to induce basic emotions. Spatial configurational elements as well as qualitative characteristics of light (colour) are shown. The visual cues are drawn by Matsumoto, D (2014) and are based on psychological studies of emotion perception of visual images (Arnheim 1974, Choi et al. 2007, Itten 1970, Lu et al. 2012).
In Table 4, by taking into account the material characteristics of light (colour) (see 1.1.1.), saturation is assigned more or less the same characteristics as value. That is to say, since both value and saturation of light (colour) concern the quantity of light on the reflecting surface of a picture plane, in contrast to hues, which are instead qualitative characteristics, the visual cues defined in terms of value are applied in the same manner to saturation. Moreover, since the basic elements of musical sound accounted for in a formal analysis can in fact be explained, from a material perspective, as the frequency configuration of a single sound (1.2.1.), the other emotional cues of music are assigned visual cues by considering their material characteristics. Table 5 lists the visual cues that are expected to induce some basic emotions in a perceiver. Although it was mentioned that certain spatial configurations show highly consistent correspondences with certain sounds (see 2.2.), the former are suggested here to correspond only to certain emotions, not directly to musical cues. Notably, since no specific hues were proposed as being consistently associated with specific emotions, hues of visual elements as emotional cues must be exploited by taking into consideration the socio-cultural conceptual attachment of emotions to specific hues. Nevertheless, general characteristics of hues are provided, using a formal colour scheme.
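Since the tables themselves are not visible in this excerpt, the kind of emotion-to-visual-cue mapping they summarise can be sketched as a simple lookup structure. The cue lists below merely paraphrase the analyses of Figure 14 given later in the chapter; the structure, names, and wording are hypothetical illustrations, not the thesis's actual tables.

```python
# Illustrative sketch of an emotion-to-visual-cue lookup, paraphrasing
# the Figure 14 analyses. The cue descriptions are not exhaustive and
# the dictionary layout is an invented convenience, not Table 4/5 itself.
VISUAL_CUES = {
    "anger":     {"direction": "vertical", "contour": "zigzag, edgy",
                  "luminance": "bright attended area, subtle contrast"},
    "fear":      {"direction": "upper right", "contour": "elongated, edgy",
                  "luminance": "high contrast at periphery"},
    "sadness":   {"direction": "vague, vertical", "contour": "elongated, varied",
                  "luminance": "achromatic, low gradual contrast"},
    "happiness": {"direction": "horizontal", "contour": "elongated, curved",
                  "luminance": "bright overall, soft contrast"},
}

def cues_for(emotion):
    """Return the suggested visual cues for a basic emotion, or None
    if no cues were proposed (e.g. love, per the text above)."""
    return VISUAL_CUES.get(emotion.lower())

print(cues_for("Anger")["direction"])  # -> vertical
```

Encoding the correspondences this way makes the text's own caveat concrete: a query for "love" simply returns nothing, since no visual cues for that emotion were provided.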
Finally, by consulting the two tables, one can create a visual configuration that will cognitively induce musical cognitive experiences in an observer. For instance, the example graphic images shown below (Figure 14) can be expected to induce musical cognitive experiences that correspond with the four basic emotions respectively. It must be noted that even though the four examples show music-related content, they were chosen arbitrarily, irrespective of their commercial contexts, in that semantic considerations are entirely ignored in this argument.
Figure 14 (a). Die Zauberflöte, a poster for the opera by W. A. Mozart, by Wieslaw Rosocha, 1994
Figure 14 (b). A poster for the movie Black Swan by LaBoca, 2011
Figure 14 (c). La Traviata, a poster for the 7th International Triennial of Stage Poster, Sofia, by Jisuke Matsuda, 2013
Figure 14 (d). A CD cover for Classical Delight, Lesley Spencer, 2004
Recalling the characteristics of visual emotional cues, Figure 14 (a) can be regarded as an example of a graphic image that may induce musical cognitive experiences of anger in an observer. A global configurational flow of visual elements can be recognised in a vertical direction, accompanied by zigzag, edgy contour lines. The texture appears non-glossy, which corresponds with the visual emotional cues of anger. Moreover, the luminance level and contrast of the attended area and its periphery seem to accord with the characteristics of angry musical emotional elements, such as an ascending melody and sharp duration contrast. Since the luminance level and contrast of the attended area are bright and subtle, the graphic image can be said to fix the observer's attention on the agitated melody line that is characteristic of angry music.
Figure 14 (b) is provided as an example of a graphic image that is expected to induce musical cognitive experiences of fear. The global directional configuration runs towards the upper right, surrounded by elongated, edgy shapes, while curved lines meet the observer's eye in the middle and at the end. The luminance contrast at the periphery of the attended area is set high, creating a figure-ground effect that can enhance kinetic perceptual experiences. These visual cues are in accordance with corresponding musical emotional cues such as high dynamics, large timing variability, and sharp duration contrast. Moreover, the contour lines of the typography show sharp edges, and their sharp luminance contrast with the background likewise relates to the emotional cues of fearful music.
Figure 14 (c) is an example of a graphic image that may present the musical cognitive experience of sadness to an observer. Although the global direction of the visual elements seems vague compared to the other examples, the elongated contour lines suggest a vertical flow. An inspection of each contour line nevertheless reveals that the graphic image consists of contour lines of various directions, and can thus be said to be kinetically dynamic. Achromatic hues of low luminance contrast accord with the characteristics of visual emotional cues in relation to those of music. That is to say, since the luminance contrast of the achromatic visual elements in the image is gradual and relatively subtle, it can be said to correspond to the low dynamics, low variability, slow tempo, or soft duration contrast of the emotional cues of sad music.
Lastly, Figure 14 (d) can be regarded as an example of a graphic image that is tied to the emotional cues of happy music. Most of the contour lines of the visual elements are presented as elongated curves, and the global direction of the visual elements appears horizontal; these cues correspond with the visual emotional cues of happiness. The relative luminance level of the image itself is bright, and the luminance contrast of the visual elements does not in general show sharp differences. The low value of the attended global location indicates that the focal area induces perceptual experiences of musical emotional cues such as an ascending melody and low dynamics. Considering these characteristics of the light (colour) of the visual elements, the graphic image can be said to correspond with the emotional cues of happy music.
Above all, the thesis concludes that, taking into consideration all the claims posited so far, one is able to approach, albeit in a general way, a cognitive visualisation of musical sounds in static graphic images. Specifically, whereas the quantitative configuration of light as an element of visual arts shows some neurobiological correspondences grounded in the correspondences between visual and auditory cognition, the other visual elements must generally be elucidated using traditional psychological knowledge. Importantly, as with any design principles, the suggestions the thesis has provided regarding the organisation of visual elements in a visualisation of musical sounds are not to be followed strictly by a visual artist, designers included, but are to be exploited as far as possible in an actual visual artwork. That is to say, the kinetic and emotional cues of visual arts that have been claimed to be in accordance with those of music are expected to be used as guidance in any attempt to cognitively visualise musical sounds in graphic images.
Overall, the previous discussions have elucidated, in a rather general manner, some significant connections between the cognition of visual arts and that of music that will be useful in visualising musical sounds from an objective perspective. By considering those connections, the thesis has proposed some visual cues for achieving a cognitive visualisation of musical sounds, or more specifically, for creating a graphic image aimed at inducing musical perceptual experiences in a perceiver. Kinetic elements of graphic images appear to play an important role in inducing musical perceptual experiences in beholders. Moreover, it is suggested that the cognitive correspondences between pitches, specifically their height and contour, and the value of light (colour) enable the matching of visual and musical emotional cues through the material aspects of both light and sound.
Nevertheless, it must be noted that the thesis does not claim to have specified any radical yet readily practical principles of visual arts that might replace today's dominant design theories, which are commonly based on psychological principles of visual perception. Rather, it has laid down an incipient theory of visual organisation based on a qualitative analysis of the cognitive connections between visual arts and music explored so far. The reason for this theoretical conclusion lies in the facts that, for one thing, the information available from neuroscientific studies of visual and auditory cognition is still limited, and, for another, not only neuroscientific but also phenomenological perspectives must be utilised in order to achieve a tangible unification of visual arts and music.
It was mentioned in the formal and material comparison of visual arts and music that the notion of representation must be utterly discarded from both art forms for the sake of material objectivity. Nevertheless, the comparison of the cognition of visual arts and music that followed suggested the neurobiological validity of recognising, to some extent, the formal elements of both visual arts and music in the human perceptual process. Moreover, some neurobiological findings were shown to accord to some extent with previous psychological studies of human perception. Therefore, in such a cognitive visualisation of musical sounds, a creator must nevertheless "represent" visual kinetic as well as emotional cues, which are in fact presentations of light from a material perspective, in whichever medium he or she works. In this sense, the objectivity that this thesis has sought can be defined as neither the realism nor the idealism that characterise the currently dominant views on human cognition.
Indeed, it does not seem cognitively adequate to negate the notion of representation, in that, as the last chapter showed, what is needed for a cognitive visualisation of musical sounds is the integration of neuroscientific perspectives and phenomenological knowledge. Hence, unless the recent upheaval in neuroscience finally overturns the empirical adequacy of the phenomenological perspective on human cognition in a more thorough manner, a deterministically objective and universal theory of the cognitive visualisation of musical sounds cannot be posited. In this sense, one must keep a vigilant eye on forthcoming studies not only of human visual and auditory cognition but also of the miscellaneous attempts at visualising musical sounds as such.
Alais, D. and Burr, D. (2004). The Ventriloquist Effect Results from Near-Optimal Bimodal Integration. Current Biology, Vol. 14, pp. 257-262. DOI 10.1016/j.cub.2004.01.029
Allman, J., and Zucker, S. (1990). Cytochrome Oxidase and Functional Coding in Primate Striate Cortex: a Hypothesis, Cold Spring Harbor Symposia on Quantitative Biology, 55. pp. 979-982.
Alves, B. (2005). Digital Harmony of Sound and Light. Computer Music Journal, 29:4, pp. 45-54.
Arnheim, R. (1974). Art and Visual Perception: A Psychology of the Creative Eye. Berkeley and Los Angeles, CA: University of California Press.
Bertolo, H. (2005). Visual Imagery without Visual Perception? Psicologica, 26, pp. 173-188.
Bermúdez, J. (2011). Nonconceptual Mental Content. In Stanford Encyclopedia of Philosophy. http://plato.stanford.edu/entries/content-nonconceptual/ (Retrieved May 8, 2014).
Blacking, J. (1973). How Musical Is Man? Seattle: University of Washington Press.
Brassier, R. (2007). Nihil Unbound: Enlightenment and Extinction. New York, NY: Palgrave Macmillan.
Brooks, A., et al. (2006). Auditory Motion Affects Visual Biological Motion Processing. Neuropsychologia. DOI:10.1016/j.neuropsychologia.2005.12.012
Bryant, L., et al. (Eds.). (2011). The Speculative Turn: Continental Materialism and Realism. Melbourne, Australia: re.press.
Bullier, J. (2003). Communications Between Cortical Areas of the Visual System. In Chalupa, L. M. and Werner, J. S. (Eds.). (2003). The Visual Neurosciences. Cambridge, MA: MIT Press.
Byrne, A. (2004). Perception and Conceptual Content. In Sosa, E. and Steup, M. (Eds.). (2005). Contemporary Debates in Epistemology. Oxford: Blackwell.
Campen, C. V. (2008). The Hidden Sense: Synesthesia in Art and Science. London: MIT Press.
Casagrande, V. A. and Xu, X. (2003). Parallel Visual Pathways: A Comparative Perspective. In Chalupa, L. M. and Werner, J. S. (Eds.). (2003). The Visual Neurosciences. Cambridge, MA: MIT Press.
Cattaneo, Z., et al. (2009). Contrasting Early Visual Cortical Activation States Causally Involved in Visual Imagery and Short-Term Memory. European Journal of Neuroscience, Vol. 30, pp. 1393-1400.
Chandler, D. (2002). Semiotics: The Basics. New York, NY: Routledge.
Chalmers, D. J., et al. (1991). High-Level Perception, Representation, and Analogy: A Critique of Artificial Intelligence Methodology. Center for Research on Concepts and Cognition, CRCC Technical Report 49.
Choi, Y. S., et al. (2007). A Study on the Expression of Emotions using Lights in Apparel Types. International Symposium on Ubiquitous VR 2007. ISBN: 978-1-59593-862-6.
Churchland, P. (1989). A Neurocomputational Perspective: The Nature of Mind and the Structure of Science: MIT Press.
Collier, W. G. and Hubbard, T. L. (2001). Musical Scales and Brightness Evaluations: Effects of Pitch, Direction, and Scale Mode, Am J Psychol, 114(3), pp. 355-75. doi:10.1177/102986490400800203.
Cornoldi C., et al. (1991). Individual Differences in the Capacity Limitations of Visuospatial Short-Term Memory: Research on Sighted and Totally Congenitally Blind People. Memory & Cognition, 19(5), pp. 459-68.
Cousineau, M., et al. (2014). What is Melody? On the Relationship Between Pitch and Brightness of Timbre. Frontiers in Systems Neuroscience, Vol. 7, Article 127. DOI: 10.3389/fnsys.2013.00127.
Cox, C. (2011). Beyond Representation and Signification: Toward a Sonic Materialism. Journal of Visual Culture, 10(2), pp. 145-61. DOI: 10.1177/1470412911402880.
Cristia, C. (2012). On the Interrelationship between Music and Visual Art in the Twentieth and Twenty-first Centuries: A Possible Typology Derived from Cases Originated in Argentinean Artistic Field. Transcultural Music Review, 16.
Crow, D. (2003). Visible Signs. Switzerland: AVA Publishing.
Cytowic, R. E. and Eagleman, D. M. (2009). Wednesday is Indigo Blue: Discovering the Brain of Synesthesia. Cambridge, MA: MIT Press.
Davidson, R. J. (2000). Affective Neuroscience and Psychophysiology: Toward a Synthesis. Psychophysiology, 40, pp. 655-665.
Davidson, R. J. and Sutton, S. K. (1995). Affective Neuroscience: The Emergence of a Discipline. Current Opinion in Neuropsychology, 5, pp. 217-224.
Day, S. (1996). Synaesthesia and Synaesthetic Metaphors. Psyche, 2(32).
Day, S. A. Synesthesia. (n.d.). http://www.daysyn.com/index.html. (Retrieved May 9, 2014).
Deleuze, G. (2003). Francis Bacon: The Logic of Sensation. Tran. Daniel W. Smith. London: Continuum.
Delgutte, B. et al. (2007). Neural Coding and Auditory Perception. RLE Progress Report 149, Chapter 14. http://www.rle.mit.edu/media/pr149/14.pdf (Retrieved May 8, 2014).
Dikovitskaya, M. (2005). Visual Culture. London: The MIT Press.
Dumoulin, S. O., et al. (2001). Centrifugal Bias for Second-Order but not First-Order Motion. J. Opt. Soc. Am. A/Vol. 18, No. 9. pp. 2179-2189. OCIS codes: 330.4150, 330.4270, 330.5510, 330.7310.
Egoyan, A. (2011). Elastic Membrane Based Model of Human Perception. National Center for Disease Control and Public Health, Tbilisi, Georgia.
Egoyan, A. (2011). Smooth Infinitesimal Analysis Based Model of Multidimensional Geometry. Ilia Chavchavadze State University, Tbilisi, Georgia.
FitzGerald, K and VanderLans, R. (2010). Volume: Writings on Graphic Design, Music, Art, and Culture. New York, NY: Princeton Architectural Press.
Gage, J. (1993). Colour and Culture: Practice and Meaning from Antiquity to Abstraction. London: Thames and Hudson.
Galembo, A. et al. (2004) Perceptual Significance of Inharmonicity and Spectral Envelope in the Piano Bass Range. Acta Acoustica, 90, pp. 528-36.
Gardner, E. P. and Martin, J. H. (2000). Coding of Sensory Information. In Kandel, E. R. et al. (Eds.). Principles of Neural Science 4th edition, New York, NY: McGraw Hill. pp. 425-429.
Gibson, J.J. (1972). A Theory of Direct Visual Perception. In J. Royce, W. Rozenboom (Ed.). The Psychology of Knowing. New York: Gordon & Breach.
Gregory, R. L. (Ed.). (1987). The Oxford Companion to the Mind. Oxford: Oxford University Press.
Griffiths, T. D. (2003). The Neural Processing of Complex Sounds. In Peretz, I. and Zatorre, R. J. (Eds.). (2003). The Cognitive Neuroscience of Music. Oxford, OX: Oxford University Press.
Guttman, S. E., et al. (2005). Hearing What the Eyes See: Auditory Encoding of Visual Temporal Sequences. Psychol Sci, 16(3), pp. 228-235. doi:10.1111/j.0956-7976.2005.00808.x.
Haazebroek, P. and Hommel, B. (2009). Towards a Computational Model of Perception and Action in Human Computer Interaction. Lecture Notes in Computer Science, Vol. 5620, pp. 247-256.
Halliwell, J. J. et al. (1996). Physical Origin of Time Asymmetry. Cambridge: Cambridge University Press.
Halpern, A. R. (2003). Cerebral Substrates of Musical Imagery. In Peretz, I. and Zatorre, R. J. (Eds.). (2003). The Cognitive Neuroscience of Music. Oxford, OX: Oxford University Press.
Hecht, E. (2014). Optics. 4th edition. Essex: Pearson Education Limited.
Hodgkinson, T. (1997). An Interview with Pierre Schaeffer: Pioneer of Musique Concrète. ReR Quarterly Magazine, 2(1).
Holmes, E. A. and Mathews, A. (2005). Mental Imagery and Emotion: A Special Relationship? Emotion, Vol. 5, No. 4, pp. 489-497. DOI: 10.1037/1528-3542.5.4.489.
Hubbard, E. M., et al. (2011). Cross-Activation Theory 10. Journal of Neuropsychology 5, pp. 152-177.
Huggins, E. R. (1999). Physics 2000. Etna, New Hampshire: Moose Mountain Digital Press.
Nave, C. R. (n.d.). HyperPhysics. http://hyperphysics.phy-astr.gsu.edu/hbase/hph.html (Retrieved 29 January, 2014)
Inglis, T. C. (2012). Op Art Rendering with Lines and Curves. Computers & Graphics, Vol. 36, Issue 6, pp. 607-21.
Itten, J. (1970). The Elements of Colour. Birren, F. (ed.). Hagen, V. (Trans.). Ravensburg, Germany: John Wiley and Sons, Inc.
Jarvelainen, H. et al. (2001). Audibility of the timbral effects of inharmonicity in stringed instrument tones. Acoustics Research Letters Online, 2(3), pp. 79-84.
Johnson, S. P. and Mason, U. (2002). Perception of Kinetic Illusory Contours by Two-Month-Old Infants. Child Development, Vol.73, No.1, pp. 22-34.
Johnson, M. H. (2005). Subcortical Face Processing. Nature, October 2005, Vol.6, pp. 776-774.
Juslin, P. N. and Laukka, P. (2003). Communication of Emotions in Vocal Expression and Music Performance: Different Channels, Same Code? Psychological Bulletin, Vol.129, No.5, pp. 770-814.
Juslin, P. N. and Västfjäll, D. (2008). Emotional Responses to Music: The Need to Consider Underlying Mechanisms. Behavioural and Brain Sciences, 31, pp. 559-621. DOI:10.1017/S0140525X08005293.
Justus, T. C. and Bharucha, J.J. (2002). Music Perception and Cognition. In Yantis, S. (Volume Ed.) and H. Pashler (Series Ed.). (2002). Stevens’ Handbook of Experimental Psychology, Vol.1: Sensation and Perception (Third Edition, pp. 453- 492). New York: Wiley.
Kandinsky, W. (1977). On the Spiritual in Art. New York, NY: Dover Publications
Kandinsky, W. (1979). Point and Line to Plane. New York, NY: Dover Publications
Kaplan, E. (2003). The M, P, and K Pathways of the Primate Visual System. In Chalupa, L. M. and Werner, J. S. (Eds.). (2003). The Visual Neurosciences. Cambridge, MA: MIT Press.
Kepes, G. (1995). Language of Vision. New York, NY: Dover Publishers.
Kerman, J. (1996). Listen. New York, NY: Worth Publishers.
Klein, I., et al. (2000). Transient Activity in the Human Calcarine Cortex During Visual-Mental Imagery: An Event-Related fMRI Study. Journal of Cognitive Neuroscience, 12, Supplement 2, pp. 15-23.
Kolb, B. et al., (2003). Brain Plasticity and Behavior. Current Directions in Psychological Science, Vol.12, No.1.
Knauff, M., et al. (2000). Cortical Activation Evoked by Visual Mental Imagery as Measured by fMRI. Cognitive Neuroscience and Neuropsychology, Vol.11, No.18, pp. 3957-3962.
Koelsch, S. (2005). Investigating Emotion with Music: Neuroscientific Approaches. Ann. N.Y. Acad. Sci, 1060, pp. 1-7. DOI: 10.1196/annals.1360.034.
Kourtzi, Z. and Kanwisher, N. (2000). Activation in Human MT/MST by Static Images with Implied Motion. Journal of Cognitive Neuroscience, 12:1, pp. 48-55.
Kreutz, G. and Lotze, M. (2007). Neuroscience of Music and Emotion. In Gruhn, W. and Rauscher, F. (Eds.). (2007). Neurosciences in Music Pedagogy. New York: Nova Science Pub Inc.
Krumhansl, C. L., and Toiviainen, P. (2003). Tonal Cognition. In Peretz, I. and Zatorre, R. J. (Eds.). (2003). The Cognitive Neuroscience of Music. Oxford, OX: Oxford University Press.
Lamb, W. E. Jr. (1995). Anti-photon. Applied Physics B, 60, pp. 77-84.
Lam, M. M. et al. (2013). The Interplanetary Magnetic Field Influences Mid-Latitude Surface Atmospheric Pressure. Environmental Research Letters, 8.
Levitin, D. (2007). This Is Your Brain on Music: Understanding a Human Obsession. London, Great Britain: Atlanta Books.
Lewis, J. W., et al. (2000). A Comparison of Visual and Auditory Motion Processing in Human Cerebral Cortex. Cerebral Cortex, 10, pp. 873-888. 1047-3211/00.
Lewkowicz, D. J. (1999). The Development of Temporal and Spatial Intermodal Perception. In Aschersleben, G., et al. (Eds.). Cognitive Contributions to the Perception of Spatial and Temporal Events. Amsterdam: Elsevier Science B.V.
Liegeois-Chauvel, C. et al. (2003). Intracerebral Evoked Potentials in Pitch Perception Reveal a Functional Asymmetry of Human Auditory Cortex. In Peretz, I. and Zatorre, R. J. (Eds.). (2003). The Cognitive Neuroscience of Music. Oxford, OX: Oxford University Press.
Liegeois-Chauvel, C., et al. (1999). Specialization of left auditory cortex for speech perception in man depends on temporal coding. Cereb. Cortex 9, 484-96.
Logie, R. H. (1995). Visuo-Spatial Working Memory. Hove, ES: Lawrence Erlbaum Associates Publishers.
Lotz, C. (2009). Representation or Sensation? A Critique of Deleuze's Philosophy of Painting. Symposium: Canadian Journal for Continental Philosophy, 13(1), pp. 59-73.
Loudon, R. (2003). What is a Photon? In Optics Society of America. (2003). OPN Trends, supplement to Optics & Photonics News, Vol. 14, No. 10.
Lu, X. et al. (2012). On Shape and the Computability of Emotions. Proceedings of the ACM Multimedia Conference, 10 pages, Nara, Japan: ACM, October 2012.
Lu, Z. L. and Sperling, G. (2001). Three-Systems Theory of Human Visual Motion Perception: Review and Update. J. Opt. Soc. Am. A/Vol. 18, No. 9, pp. 2331-2370.
Lu, Z. L., et al. (2000). Attention Mechanisms for Multi-location First- and Second-Order Motion Perception. Vision Research, 40, pp. 173-186. PII: S0042-6989(99)00172-8.
Macpherson, F. (2012). Cognitive Penetration and Nonconceptual Content. Draft for Zeimbekis, J. and Raftopoulos, A. (Eds.). Cognitive Effects on Perception: New Philosophical Perspective.
Martineau, J. (2008). The Elements of Music. In Wooden Books Ltd. Quadrivium: The Four Classical Liberal Arts of Number, Geometry, Music & Cosmology. New York, NY: Walker Publishing Company Inc.
Massaro, D., et al. (2012). When Art Moves the Eyes: A Behavioral and Eye-Tracking Study. PLoS ONE 7(5): e37285. doi:10.1371/journal.pone.0037285.
Meillassoux, Q. (2008). After Finitude: An Essay on the Necessity of Contingency. Brassier, R. (Trans.). New York, NY: Continuum International Publishing Group.
Mesulam, M. M. (1998). From Sensation to Cognition. Brain, 121. pp. 1013-1052.
Meur, O. L., et al. (2004). From Low Level Perception to High Level Perception, a Coherent Approach for Visual Attention Modeling. http://people.irisa.fr/Olivier.Le_Meur/publi/LeMeur_HVEI04.pdf (Retrieved May 9, 2014).
Moody, N., et al. (2006). Motion as the Connection between Audio and Visuals. International Computer Music Conference Proceedings. http://quod.lib.umich.edu/cgi/p/pod/dod-idx/motion-as-the-connection-between-audio-and-visuals.pdf?c=icmc;idno=bbp2372.2006.085. (Retrieved May 9, 2014).
Morris, C. G. and Maisto, A. A. (2004). Psychology: An Introduction. New Jersey: Prentice Hall.
Nevid J. S. (2009). Psychology: Concepts and Applications. Boston, MA: Houghton Mifflin Company.
Newsome, W. T. (1988). A Selective Impairment of Motion Perception Following Lesion of the Middle Temporal Visual Area (MT). J Neurosci 8, pp. 2201-2211.
Nielsen, A. and Rendall, D. (2011). Sound of Round: Evaluating the Sound-Symbolic Role of Consonants in the Classic Takete-Maluma Phenomenon. Canadian Journal of Experimental Psychology, Vol. 65, No. 2, pp. 115-124.
Nudds, M. (2007). Auditory Perception and Sounds (Draft). Edinburgh Research Archive. http://hdl.handle.net/1842/1775. (Retrieved 25th April, 2014).
Panksepp, J. and Bernatzky, G. (2002). Emotional Sounds and the Brain: the Neuro-Affective Foundations of Musical Appreciation. Behavioural Processes, 60, pp. 133-155.
Parsons, L. M. (2003). Exploring the Functional Neuroanatomy of Music Performance, Perception, and Comprehension. In Peretz, I. and Zatorre, R. J. (Eds.). (2003). The Cognitive Neuroscience of Music. Oxford, OX: Oxford University Press.
Pallasmaa, J. (2005). The Eyes of the Skin: Architecture and the Senses. 2nd edition. West Sussex: John Wiley and Sons Ltd.
Peretz, I. and Kolinsky, R. (1993). Boundaries of Separability between Melody and Rhythm in Music Discrimination: a Neuropsychological Perspective. Quarterly J. Exp. Psychol. 46A, 301-25.
Poirier, C., et al. (2005). Specific Activation of the V5 Brain Area by Auditory Motion Processing: An fMRI Study. Cognitive Brain Research 25. pp. 650 - 658.
Raftopoulos, A. (2009). Cognition and Perception: How Do Psychology and Neural Science Inform Philosophy? Cambridge, MA: Massachusetts Institute of Technology.
Rauschecker, J. P., and Tian, B. (2000). Mechanisms and Streams for Processing of “What” and “Where” in Auditory Cortex. Proceedings of the National Academy of Science, vol. 97. No. 22, pp. 11800-11806.
Rhode, W. S. and Recio, A. (2001). Basilar-Membrane Response to Multicomponent Stimuli in Chinchilla. Acoustical Society of America, 110 (2), pp. 981-994.
Richards, R. J. (1976). James Gibson’s Passive Theory of Perception: A Rejection of the Doctrine of Specific Nerve Energies. Philosophy and Phenomenological Research, Vol. 37, No. 2, pp. 218-233.
Rockwell, T (2001). Experience and Sensation: Sellars and Dewey on the Non- Cognitive Aspects of Mental Life. Education and Culture. Vol. 17, Issue 1.
Rosen, A., and Rosen, D. (2006). The Design of a Sensation-Generating Mechanism in the Brain: A First Step Toward a Quantitative Definition of Consciousness. Consciousness and Cognition International Journal, ASSC e-print archive. http://www.theassc.org/files/assc/ea_sgm_rosen_subm.pdf (Retrieved May 9, 2014).
Rosenthal, O., et al. (2009). Sound-Induced Flash Illusion is Resistant to Feedback Training. Brain Topography, 21, pp. 185-192.
Saenz, M. and Koch, C. (2008). The Sound of Change: Visually-Induced Auditory Synesthesia. Current Biology, Vol. 18, Issue 15, pp. R650-R651.
Santrock, J. (2011). Essentials of Life-Span Development. New York, NY: McGraw Hill.
Schaeffer, P. (2004). Acousmatics. In Cox, C. and Warner, D. (Ed.), Audio Culture: Readings in Modern Music. New York, NY: Continuum International Publishing Group Inc.
Schmidt-Jones, C. (2008). The Basic Elements of Music. Scribd. URL: http://ja.scribd.com/doc/11238578/The-Basic-Elements-of-Music (Retrieved January 29, 2014)
Scruton, R., ed. Kenny, A. (1997). Oxford Illustrated History of Western Philosophy. Oxford: Oxford University Press.
Shackman, J. E. and Pollak, S. D. (2005). Experiential Influences on Multimodal Perception of Emotion. Child Development, Vol. 76, No.5, pp. 1116 - 1126.
Skoe, E. and Kraus, N. (2010). Auditory Brain Stem Response to Complex Sounds: A Tutorial. Ear and Hearing 2010, 31, 1.
Sontag, S. (1961). Against Interpretation and Other Essays. New York, NY: Farrar, Straus and Giroux.
Sultzbaugh, J. S. (2009). Chromoacoustics: The Science of Sound and Color. The Rose+Croix Journal. Vol. 6.
Swets, J. A. (1961). Is There a Sensory Threshold? Science, New Series. Vol. 134, Issue. 3473, pp. 168-177.
Szmajda, B. A., et al. (2008). Retinal Ganglion Cell Inputs to the Koniocellular Pathway. The Journal of Comparative Neurology, 510, pp. 251-268.
Takeshima, Y. and Gyoba, J. (2013). Complexity of Visual Stimuli Affects Visual Illusion Induced by Sound. Vision Research, 91, pp. 1-7.
Thoret, E., et al. (2014). From Sound to Shape: Auditory Perception of Drawing Movements. Journal of Experimental Psychology: Human Perception and Performance. Advance online publication. DOI:10.1037/a0035441.
Tillmann, B. et al. (2003). Learning and Perceiving Musical Structures: Further Insights from Artificial Neural Networks. In Peretz, I. and Zatorre, R. J. (Eds.). (2003). The Cognitive Neuroscience of Music. Oxford, OX: Oxford University Press.
Tombran-Tink, J. and Barnstable, C. J. (Eds.). (2008). Visual Transduction and Non-Visual Light Perception. Totowa, NJ: Humana Press.
Tramo, M. J., et al. (2003). Neurobiology of Harmony Perception. In Peretz, I. and Zatorre, R. J. (Eds.), The Cognitive Neuroscience of Music. Oxford: Oxford University Press.
Tucker, T. R. and Fitzpatrick, D. (2003). Contributions of Vertical and Horizontal Circuits to the Response Properties of Neurons in Primary Visual Cortex. In Chalupa, L. M. and Werner, J. S. (Eds.), The Visual Neurosciences. Cambridge, MA: MIT Press.
Turvey, M. T. (1975). Perspectives in Vision: Conception or Perception? Status Report on Speech Research, SR-42/43.
Ungerleider, L. G. and Pasternak, T. (2003). Ventral and Dorsal Cortical Processing Streams. In Chalupa, L. M. and Werner, J. S. (Eds.), The Visual Neurosciences. Cambridge, MA: MIT Press.
University of California - Berkeley. (2013). Bach to the Blues, Our Emotions Match Music to Colors. ScienceDaily. URL: http://www.sciencedaily.com/releases/2013/05/130516151256.htm (Retrieved May 9, 2014)
Vergo, P. (2005). That Divine Order: Music and the Visual Arts from Antiquity to the Eighteenth Century. New York, NY: Phaidon Press Limited.
Vergo, P. (2010). The Music of Painting: Music, Modernism and the Visual Arts from the Romantics to John Cage. New York, NY: Phaidon Press Limited.
Visscher, K. M., et al. (2007). Auditory Short-Term Memory Behaves Like Visual Short-Term Memory. PLoS Biology, 5(3): e56. DOI:10.1371/journal.pbio.0050056.
Wallschlaeger, C., and Busic-Snyder, C. (1992). Basic Visual Concepts and Principles for Artists, Architects and Designers. New York, NY: McGraw Hill.
Wheeler, M.E., et al. (2000). Memory’s Echo: Vivid Remembering Reactivates Sensory-Specific Cortex. Proc. Natl Acad. Sci. USA, 97, pp. 11125-11129.
Zajonc, A. (2003). Light Reconsidered. In Optics Society of America. (2003). OPN Trends, supplement to Optics & Photonics News, Vol. 14, No. 10.
Zanker, J. M. (2004). Looking at Op Art from a Computational Viewpoint. Spatial Vision, Vol. 17, No. 1-2, pp. 75-94.
Zatorre, R. J. (2003). Neural Specializations for Tonal Processing. In Peretz, I. and Zatorre, R. J. (Eds.), The Cognitive Neuroscience of Music. Oxford: Oxford University Press.
Zeki, S. (1993). The Visual Association Cortex. Current Opinion in Neurobiology. 3, pp. 155-159.
Zeki, S. and Shipp, S. (1988). The Functional Logic of Cortical Connections. Nature, 335, pp. 311-317.
Zeki, S. and Lamb, M. (1994). Neurology of Kinetic Art. Brain, 117, pp. 607-636.
Zelanski, P. and Fisher, M. (2005). The Art of Seeing. 6th edition. New Jersey: Prentice Hall.