
A ‘script’, as PDF, of my presentation at the Listening symposium, PARCNorthWest, International Anthony Burgess Foundation, UK, on 17 November 2011.


‘The Eye Hears, The Ear Sees’
Listening symposium, 17 November 2011
The Augmented Tonoscope by Lewis Sykes
http://www.augmentedtonoscope.net

The title I came up with for this presentation on listening - ‘The Eye Hears, the Ear Sees’ - seemed to fit what I wanted to talk about. But it turns out, unbeknownst to me, also to be the title of a 1970 documentary on the career of the hand-drawn animation filmmaker Norman McLaren - an artist I much admire. Apart from this introduction I make no further mention of this happy coincidence with McLaren’s practice of creating sound by drawing directly into the optical soundtrack of film.

Through my research, The Augmented Tonoscope, I’m exploring fundamental aspects of the relationship between sound and image in audiovisual work - particularly within the context of the artistic practice described as Visual Music. I’m building a hybrid analog and digital instrument that will visualise sound and vibration through its analog in visual form - the geometric modal wave patterns of Cymatics. Since this will probably mean very little to most of you, I’ll show a video of my latest studio experiments...

[show practice video - http://vimeo.com/38350669]

By linking sound and image in such a direct and elementary way I’m hoping to find an audiovisual amalgam that engages the viewer in a subtly shifted way - a synchronisation between the senses of sight and hearing that results in a ‘co-sensing’ of a ‘co-expressiveness’ - where the mind is not doing two separate things, it’s doing the same thing in two ways.

This has led me to look at sensory modality and cognitive neuroscience - how we attain an awareness of the world around us by interpreting sensory information.

Recent studies indicate that our senses aren’t distinct - they interact with and influence one another all the time. So in my presentation I’m planning to conduct a couple of simple yet surprisingly robust perceptual experiments - robust in that the results persist even when you’re aware of the method - which demonstrate how listening is actually an interaction between both hearing and seeing, and that under different circumstances one sense can dominate the other. I caveat this by saying that although you are all guinea pigs in my experiment, these are not ideal laboratory conditions and I can’t guarantee results.

But I’ll start with this illustration by Robert Fludd, from his book Secundus De Supernaturali (1619).

Robert Fludd was a prominent English astrologer, mathematician, cosmologist and Qabalist. In this influential diagram of the process of perception - much referenced in the study of perception, consciousness and psychology - the faculty of the imagination is seen mediating between the senses and the reason. Fludd wanted to explain the nature of the perceptible world as being classified into the four realms of the sensual, imaginable, intellectual and sensible - and the diagram shows the interplay and connection between the different psychological faculties.

Utriusque cosmi maioris scilicet et minoris […] historia, tomus II (1619), tractatus I, sectio I, liber X, De triplici animae in corpore visione.


I’m using this illustration to remind us that it really wasn’t so long ago that the likes of Robert Fludd - living in the late C16th and early C17th, prior even to the accepted start of the Age of Enlightenment in the latter part of the C17th - saw themselves as both philosophers and scientists, and drew little distinction between these disciplines in trying to explain the world around them.

Like Fludd, I’m interested in an almost alchemic approach to research - not through the specialism of a single discipline but through broad, diverse and, by contemporary standards at least, potentially conflicting lines of enquiry.

So in this spirit, because I thought no-one else was likely to take this standpoint and just for the sake of being contrary, I plan to approach ‘Listening’ from a more scientific perspective - by illustrating how contemporary cognitive neuroscience is trying to reveal how information from multiple senses is integrated within the brain.

I’m trying to make a modest, but I think significant, point about perception - how our brains interpret our senses - and how interactions between our senses affect what we perceive. Before we even begin to cognitively process the perceived world around us, the complex internal mechanisms of our brains have already adjusted our perception of it. It seems that we sense the world through a set of evolutionarily developed filters and processes that pre-empt our cognitive faculties and have nothing to do with memory, knowledge, experience or culture.

When we perceive images and sounds occurring coincidentally, either in space or in time, they are often bound together to form a multi-modal percept - a multi-sensory object of perception. Evolution has designed this innate multi-sensory integration to take advantage of situations where different senses can provide information on the same object, a particular example being the way we perceive speech - we hear the spoken language but this is facilitated by viewing the accompanying movement of the speaker’s lips and facial expressions.

If you’re anything like me and years of overly loud music have damaged your hearing, then you should be familiar with having to concentrate on someone’s lips when the background noise of a busy restaurant, pub or bar makes it difficult to hear quite what they’re saying. This is multi-sensory integration at work - the detection of a weak auditory or visual stimulus improves drastically when it occurs in unison with another, more ‘reliable’, sensory modality. But the issue for neuroscientists in circumstances like these, when vision and sound are in agreement, i.e. when the stimulation is congruent, is that while integration between the senses undoubtedly occurs, it’s difficult to ascertain the precise mechanism of that integration.

In contrast, when the information presented to multiple modalities is incongruent, i.e. the senses don’t agree with each other, we often experience distinct perceptual illusions. In fact it’s these circumstances that enable experiments which can relate our perception to one sense modality more than another, which in turn affords a stronger test for hypotheses on the mechanism of sensory integration. So what I’m going to try with you are a couple of perceptual experiments - multi-sensory illusions that far from being mere curiosities, provide an excellent opportunity to test these various hypotheses.

First, the McGurk (& MacDonald) Effect (1978), where viewing an incongruently articulating face categorically alters the auditory speech percept. In non-neuroscience jargon: I’ll show you a short film of me enunciating different phonemes, but in some cases I’ve edited the film so that you are hearing and seeing two different phonemes coincidentally. You’ll experience the visual sense dominating and changing what you hear. The robustness of this effect, i.e. the results persist even when you’re aware of the method, means that I can tell you this before you see it and it won’t make any difference. This effect is, in neuroscientific terms, significantly different from certain optical illusions, which break down once you ‘see through’ them.

[show McGurk (& MacDonald) Effect video - http://vimeo.com/39137357 - and ask for reactions]

The second experiment is a more recent one by Shams et al (2000). I’m going to show you a series of nine short audiovisualisations in which you’ll see from one to three quick successive visual flashes on the screen and simultaneously hear from one to three audio beeps. You have to note ONLY how many flashes you see in each instance of the series. But unlike the McGurk (& MacDonald) Effect I’m not going to be quite as cavalier in describing the point of the experiment before you experience it.


[show Double Flash Effect video - http://vimeo.com/39138252 - and ask for reactions. Only really interested in the last two audiovisual events, where a single flash with a double beep and then a double flash with a triple beep occur - viewers should experience a ‘fission’ of the visual flash]

Shams et al. reported on this new multi-sensory illusion in a paper, Visual illusion induced by sound, in 2002. A single flash is perceived as two when two rapid tone beeps are presented concurrently. This perceptual fission of a single flash due to multiple beeps is not matched by a perceptual fusion of multiple flashes due to a single beep. Shams et al. concluded that the second beep caused subjects to perceive an illusory flash. What’s interesting about this experiment is that audition dominates vision, which is rare - more often, as in the McGurk effect, our primary sense, vision, dominates audition.
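As an aside, the structure of a double-flash trial can be sketched in a few lines of code for anyone curious to recreate the stimulus. This is purely illustrative - the function name and the timing value are my own assumptions, not the values or code used by Shams et al.:

```python
# Illustrative sketch (not the authors' code): build the event schedule for one
# trial of a Shams-style double-flash stimulus. The 60 ms stimulus onset
# asynchrony is an assumed value chosen for illustration only.

def trial_schedule(n_flashes, n_beeps, soa_ms=60):
    """Return a sorted list of (onset_ms, kind) events for one trial.

    Flashes and beeps each repeat at the same stimulus onset asynchrony
    (soa_ms), starting together at t=0, so a 1-flash/2-beep trial pairs a
    single flash with two rapid beeps - the condition that induces the
    illusory second flash ('fission').
    """
    events = [(i * soa_ms, "flash") for i in range(n_flashes)]
    events += [(i * soa_ms, "beep") for i in range(n_beeps)]
    return sorted(events)

# The two trial types the talk singles out:
fission = trial_schedule(1, 2)  # single flash + double beep
# -> [(0, 'beep'), (0, 'flash'), (60, 'beep')]
second = trial_schedule(2, 3)   # double flash + triple beep
```

The point the schedule makes explicit is that the extra beep arrives with no accompanying flash - and yet most viewers report seeing a second flash at that moment.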

So what’s going on in both these experiments? Why is one sense dominating over another? There are four hypotheses of multi-sensory integration I’ve come across, each advocating a condition for the dominance of one sense over another:

1. According to the discontinuity hypothesis, the sense in which stimulation is discontinuous, i.e. the most broken up, dominates;
2. The modality appropriateness hypothesis states that the sense more appropriate for the task at hand dominates;
3. The information reliability hypothesis claims that the sense providing the more reliable information dominates;
4. And the directed attention hypothesis states that the sense in which there is most competition between multiple stimuli, and so which requires the most attention, is dominant.

In the conclusion to a subsequent paper, Tobias S. Andersen et al’s (2004) Factors influencing audiovisual fission and fusion illusions - an experiment using stimuli very similar to those used by Shams et al. - the researchers suggest that all of the multi-sensory integration hypotheses should be considered as factors which contribute to the relative dominance of each sense and shouldn’t be considered all-or-nothing conditions.

So what’s the point of all this? Are there useful principles for artists in the audiovisual neuroscience and psychology literature? Nick Collins (personal communication by email, 2 November 2011) wrote to me of such sources in an exchange following our presentations at the Seeing Sound 2 symposium, “...there are some interesting things, just don’t be deceived that because they mention innumerable parts of the brain and drop lots of psychology refs, that the whole thing is solved; They’re sometimes wrangling over quite small experimental details. Artists may seek much more explicit large scale behavioural rules than scientists can with confidence supply.”

I’d obviously like “much more explicit large scale behavioural rules” that could inform my practice and research… but what I’m finding instead are markers and hints towards creative possibilities. The question is whether I can somehow leverage these ideas within my experimental audiovisualisations to create small yet thoughtful variations at the beginning of the creative process that might effect larger, more significant changes by the end. This approach suits me since I want to try and build serendipity - “the art of making an unsought finding” - into my research, to uncover the latent potentialities within my practice by opening myself to the possibility of the ‘unfound’.

Hopefully it will help me to answer intriguing questions such as:

• Is it possible to produce creative work crafted in such a way that the mind is not doing two separate things - it’s doing the same thing in two ways?
• Can a careful synchronisation between the senses result, not in a subordination of one over another, but in a ‘co-sensing’ of a ‘co-expressiveness’?


References

Andersen T.S. et al (2004), Factors influencing audiovisual fission and fusion illusions, Cognitive Brain Research, Vol. 21, Issue 3, pp. 301–308

Collins N. & d’Escriván J. (Eds.) (2007), The Cambridge Companion to Electronic Music, Cambridge University Press

Hankins T.L. & Silverman R.J. (1995), Instruments And The Imagination, New Jersey: Princeton University Press

Kastner S. et al (1998), Mechanisms of Directed Attention in the Human Extrastriate Cortex as Revealed by Functional MRI, Science, Vol. 282 No. 5386 pp. 108-111, 2 October - http://www.sciencemag.org/content/282/5386/108.abstract [accessed 10 November 2011]

Macdonald J. & McGurk H. (1978), Visual influences on speech perception processes, Perception & Psychophysics, Vol. 24 (3), pp. 253-257

McNeill, D. (1992), Hand and Mind: What Gestures Reveal about Thought, Chicago, The University of Chicago Press

Shams L., Kamitani Y. & Shimojo S. (2000), Illusions: What you see is what you hear, Nature, Vol. 408, 14 December, p. 788

Shams L., Kamitani Y. & Shimojo S. (2002), Visual illusion induced by sound, Cognitive Brain Research, Vol. 14, Issue 1, pp. 147–152

and useful, albeit non-academic references:

Is Seeing Believing?, BBC, Horizon (2010-11), http://www.bbc.co.uk/programmes/b00vhw1d [accessed via YouTube - http://www.youtube.com/watch?v=G-lN8vWm3m0&feature=related - 10 November 2011]

The Eye Hears, the Ear Sees (1970), IMDb, http://www.imdb.com/title/tt0200625/ [accessed 10 November 2011]

McGurk effect, Wikipedia, http://en.wikipedia.org/wiki/McGurk_effect [accessed 10 November 2011]

Perception, Wikipedia, http://en.wikipedia.org/wiki/Perception [accessed 10 November 2011]

Categorical perception, Wikipedia, http://en.wikipedia.org/wiki/Categorical_perception [accessed 10 November 2011]

Neuroplasticity, Wikipedia, http://en.wikipedia.org/wiki/Neuroplasticity [accessed 10 November 2011]

Stimulus onset asynchrony, Wikipedia, http://en.wikipedia.org/wiki/Stimulus_onset_asynchrony [accessed 10 November 2011]

Robert Fludd, Wikipedia, http://en.wikipedia.org/wiki/Robert_Fludd [accessed 10 November 2011]

Age of Enlightenment, Wikipedia, http://en.wikipedia.org/wiki/Age_of_Enlightenment [accessed 10 November 2011]

percept - definition according to the British Dictionary, OS X 10.7.2:

percept |ˈpəːsɛpt| noun (Philosophy) an object of perception; something that is perceived. • a mental concept that is developed as a consequence of the process of perception. ORIGIN mid 19th cent.: from Latin perceptum ‘something perceived’, neuter past participle of percipere ‘seize, understand’, on the pattern of concept.