What can crossmodal aftereffects reveal about neural representation and dynamics?
Konkle & Moore
The brain continuously adapts to incoming sensory stimuli, which can lead to perceptual illusions in the form of aftereffects. Recently, we demonstrated that motion aftereffects transfer between vision and touch. Here, the adapted brain state induced by one modality has consequences for processing in another modality, implying that somewhere in the processing stream, visual and tactile motion have shared underlying neural representations. We propose the adaptive processing hypothesis: any area that processes a stimulus adapts to the features of the stimulus it represents, and this adaptation has consequences for perception. On this view, there is no single locus of an aftereffect. Rather, aftereffects emerge when the test stimulus used to probe the effect of adaptation requires processing of a given type. The illusion will reflect the properties of the brain area(s) that support that specific level of representation. We further suggest that many cortical areas are more process-dependent than modality-dependent, with crossmodal interactions reflecting shared processing demands in even 'early' sensory cortices.
Konkle, T. & Moore, C. I. (2009). What can crossmodal aftereffects reveal about neural representation and dynamics? Communicative & Integrative Biology, 2(6), 1-3.