High-level neural similarity predicts perceptual competition during encoding of different object categories

Cohen, Konkle, Rhee, Nakayama & Alvarez

Faces, scenes, objects, and bodies evoke distinct but overlapping neural activation patterns in the ventral stream when presented in isolation. If these stimulus categories are presented simultaneously in the visual field, how do they compete for perceptual resources? Is the degree of competition between stimulus categories predicted by the similarity of their individual neural responses?

Participants' task was to detect changes between two successively presented displays, each containing four items. The items either all came from the same category (e.g., four faces) or from a mixture of two categories (e.g., two faces and two scenes). For each category pairing, we compared change detection performance on same-category and mixed-category trials. Overall, performance was significantly better on mixed-category than on same-category trials, with the size of the effect (Cohen's d) depending on the pair of categories tested (F = faces, S = scenes, B = bodies, O = objects): F/S = 1.24; B/S = 1.18; B/F = 0.97; B/O = 0.93; F/O = 0.40; O/S = -0.13.
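
For concreteness, an effect size of this kind can be computed from per-participant accuracies. A minimal sketch, assuming paired within-subject accuracy scores and the pooled-SD definition of Cohen's d (the abstract does not specify the exact estimator, and all data below are made up):

    import numpy as np

    def cohens_d(mixed_acc, same_acc):
        """Effect size for mixed- vs. same-category change detection.

        mixed_acc, same_acc: per-participant accuracy arrays.
        Uses the pooled-SD definition of Cohen's d; this is an
        illustrative assumption, not necessarily the authors' estimator.
        """
        mixed_acc = np.asarray(mixed_acc, dtype=float)
        same_acc = np.asarray(same_acc, dtype=float)
        diff = mixed_acc.mean() - same_acc.mean()
        pooled_sd = np.sqrt((mixed_acc.var(ddof=1) + same_acc.var(ddof=1)) / 2)
        return diff / pooled_sd

    # Hypothetical accuracies for a faces/scenes pairing: a positive d
    # means mixed-category displays were easier, i.e., more competition
    # arises within a single category.
    mixed = [0.82, 0.79, 0.85, 0.77]
    same = [0.70, 0.66, 0.73, 0.64]
    print(cohens_d(mixed, same))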

Next, we used a blocked fMRI design to measure the activation pattern evoked by each stimulus category when images were presented in isolation. Neural similarity was calculated as the average Euclidean distance between beta-weight patterns across voxels within independently localized regions of interest: the ventral stream excluding V1/V2 and category-selective voxels, category-selective voxels, areas V1/V2, and dorsolateral prefrontal cortex (DLPFC). The degree of perceptual competition found in the behavioral experiment correlated significantly (p < .05) with neural similarity in the ventral stream (R² = .50) and within category-selective regions (e.g., FFA and PPA; R² = .47), but not in low-level regions (V1/V2; R² = .03) or in a non-visual region (DLPFC; R² = .005).
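
The similarity-to-behavior analysis can be sketched as follows. This is illustrative only: the beta patterns are random placeholders, a single distance per pair stands in for the average distance (the abstract does not say what the average is taken over), and only the behavioral effect sizes come from the abstract:

    import numpy as np
    from scipy import stats

    def neural_distance(betas_a, betas_b):
        """Euclidean distance between two categories' beta-weight
        patterns over the voxels of one ROI (one beta per voxel)."""
        return np.linalg.norm(np.asarray(betas_a) - np.asarray(betas_b))

    # Placeholder beta patterns for one ROI (500 hypothetical voxels).
    rng = np.random.default_rng(0)
    categories = ["faces", "scenes", "bodies", "objects"]
    betas = {c: rng.normal(size=500) for c in categories}

    # Category pairings and their behavioral effect sizes (Cohen's d),
    # taken from the behavioral experiment above.
    pairs = [("faces", "scenes"), ("bodies", "scenes"), ("bodies", "faces"),
             ("bodies", "objects"), ("faces", "objects"), ("objects", "scenes")]
    behavioral_d = [1.24, 1.18, 0.97, 0.93, 0.40, -0.13]

    distances = [neural_distance(betas[a], betas[b]) for a, b in pairs]
    r, p = stats.pearsonr(distances, behavioral_d)
    # With real ventral-stream data the abstract reports R² = .50;
    # random placeholders will not reproduce that value.
    print(f"R^2 = {r**2:.2f}, p = {p:.3f}")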

These results show that perceptual competition depends on the categories being encoded and is predicted by the neural similarity of those categories in high-level cortex. They suggest that the ability to represent multiple objects concurrently is limited by the extent to which representing those objects depends on separate underlying neural resources.




Cohen, M., Konkle, T., Rhee, J., Nakayama, K., & Alvarez, G. A. (2012). High-level neural similarity predicts perceptual competition during encoding of different object categories. Talk presented at the annual meeting of the Vision Sciences Society, May 11-16, Naples, FL.