The brain must identify objects from different viewpoints that change the retinal image. This study examined the conditions under which the brain expends computational resources to construct view-invariant, extraretinal representations in a 3D virtual environment. We focused on extraretinal representation of visual symmetry. Visual symmetry activates the extrastriate visual cortex and generates an event-related potential (ERP) called the sustained posterior negativity (SPN): amplitude at posterior electrodes is more negative for symmetrical than for asymmetrical stimuli. Given a symmetric pattern on a plane, regularity in the retinal image is degraded by perspective. Previous studies have found that the SPN is selectively reduced for perspective symmetry – we term this perspective cost. This cost may shrink when sufficient visual cues are available to support view invariance. To explore this, we used a VR-based 3D environment. Forty-eight participants completed two tasks: discriminating stimulus regularity (symmetry or asymmetry) and discriminating stimulus luminance (light or dark). We computed perspective cost as the difference between frontoparallel and perspective SPN. In the Regularity task, perspective cost was significantly less than .35 μV – our a priori definition of a small SPN modulation – indicating no perspective cost. The results from the Luminance task were less clear: SPN cost was not significantly more than 0 μV, but also not significantly less than .35 μV. We conclude that the extrastriate cortex can construct extraretinal representations of symmetry when sufficient visual depth cues are available. This certainly happens during regularity discrimination and may happen automatically during luminance discrimination.
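The perspective-cost analysis described above can be sketched as follows. This is a minimal illustration, not the authors' analysis code: the sign convention (a positive cost means the perspective SPN is weaker, i.e. less negative, than the frontoparallel SPN), the amplitudes, and the use of a plain one-sample t statistic against the .35 μV criterion are all assumptions made here for illustration.

```python
import math
import statistics

def perspective_cost(frontoparallel, perspective):
    """Per-participant perspective cost in microvolts.

    Assumed sign convention: perspective minus frontoparallel SPN
    amplitude, so a positive cost means a reduced (less negative)
    SPN in the perspective condition."""
    return [p - f for f, p in zip(frontoparallel, perspective)]

def one_sample_t(costs, mu):
    """t statistic for testing the mean cost against a criterion mu."""
    n = len(costs)
    m = statistics.mean(costs)
    sd = statistics.stdev(costs)
    return (m - mu) / (sd / math.sqrt(n))

# Hypothetical SPN amplitudes (microvolts), illustration only.
fronto = [-2.1, -1.8, -2.4, -2.0, -1.9, -2.2]
persp  = [-1.9, -1.7, -2.2, -1.85, -1.8, -2.0]

costs = perspective_cost(fronto, persp)
mean_cost = statistics.mean(costs)
# A negative t here means the mean cost falls reliably below the
# a priori .35 uV criterion for a small SPN modulation.
t_vs_criterion = one_sample_t(costs, 0.35)
```

In the study's terms, a cost significantly below .35 μV (negative t against that criterion) would be read as "no perspective cost", as reported for the Regularity task.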
Keywords: Symmetry; Virtual reality; Sustained posterior negativity; EEG; ERPs
© 2025 The Author(s). Published by Elsevier Ltd.