Abstract: We investigated how attention to a visual feature modulates the representation of other features. The feature-similarity gain model predicts a graded modulation, whereas an alternative model posits an inhibitory surround in feature space. Although evidence for both types of modulation can be found, no consensus has emerged in the literature. Here, we aimed to reconcile these different views by systematically measuring how attention modulates color perception. Based on previous literature, we also predicted that color categories would affect attentional modulation. Our results showed that both surround suppression and feature-similarity gain modulate color perception, but they operate on different similarity scales. Furthermore, the region of the suppressive surround coincided with the color category boundary, suggesting a categorical sharpening effect. We implemented a neural population coding model to explain the observed behavioral effects, which revealed a hitherto unknown connection between neural tuning shifts and surround suppression.
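For intuition only, the sketch below illustrates how the two candidate modulation profiles named in the abstract can be expressed in a labeled-line population code of hue. It is not the published model: the circular-Gaussian tuning curves, the Gaussian feature-similarity gain, the difference-of-Gaussians surround gain, and all parameter values are assumptions chosen for illustration, not fits to the reported data.

```python
# Minimal sketch (not the authors' model): a labeled-line population code of hue,
# contrasting two hypothetical attentional gain profiles -- a feature-similarity
# gain (boost falls off monotonically with similarity to the attended hue) and a
# difference-of-Gaussians gain with a suppressive surround. All parameter values
# are illustrative assumptions.
import numpy as np

N_NEURONS = 64  # preferred hues, evenly spaced around the hue circle
prefs = np.linspace(0, 360, N_NEURONS, endpoint=False)

def circ_dist(a, b):
    """Shortest angular distance (degrees) on the hue circle."""
    d = np.abs(a - b) % 360
    return np.minimum(d, 360 - d)

def tuning(stim_hue, width=30.0):
    """Baseline population response: circular-Gaussian tuning curves."""
    return np.exp(-0.5 * (circ_dist(prefs, stim_hue) / width) ** 2)

def similarity_gain(attended_hue, width=60.0, amp=0.5):
    """Feature-similarity gain: graded boost, largest for neurons tuned
    near the attended hue (assumed Gaussian profile)."""
    return 1.0 + amp * np.exp(-0.5 * (circ_dist(prefs, attended_hue) / width) ** 2)

def surround_gain(attended_hue, center_w=25.0, surround_w=60.0,
                  amp_c=0.6, amp_s=0.4):
    """Surround suppression: difference-of-Gaussians gain that enhances
    neurons tuned close to the attended hue but suppresses those tuned
    to moderately similar hues."""
    d = circ_dist(prefs, attended_hue)
    return (1.0 + amp_c * np.exp(-0.5 * (d / center_w) ** 2)
                - amp_s * np.exp(-0.5 * (d / surround_w) ** 2))

def decode(resp):
    """Population-vector readout of the represented hue (degrees)."""
    angles = np.deg2rad(prefs)
    return np.rad2deg(np.arctan2(resp @ np.sin(angles),
                                 resp @ np.cos(angles))) % 360

if __name__ == "__main__":
    attended = 0.0
    for probe in (0, 30, 60, 120, 180):  # probe hue distance from attended hue
        base = tuning(probe)
        fsg = base * similarity_gain(attended)
        ss = base * surround_gain(attended)
        print(f"probe {probe:3d} deg | net gain (similarity) {fsg.sum()/base.sum():.2f} "
              f"| net gain (surround) {ss.sum()/base.sum():.2f} "
              f"| decoded shift (surround) {circ_dist(decode(ss), probe):.1f} deg")
```

Running the sketch shows the qualitative distinction at issue: the similarity-gain profile yields a monotonically decreasing enhancement with feature distance, whereas the surround profile enhances the attended hue, suppresses moderately similar hues, and leaves distant hues largely unchanged, while also shifting the decoded hue of probes near the suppressive region.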