Prototype theory of categorization and category learning assumes that a category is represented simply by its central tendency. The theory accounts for many psychological phenomena associated with categories, yet it has been shown to be incapable of accounting for some important aspects of categories and concepts. For example, because of its simplistic representation, Prototype theory cannot describe how people make inferences about variability and correlations among feature dimensions within categories. In addition, it cannot learn categories that are not linearly separable. The present research extends Prototype theory of category learning in order to improve its explanatory capability while maintaining its simple representation mechanism. Our theory assumes that a category is represented not only by its central tendency but also by an abstracted within-category structure. In order to evaluate its descriptive validity, we developed a computational model built on the basis of the theory. In our model, called STRAP (STRucture Abstracting Prototype), a central tendency is represented by a mean vector (i.e., a centroid) and an abstracted within-category structure by a covariance matrix. Three simulation studies were conducted, and the results showed that STRAP successfully accounted for empirical phenomena that existing prototype models have not replicated: it acquired the knowledge necessary for making inferences about variability and correlations among feature dimensions within categories; it learned to categorize categories that are not linearly separable; and it reproduced the A2 advantage, the tendency for people to categorize the less “prototypical” stimulus A2 more accurately than a more “prototypical” stimulus, invalidating some criticisms of Prototype theory.
More importantly, STRAP, and thus our theory, accounts for these psychological phenomena through cognitive information processes that are distinct from those of other successful models, providing new insight into how categories are represented in the mind.
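The representation described above — a category encoded by a mean vector plus a covariance matrix — can be sketched computationally as follows. This is a minimal illustrative sketch, not the actual STRAP model: the function names are hypothetical, and the assumption that categorization follows a Gaussian log-likelihood rule over the learned mean and covariance is ours, introduced only to show how a covariance-based representation supports sensitivity to within-category variability and feature correlations.

```python
import numpy as np

def fit_category(exemplars):
    """Represent a category by its centroid (mean vector) and a
    covariance matrix abstracting within-category structure.
    (Hypothetical sketch of a STRAP-style representation.)"""
    X = np.asarray(exemplars, dtype=float)
    mean = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)  # covariance over feature dimensions
    return mean, cov

def log_likelihood(x, mean, cov):
    """Gaussian log-density of stimulus x under a category: distance
    to the centroid is scaled by the learned covariance, so variability
    and correlations among feature dimensions affect categorization."""
    d = np.asarray(x, dtype=float) - mean
    inv = np.linalg.inv(cov)
    _, logdet = np.linalg.slogdet(cov)
    k = len(mean)
    return -0.5 * (d @ inv @ d + logdet + k * np.log(2 * np.pi))

def categorize(x, categories):
    """Assign stimulus x to the category with the highest log-likelihood."""
    return max(categories, key=lambda name: log_likelihood(x, *categories[name]))
```

Because the covariance matrix is part of the representation, a stimulus far from a centroid along a high-variance (or correlated) direction can still be accepted by that category — the kind of within-category inference a centroid-only prototype model cannot express.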