This study proposes a general concept structure underlying inductive reasoning that enables context-dependent knowledge selection. The similarity that determines the inductive strength of an argument is generally held to be based on features of the entities in its propositions, and the context of a proposition is thought to select a particular subset of those features rather than exploiting all of them. However, no general theory of concept structure has been proposed that supports such context-dependent feature selection. In the present study, we focus on the syntactic dependency structure of the propositions in an argument, which determines the relationship of each entity to the predicate of those propositions. We show that statistical analysis of this relationship reveals a concept structure underlying inductive reasoning in which nouns are organized by the semantic roles of the verbs that can describe them (a verb-centric concept structure). Such a concept structure makes it possible to specify the appropriate knowledge to be induced, and the similarity computed over that selected knowledge determines inductive strength. We explain this mechanism with a computational model built on data from a statistical analysis of syntactic dependency. Simulations of the model predict several phenomena, which we examine and validate in empirical studies. Finally, we discuss how a domain-inclusive, verb-centric concept structure enabling context-dependent knowledge selection underlies inductive reasoning.
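To make the selection mechanism concrete, the following is a minimal sketch, not the actual model reported here. It builds verb-centric representations of nouns from a handful of invented dependency triples (noun, semantic role, verb, count) and computes similarity only over the features selected by a predicate context; all data, names, and the filtering scheme are illustrative assumptions.

```python
from collections import defaultdict
from math import sqrt

# Hypothetical dependency triples (noun, semantic_role, verb, count), as might
# be extracted from a dependency-parsed corpus; counts are invented.
TRIPLES = [
    ("sparrow", "agent", "fly", 40), ("sparrow", "agent", "sing", 25),
    ("sparrow", "patient", "catch", 5),
    ("penguin", "agent", "swim", 30), ("penguin", "agent", "sing", 1),
    ("penguin", "patient", "catch", 4),
    ("trout",   "agent", "swim", 35), ("trout",   "patient", "catch", 20),
]

# Verb-centric concept structure: each noun is represented by the verbs
# (and the semantic roles within them) that can describe it.
noun_vectors = defaultdict(lambda: defaultdict(float))
for noun, role, verb, count in TRIPLES:
    noun_vectors[noun][(role, verb)] += count

def cosine(u, v):
    """Cosine similarity between two sparse feature dictionaries."""
    dot = sum(w * v.get(k, 0.0) for k, w in u.items())
    nu = sqrt(sum(w * w for w in u.values()))
    nv = sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def context_similarity(noun_a, noun_b, context_verbs=None):
    """Similarity over the knowledge selected by the predicate context.

    With no context, all (role, verb) features contribute; given the verbs of
    the argument's predicate, only features tied to those verbs are retained
    before similarity is computed.
    """
    u, v = noun_vectors[noun_a], noun_vectors[noun_b]
    if context_verbs is not None:
        u = {k: w for k, w in u.items() if k[1] in context_verbs}
        v = {k: w for k, w in v.items() if k[1] in context_verbs}
    return cosine(u, v)

if __name__ == "__main__":
    # Context-free similarity vs. similarity under a "swim"-related predicate.
    print(context_similarity("sparrow", "trout"))            # low (~0.05)
    print(context_similarity("penguin", "trout", {"swim"}))  # high (1.0)
```

Under this sketch, penguin and trout are dissimilar when all features are compared but highly similar once the predicate context restricts the comparison to swim-related knowledge, mirroring the claim that inductive strength tracks similarity over context-selected features.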