Journal: The Prague Bulletin of Mathematical Linguistics
Print ISSN: 0032-6585
Online ISSN: 1804-0462
Year: 2019
Volume: 112
Issue: 1
Pages: 55-82
DOI: 10.2478/pralin-2019-0002
Publisher: Walter de Gruyter GmbH
Abstract: Graph theory, which quantitatively measures the precise structure and complexity of any network, uncovers an optimal force balance in sentential graphs generated by the computational procedures of human natural language (CHL). It provides an alternative way to evaluate grammaticality by calculating the ‘feature potential’ of nodes and the ‘feature current’ along edges. An optimal force balance becomes visible by expressing ‘feature current’ through different point sizes of lines. Graph theory provides insights into syntax and contradicts Chomsky’s current proposal to discard tree notations. We propose an error minimization hypothesis for CHL: a good sentential network possesses an error-free, self-organized force balance. CHL minimizes errors by (a) converting bottom-up flow (structure building) to top-down flow (parsing), (b) removing head projection edges, (c) preserving edges related to feature checking, (d) deleting DP-movement trajectories headed by an intermediate copy, (e) ensuring that covert wh-movement trajectories have infinitesimally small currents and conserving flow directions, and (f) robustly remedying a gap in a wh-loop by using an infinitesimally inexpensive wh-internally-merged (wh-IM) edge with the original flow direction. The CHL compels the sensorimotor (SM) interface to ground nodes so that Kirchhoff’s current law (a fundamental balance law) is satisfied. Internal merges are built-in grounding operations at the CHL–SM interface that generate loops and an optimal force balance in sentential networks.
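As a purely illustrative aid (not taken from the paper), the sketch below shows how Kirchhoff’s current law might be checked on a toy directed graph standing in for a sentential network. The node labels, the edge set, the hypothetical ‘feature current’ values, and the set of ‘grounded’ nodes are all assumptions made for the example.

```python
# Illustrative sketch only: a toy Kirchhoff's-current-law check on a small
# directed graph. All node names, edges, and current values are hypothetical.
from collections import defaultdict

# Each edge is (source, target, feature_current); flow runs source -> target.
edges = [
    ("C",  "T",  1.0),   # hypothetical top-down (parsing-direction) edge
    ("T",  "vP", 1.0),
    ("vP", "V",  0.5),
    ("vP", "DP", 0.5),
]

# Nodes assumed to be grounded at the sensorimotor interface may absorb or
# supply net current, so they are exempt from the balance check.
grounded = {"C", "V", "DP"}

def net_current(edges):
    """Return net inflow minus outflow for every node."""
    balance = defaultdict(float)
    for src, tgt, current in edges:
        balance[src] -= current   # current leaves the source node
        balance[tgt] += current   # current enters the target node
    return balance

def satisfies_kcl(edges, grounded, tol=1e-9):
    """Kirchhoff's current law: each non-grounded node has zero net current."""
    return all(
        abs(net) <= tol
        for node, net in net_current(edges).items()
        if node not in grounded
    )

print(satisfies_kcl(edges, grounded))  # True: T and vP are balanced
```

In this toy setup, the non-grounded nodes T and vP each receive exactly as much ‘feature current’ as they pass on, which is the balance condition the abstract attributes to a well-formed sentential network; how the paper actually assigns currents and selects grounded nodes is not reproduced here.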