
Article Information

  • Title: Assessing Conceptual Complexity and Compressibility Using Information Gain and Mutual Information
  • Author: Fabien Mathy
  • Journal: Tutorials in Quantitative Methods for Psychology
  • Electronic ISSN: 1913-4126
  • Year: 2010
  • Volume: 6
  • Issue: 1
  • Pages: 16-30
  • DOI: 10.20982/tqmp.06.1.p016
  • Publisher: Université de Montréal
  • Abstract: In this paper, a few basic notions stemming from information theory are presented with the intention of modeling the abstraction of relevant information in categorization tasks. In a categorization task, a single output variable is the basis for performing a dichotomic classification of objects that can be distinguished by a set of input variables which are more or less informative about the category to which the objects belong. At the beginning of the experiment, the target classification is unknown to learners, who must select the variables most informative about the class in order to classify the objects efficiently. I first show how the notion of entropy can be used to characterize basic psychological processes in learning. Then, I indicate how a learner might use information gain and mutual information (both based on entropy) to efficiently induce the shortest rule for categorizing a set of objects. Several basic classification tasks are studied in succession with the aim of showing that learning can improve as long as subjects are able to compress information. Referring to recent experimental results, I indicate in the Conclusion that these notions can account for both strategies and performance in subjects trying to simplify a learning process.
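
The abstract describes selecting informative input variables via entropy-based information gain. The following is a minimal, hypothetical sketch (not taken from the paper) of how these quantities can be computed for a toy dichotomic categorization task; the feature names and example data are illustrative assumptions only. For binary variables, the information gain computed here coincides with the mutual information between the feature and the category.

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy H(Y) of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(feature_values, labels):
    """IG(Y; X) = H(Y) - H(Y | X): the reduction in class uncertainty
    obtained by observing one input variable X."""
    n = len(labels)
    h_y = entropy(labels)
    h_y_given_x = 0.0
    for v in set(feature_values):
        subset = [y for x, y in zip(feature_values, labels) if x == v]
        h_y_given_x += (len(subset) / n) * entropy(subset)
    return h_y - h_y_given_x

# Toy example (hypothetical): 8 objects, three binary input variables,
# one dichotomic category. The category depends only on "shape", so that
# variable carries all the information (IG = 1 bit) and the shortest
# classification rule compresses to it alone.
features = {
    "shape": [0, 0, 0, 0, 1, 1, 1, 1],
    "color": [0, 0, 1, 1, 0, 0, 1, 1],
    "size":  [0, 1, 0, 1, 0, 1, 0, 1],
}
category =   [0, 0, 0, 0, 1, 1, 1, 1]

for name, values in features.items():
    print(f"IG(category; {name}) = {information_gain(values, category):.3f} bits")
```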