
Article Information

  • Title: Improved Distance Functions for Instance-Based Text Classification
  • Authors: Khalil El Hindi; Bayan Abu Shawar; Reem Aljulaidan
  • Journal: Computational Intelligence and Neuroscience
  • Print ISSN: 1687-5265
  • Electronic ISSN: 1687-5273
  • Publication year: 2020
  • Volume: 2020
  • Pages: 1-10
  • DOI: 10.1155/2020/4717984
  • Publisher: Hindawi Publishing Corporation
  • Abstract: Text classification has many applications in text processing and information retrieval. Instance-based learning (IBL) is among the top-performing text classification methods. However, its effectiveness depends on the distance function it uses to determine similar documents. In this study, we evaluate the performance of some popular distance measures and propose new ones that exploit word frequencies and the ordinal relationship between them. In particular, we propose new distance measures based on the value distance metric (VDM) and the inverted specific-class distance measure (ISCDM). The proposed measures are suitable for documents represented as vectors of word frequencies. We compare their performance with that of their original counterparts and with powerful Naïve Bayesian-based text classification algorithms. We evaluate the proposed distance measures using the kNN algorithm on 18 benchmark text classification datasets. Our empirical results reveal that the distance metrics for nominal values yield better classification results for text classification than the Euclidean distance measure for numeric values. Furthermore, our results indicate that ISCDM substantially outperforms VDM, and it also lends itself more readily to exploiting the ordinal nature of term frequencies than VDM does. Thus, we were able to propose more ISCDM-based distance measures for text classification than VDM-based measures. We also compare the proposed distance measures with Naïve Bayesian-based text classifiers, namely, multinomial Naïve Bayes (MNB), complement Naïve Bayes (CNB), and the one-versus-all-but-one (OVA) model. It turned out that when kNN uses some of the proposed measures, it outperforms the NB-based text classifiers on most datasets.
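For context on the VDM mentioned in the abstract: the record does not reproduce the paper's adapted, frequency-aware variants, so the following is only the standard value difference metric for nominal attribute values, given as background rather than as the authors' proposed measure. For two values $x$ and $y$ of the same attribute,
$$\mathrm{VDM}(x, y) = \sum_{c=1}^{C} \bigl| P(c \mid x) - P(c \mid y) \bigr|^{q},$$
where $C$ is the number of classes, $P(c \mid x)$ is estimated from the training data as the fraction of instances taking value $x$ that belong to class $c$, and $q$ (typically 1 or 2) controls the weighting of per-class differences. Two values are considered close when they induce similar class distributions; the paper's contribution, per the abstract, is adapting this idea (and ISCDM) to word-frequency representations of documents for use inside kNN.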