
Article Information

  • Title: Significance of Dimensionality Reduction in Image Processing
  • Authors: Shereena V. B.; Julie M. David
  • Journal: Signal & Image Processing : An International Journal (SIPIJ)
  • Print ISSN: 2229-3922
  • Electronic ISSN: 0976-710X
  • Year: 2015
  • Volume: 6
  • Issue: 3
  • Pages: 27
  • Publisher: Academy & Industry Research Collaboration Center (AIRCC)
  • Abstract: The aim of this paper is to present a comparative study of two linear dimension reduction methods, namely PCA (Principal Component Analysis) and LDA (Linear Discriminant Analysis). The main idea of PCA is to transform the high-dimensional input space onto a feature space in which the maximal variance is displayed. Feature selection in traditional LDA is obtained by maximizing the difference between classes while minimizing the distance within classes. PCA finds the axes of maximum variance for the whole data set, whereas LDA tries to find the axes that give the best class separability. A neural network is trained on the reduced feature set (obtained with PCA or LDA) of the images in the database, using the back-propagation algorithm, to allow fast searching of images from the database. The proposed method is tested on a general image database using Matlab. The performance of these systems has been evaluated by Precision and Recall measures. Experimental results show that PCA gives better performance, in terms of higher precision and recall values and lower computational complexity, than LDA. (An illustrative sketch of this pipeline is given after the keyword list below.)
  • Keywords: Color histogram; Feature Extraction; Euclidean distance; Principal Component Analysis; Linear Discriminant Analysis; Eigenvalues; Eigenvectors; Neural network; Back Propagation.
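
The abstract describes a retrieval pipeline: reduce image feature vectors with PCA or LDA, train a back-propagation neural network on the reduced features, and compare the two reducers by precision and recall. The paper's experiments used Matlab on a general image database; the Python/scikit-learn snippet below is only a minimal sketch of the same idea, not the authors' implementation. The built-in 8x8 digits image set, the MLPClassifier network, the number of components, and all hyperparameters here are assumptions chosen for illustration.

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import precision_score, recall_score

# Small stand-in image dataset: 8x8 digit images flattened to 64 features.
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

reducers = {
    # PCA: unsupervised, keeps the directions of maximum variance.
    "PCA": PCA(n_components=9),
    # LDA: supervised, maximizes between-class scatter relative to
    # within-class scatter; it yields at most (n_classes - 1) components.
    "LDA": LinearDiscriminantAnalysis(n_components=9),
}

for name, reducer in reducers.items():
    # LDA needs the class labels to fit; PCA simply ignores them.
    Z_train = reducer.fit_transform(X_train, y_train)
    Z_test = reducer.transform(X_test)

    # A small feed-forward network trained with back-propagation
    # on the reduced feature vectors.
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
    clf.fit(Z_train, y_train)
    y_pred = clf.predict(Z_test)

    print(name,
          "precision:", round(precision_score(y_test, y_pred, average="macro"), 3),
          "recall:", round(recall_score(y_test, y_pred, average="macro"), 3))
```

Note that LDA is capped at n_classes - 1 components, while PCA can keep any number up to the original feature dimension; matching the two reducers at the same number of components, as above, is one simple way to make the precision/recall comparison fair.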