
Article Information

  • Title: Accelerated Singular Value Decomposition (ASVD) using momentum based Gradient Descent Optimization
  • Authors: Sandeep Kumar Raghuwanshi ; Rajesh Kumar Pateriya
  • Journal: Journal of King Saud University – Computer and Information Sciences
  • Print ISSN: 1319-1578
  • Year: 2021
  • Volume: 33
  • Issue: 4
  • Pages: 447-452
  • DOI: 10.1016/j.jksuci.2018.03.012
  • Publisher: Elsevier
  • Abstract: The limitations of neighborhood-based Collaborative Filtering (CF) techniques over scalable and sparse data present obstacles for efficient recommendation systems. These techniques show poor accuracy and slow speed in generating recommendations. Model-based matrix factorization is an alternative approach used to overcome the aforementioned limitations of CF. Singular Value Decomposition (SVD) is a widely used technique for obtaining low-rank factors of the rating matrix, typically optimizing its error objective function with Gradient Descent (GD) or Alternating Least Squares (ALS). Most researchers have focused on the accuracy of predictions but have not considered the convergence rate of the learning approach. In this paper, we propose a new filtering technique that implements SVD using Stochastic Gradient Descent (SGD) optimization and provides an accelerated version of SVD for fast convergence of learning parameters with improved classification accuracy. Our proposed method accelerates SVD in the relevant direction and dampens oscillations by adding a momentum term to the parameter updates. To support our claim, we tested the proposed model against well-known real-world datasets (MovieLens100k, FilmTrust and YahooMovie). The proposed Accelerated Singular Value Decomposition (ASVD) outperformed the existing models, achieving a higher convergence rate and better classification accuracy.
  • Keywords: Gradient Descent ; Information filtering ; Matrix factorization ; Singular value decomposition ; Stochastic gradient descent
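The abstract's core idea, SGD-trained matrix factorization with a momentum term added to the parameter updates, can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: the function name, hyperparameter values, and the convention of marking missing ratings with 0 are all assumptions.

```python
import numpy as np

def asvd_momentum(R, k=2, lr=0.01, beta=0.9, reg=0.02, epochs=200):
    """Momentum-accelerated SGD matrix factorization (illustrative sketch).

    R is a rating matrix in which 0 marks a missing entry (assumed
    convention). Returns latent factors P (users x k) and Q (items x k)
    such that R is approximated by P @ Q.T on the observed entries.
    """
    rng = np.random.default_rng(0)
    n_users, n_items = R.shape
    P = rng.normal(scale=0.1, size=(n_users, k))
    Q = rng.normal(scale=0.1, size=(n_items, k))
    vP = np.zeros_like(P)  # momentum (velocity) buffers
    vQ = np.zeros_like(Q)
    observed = np.argwhere(R > 0)  # indices of known ratings
    for _ in range(epochs):
        for u, i in observed:
            err = R[u, i] - P[u] @ Q[i]
            # Gradient of the regularized squared error for this sample.
            gP = -err * Q[i] + reg * P[u]
            gQ = -err * P[u] + reg * Q[i]
            # Momentum update: the velocity accumulates past gradients,
            # accelerating consistent directions and damping oscillation.
            vP[u] = beta * vP[u] - lr * gP
            vQ[i] = beta * vQ[i] - lr * gQ
            P[u] += vP[u]
            Q[i] += vQ[i]
    return P, Q

# Tiny usage example on a 3x3 rating matrix with three missing entries.
R = np.array([[5.0, 3.0, 0.0],
              [4.0, 0.0, 1.0],
              [0.0, 2.0, 5.0]])
P, Q = asvd_momentum(R, k=2)
```

Compared with plain SGD (`beta=0`), each parameter moves along an exponentially weighted average of recent gradients, which is the acceleration mechanism the abstract refers to.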