
Article Information

  • Title: Rating Prediction with Topic Gradient Descent Method for Matrix Factorization in Recommendation
  • Authors: Guan-Shen Fang; Sayaka Kamei; Satoshi Fujita
  • Journal: International Journal of Advanced Computer Science and Applications (IJACSA)
  • Print ISSN: 2158-107X
  • Electronic ISSN: 2156-5570
  • Year: 2017
  • Volume: 8
  • Issue: 12
  • DOI: 10.14569/IJACSA.2017.081262
  • Publisher: Science and Information Society (SAI)
  • Abstract: On many online review sites and social media platforms, users are encouraged to assign a numeric rating and write a textual review as feedback on each product they have bought. Based on a user's history of feedback, recommender systems predict how the user would assess unpurchased products, in order to discover items that the user may like and buy in the future. A traditional approach to predicting unknown ratings is matrix factorization, which uses only the history of ratings included in the feedback. Recent research has pointed out that its disregard of textual reviews is a drawback that leads to mediocre performance. To address this issue, we propose a rating prediction method that uses both ratings and reviews, including a new first-order gradient method for matrix factorization, named Topic Gradient Descent (TGD). The proposed method first derives latent topics from the reviews via Latent Dirichlet Allocation. Each topic is characterized by a probability distribution over words and is assigned to a corresponding latent factor. Second, to predict the users' ratings, it uses matrix factorization trained with the proposed TGD method. During training, the update step of each latent factor is dynamically assigned depending on the stochastic proportion of its corresponding topic in the review. In the evaluation, we use both the YELP challenge dataset and per-category Amazon review datasets. The experimental results show that the proposed method reliably converges the squared prediction error and improves the performance of traditional matrix factorization by up to 12.23%.
  • Keywords: Gradient descent; matrix factorization; Latent Dirichlet Allocation; information recommendation
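The core idea in the abstract — stochastic-gradient matrix factorization in which the update step of each latent factor is scaled by its corresponding topic's proportion in the review — can be sketched as below. This is a minimal illustration, not the authors' published implementation: the function name `tgd_matrix_factorization`, its hyperparameters, and the way topic proportions are passed in (a per-rating vector, e.g. from LDA on that rating's review) are all assumptions for the sketch.

```python
import numpy as np

def tgd_matrix_factorization(R, theta, K=5, base_lr=0.01, reg=0.02, epochs=100):
    """Sketch of topic-scaled SGD for matrix factorization.

    R     : (n_users, n_items) rating matrix, 0 = unobserved
    theta : dict mapping (user, item) -> length-K topic proportion vector
            for that rating's review (e.g. inferred by LDA); proportions
            sum to 1 across the K topics
    """
    n_users, n_items = R.shape
    rng = np.random.default_rng(0)
    P = rng.normal(scale=0.1, size=(n_users, K))  # user latent factors
    Q = rng.normal(scale=0.1, size=(n_items, K))  # item latent factors
    observed = [(u, i) for u in range(n_users)
                for i in range(n_items) if R[u, i] > 0]
    for _ in range(epochs):
        for u, i in observed:
            err = R[u, i] - P[u] @ Q[i]
            # Per-factor step size: the base rate is scaled by each
            # topic's proportion in this rating's review, so factors
            # tied to dominant topics receive larger updates.
            step = base_lr * K * theta[(u, i)]          # shape (K,)
            P[u] += step * (err * Q[i] - reg * P[u])
            Q[i] += step * (err * P[u] - reg * Q[i])
    return P, Q
```

With uniform topic proportions the step reduces to the ordinary learning rate, so plain SGD matrix factorization falls out as a special case; non-uniform proportions simply redistribute the step budget across factors.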