
Article Information

  • Title: Markov Models Applications in Natural Language Processing: A Survey
  • Authors: Talal Almutiri; Farrukh Nadeem
  • Journal: International Journal of Information Technology and Computer Science
  • Print ISSN: 2074-9007
  • Electronic ISSN: 2074-9015
  • Year: 2022
  • Volume: 14
  • Issue: 2
  • DOI: 10.5815/ijitcs.2022.02.01
  • Language: English
  • Publisher: MECS Publisher
  • Abstract: Markov models are among the most widely used machine learning techniques for processing natural language. Markov chains and hidden Markov models are stochastic techniques for modeling dynamic systems in which the future state depends on the current state. The Markov chain, which generates a sequence of words to form a complete sentence, is frequently used in natural language generation. The hidden Markov model is employed in named-entity recognition and part-of-speech tagging, where it predicts hidden tags from observed words. This paper reviews the use of Markov models in three applications of natural language processing (NLP): natural language generation, named-entity recognition, and part-of-speech tagging. Researchers today try to reduce NLP's dependence on lexicons and annotation tasks. This paper focuses on Markov models as a stochastic approach to NLP. A literature review was conducted to summarize research efforts, focusing on the methods and techniques that apply Markov models to NLP, along with their advantages and disadvantages. Most NLP studies apply supervised models, improved with Markov models to decrease dependence on annotation tasks; others employ unsupervised solutions to reduce dependence on a lexicon or labeled datasets.
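The abstract's description of Markov-chain text generation (the next word depends only on the current word) can be sketched as below. This is a minimal illustrative example, not code from the surveyed paper; the toy corpus, the `generate` function, and its parameters are assumptions for demonstration.

```python
import random

# Toy corpus for a first-order Markov chain: each word's successors
# are collected into a transition table. Illustrative only.
corpus = "the cat sat on the mat . the dog sat on the rug .".split()

# Build transition table: word -> list of observed successor words.
# Sampling uniformly from this list matches the empirical bigram counts.
transitions = {}
for cur, nxt in zip(corpus, corpus[1:]):
    transitions.setdefault(cur, []).append(nxt)

def generate(start, max_len=10, seed=0):
    """Generate a word sequence by repeatedly sampling a successor of
    the current word, stopping at end-of-sentence '.' or max_len words."""
    rng = random.Random(seed)
    words = [start]
    while words[-1] != "." and words[-1] in transitions and len(words) < max_len:
        words.append(rng.choice(transitions[words[-1]]))
    return " ".join(words)

print(generate("the"))
```

In practice, the transition table would be estimated from a large corpus, and higher-order chains (conditioning on the previous two or more words) produce more fluent sentences at the cost of sparser statistics.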
  • Keywords: Hidden Markov Models; Markov Chains; Named Entity Recognition; Natural Language Generation; Natural Language Processing; Parts of Speech Tagging; Quantitative Analysis
Copyright © National Center for Philosophy and Social Sciences Documentation