
Article Information

  • Title: Discrimination between Gamma and Log-Normal Distributions by Ratio of Minimized Kullback-Leibler Divergence
  • Authors: Ali Akbar Bromideh; Reza Valizadeh
  • Journal: Pakistan Journal of Statistics and Operation Research
  • Print ISSN: 2220-5810
  • Year: 2014
  • Volume: 9
  • Issue: 4
  • Pages: 443-453
  • Language: English
  • Publisher: College of Statistical and Actuarial Sciences
  • Abstract: The Gamma and Log-Normal distributions are frequently used in reliability analysis of lifetime data. The two distributions overlap in many cases, which makes it difficult to choose between them. The ratio of maximized likelihood (RML) has been extensively used for this purpose. Since the Kullback-Leibler information measures the discrepancy between two density functions, this paper examines the use of the Kullback-Leibler Divergence (KLD) in discriminating between the Gamma and Log-Normal distributions. To this end, the ratio of minimized Kullback-Leibler Divergence (RMKLD) test statistic is introduced, and its applicability is illustrated with two real data sets. Although the new test statistic is shown to be consistent with the RML, the KLD-based criterion has a higher probability of correct selection when the null hypothesis is Gamma.
  • Keywords: Gamma Distribution, Kullback-Leibler Divergence, Log-Normal Distribution, Model Discrimination, Probability of Correct Selection, Ratio of Maximized Likelihood
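The RML criterion referenced in the abstract can be sketched in a few lines: fit both candidate models to the data by maximum likelihood and compare the maximized log-likelihoods. This is an illustrative sketch using `scipy.stats` on synthetic data, not the authors' implementation; the parameter values and the choice to fix the location at zero are assumptions for the example.

```python
import numpy as np
from scipy import stats

# Synthetic "lifetime" data; the Gamma parameters here are illustrative only.
rng = np.random.default_rng(0)
data = rng.gamma(shape=2.0, scale=3.0, size=500)

# Fit both candidate models by maximum likelihood, with location fixed at 0
# (both Gamma and Log-Normal lifetimes are supported on the positive axis).
a, _, scale_g = stats.gamma.fit(data, floc=0)
s, _, scale_ln = stats.lognorm.fit(data, floc=0)

# Log of the ratio of maximized likelihoods: positive values favor Gamma,
# negative values favor Log-Normal.
ll_gamma = np.sum(stats.gamma.logpdf(data, a, loc=0, scale=scale_g))
ll_lognorm = np.sum(stats.lognorm.logpdf(data, s, loc=0, scale=scale_ln))
T = ll_gamma - ll_lognorm
print("selected:", "Gamma" if T > 0 else "Log-Normal")
```

The paper's RMKLD statistic replaces the maximized likelihoods with minimized Kullback-Leibler divergences between the data and each fitted family, but the decision rule has the same ratio-comparison form.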