Journal: International Journal of Computer Science & Technology
Print ISSN: 2229-4333
Online ISSN: 0976-8491
Year: 2013
Volume: 4
Issue: 2
Pages: 256-257
Language: English
Publisher: Ayushmaan Technologies
Abstract: Entropy of a language is a statistical parameter which measures, in a certain sense, how much information is produced on average for each letter of a text in that language. The amount of information carried in the arrangement of words is the same across all languages, even languages that aren't related to each other. This consistency could hint at a single common ancestral language, or at universal features of how human brains process speech. This paper surveys the methods and techniques available, and the work carried out so far, for estimating the entropy of languages.
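As an illustration (not taken from the paper itself), the simplest such estimate is the unigram entropy: Shannon entropy computed from single-letter frequencies in a sample text. A minimal sketch in Python, where `letter_entropy` is a hypothetical helper name:

```python
from collections import Counter
import math

def letter_entropy(text: str) -> float:
    """Estimate Shannon entropy in bits per letter from unigram frequencies."""
    # Keep alphabetic characters only, ignoring case.
    letters = [c for c in text.lower() if c.isalpha()]
    total = len(letters)
    counts = Counter(letters)
    # H = -sum over letters of p * log2(p)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# Example: four equiprobable letters give exactly 2 bits per letter.
print(letter_entropy("abcd"))  # → 2.0
```

Real entropy estimates of a language, as discussed in work following Shannon, go beyond unigrams to n-gram and human-prediction methods, since letter dependencies lower the true per-letter entropy well below the unigram figure.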
Keywords: Entropy; Probability; Corpus; Complexity; Information; Frequency distribution; Hindi; Natural Linguistic Programming