Shannon entropy
2024-10-15 10:56:26
Shannon entropy is one of the most important metrics in information theory. Entropy measures the uncertainty associated with a random variable, i.e. the expected value of the information contained in a message (in classical informatics it is measured in bits).
The concept was introduced by Claude E. Shannon in the paper "A Mathematical Theory of Communication" (1948). Shannon entropy makes it possible to estimate the average minimum number of bits needed to encode a string of symbols, based on the alphabet size and the frequencies of the symbols.
Shannon entropy is calculated using the formula:

H(X) = -∑ p(x_i) · log_b p(x_i)

When b = 2, H(X) expresses how many bits are needed to represent the variable; the bit is the unit of H(X). For example, consider a variable representing a coin toss: if heads and tails each have probability 1/2, then H(X) = 1, so 1 bit suffices to represent the variable.
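As a minimal sketch, the formula above can be computed over the empirical symbol frequencies of a string (function name and examples are illustrative, not from the original post):

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Shannon entropy in bits (b = 2) of a string of symbols,
    using each symbol's observed frequency as its probability."""
    counts = Counter(message)
    total = len(message)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Two equally likely symbols (a fair coin toss): 1 bit
print(shannon_entropy("HT"))    # 1.0
# A single repeated symbol carries no information: 0 bits
print(shannon_entropy("AAAA"))  # 0.0
```

Note that this estimates entropy from the frequencies in one sample string; it matches the true entropy only insofar as those frequencies reflect the underlying distribution.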