Differential entropy
Maximum entropy probability distribution
Entropy rate
Mathematics
Joint quantum entropy
Entropy power inequality
Rényi entropy
Min-entropy
Principle of maximum entropy
Entropy (arrow of time)
Statistical physics
Maximum entropy thermodynamics
Information theory
Statistics
Physics
Thermodynamics
Source
Journal: Synthesis Lectures on Engineering, Science, and Technology [Morgan & Claypool]
Date: 2024-08-05
Pages: 13-21
Identifiers
DOI:10.1007/978-3-031-65388-9_3
Abstract
The entropy associated with continuous random variables is distinct from the entropy of discrete random variables; in fact, it is sufficiently different in its properties that it is given a new name, differential entropy. Treatments of differential entropy are standard in basic information theory texts and in any first course in information theory. So, much of the material in Sect. 3.2 is widely available. The topics of entropy rate, entropy power, and maximum entropy, while usually contained in the same textbooks and introductory courses, are not as extensively emphasized as in this chapter and this book. Further, the developments here are somewhat separated from information theoretic treatments of communications and compression and generalized so that the utility of differential entropy for exploring sequences as in agent learning is more evident.
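To make the distinction concrete, the sketch below (not from the chapter; an illustrative assumption) computes the differential entropy of a Gaussian random variable, which has the closed form h(X) = ½ ln(2πeσ²) nats, and cross-checks it with a Monte Carlo estimate of −E[log f(X)]:

```python
import math
import random

def gaussian_differential_entropy(sigma):
    # Closed form for X ~ N(mu, sigma^2): h(X) = 1/2 * ln(2*pi*e*sigma^2), in nats.
    # Note h(X) can be negative for small sigma, unlike discrete entropy.
    return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

def monte_carlo_entropy(sigma, n=200_000, seed=0):
    # Estimate h(X) = -E[log f(X)] by averaging -log f over samples of X.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(0.0, sigma)
        # log of the Gaussian density f(x) at the sampled point
        log_f = -0.5 * math.log(2 * math.pi * sigma ** 2) - x * x / (2 * sigma ** 2)
        total -= log_f
    return total / n

if __name__ == "__main__":
    print(gaussian_differential_entropy(1.0))  # about 1.4189 nats
    print(monte_carlo_entropy(1.0))            # should agree closely
```

The Monte Carlo route generalizes to distributions without a closed-form entropy, which is one reason differential entropy is convenient for analyzing continuous-valued sequences.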