
Term: Weighted entropy
Category: English vocabulary · English translation dictionary
Definition

Weighted entropy

English encyclopedia

Entropy (information theory)

[Figure: Entropy H(X), i.e. the expected surprisal, of a coin flip, measured in shannons, graphed against the fairness of the coin Pr(X = 1), where X = 1 represents a result of heads.] Information entropy is the log base 2 of the number of equally likely outcomes: with two coins there are four outcomes, and the entropy is two bits. Note that the maximum of the graph depends on the distribution. Here the entropy is at most 1 shannon, so communicating the outcome of a fair coin flip (2 possible values) requires on average at most 1 bit; the result of a fair die roll (6 possible values) would require on average log₂ 6 ≈ 2.585 bits.

In information theory, systems are modeled by a transmitter, channel, and receiver. The transmitter produces messages that are sent through the channel. The channel modifies the message in some way. The receiver attempts to infer which message was sent. In this context, entropy (more specifically, Shannon entropy) is the expected value (average) of the information contained in each message. 'Messages' can be modeled by any flow of information.
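The "expected value of the information in each message" can be made concrete with a short sketch. The helper below is illustrative (not part of the source entry): it computes Shannon entropy in bits for a discrete distribution, reproducing the coin and die figures quoted above.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits (shannons) of a discrete distribution.

    `probs` is an iterable of outcome probabilities summing to 1.
    Zero-probability outcomes contribute nothing, since p*log2(p) -> 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit of entropy per flip.
print(shannon_entropy([0.5, 0.5]))      # 1.0

# A fair six-sided die carries log2(6) ≈ 2.585 bits per roll.
print(shannon_entropy([1/6] * 6))

# A biased coin is less surprising on average, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))      # ≈ 0.469
```

A maximally biased coin (Pr(X = 1) = 1) has zero entropy: the outcome is certain, so each "message" carries no information.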


The Yiyi18.com English-Chinese example-sentence dictionary contains 3,870,147 English example entries, covering the definitions and example sentences of essentially all common English words, making it a useful tool for English study.

 

Copyright © 2004-2024 Yiyi18.com All Rights Reserved
Beijing ICP Filing No. 2021023879 · Last updated: 2025/8/7 14:34:20