
EEE315 Information Theory and Coding Assignment 1

Channel Capacity and Mutual Information

ID: 08116649  Name: Chaoyun Song


1. Introduction

Shannon's information content should have some intuitive properties: (1) the information contained in events ought to be defined in terms of some measure of the uncertainty of those events; (2) less certain events ought to contain more information than more certain events; (3) the information of unrelated events, taken as a single event, should equal the sum of the information of the unrelated events.

In information theory, entropy is a measure of the uncertainty associated with a random variable. In this context, the term usually refers to the Shannon entropy, which quantifies the expected value of the information contained in a message, usually in units such as bits. Here, a 'message' means a specific realization of the random variable. Entropy is defined as:

H(X) = -∑ p(x) log p(x)

Entropy can be viewed as a measure of the minimum cost needed to send some form of information, as the "amount of surprise" in the information measured in bits, or as how much energy it is worth spending to carry the information, which translates to the minimum number of bits needed to code the information.
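For example, a fair coin toss, with p = 1/2 for each outcome, has entropy H = 1 bit, whereas a heavily biased coin carries much less information per toss.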

In probability theory and information theory, the mutual information (sometimes known by the archaic term transinformation) of two random variables is a quantity that measures the mutual dependence of the two random variables. The most common unit of measurement of mutual information is the bit, when logarithms to the base 2 are used. The mutual information can be defined as:

I(X; Y) = ∑_y ∑_x p(x, y) log[ p(x, y) / (p1(x) p2(y)) ]

where p(x, y) is the joint probability distribution function of X and Y, and p1(x) and p2(y) are the marginal probability distribution functions of X and Y respectively.
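As an illustration of this formula, the mutual information of a pair of discrete random variables can be computed directly from their joint probability matrix in MATLAB. The following is only a minimal sketch; the example joint matrix Pxy is an arbitrary choice of mine and not part of the assignment:

>> Pxy = [0.3 0.2; 0.1 0.4];   % assumed joint distribution p(x,y): rows index x, columns index y
>> Px = sum(Pxy, 2);           % marginal p1(x), column vector
>> Py = sum(Pxy, 1);           % marginal p2(y), row vector
>> I = sum(sum(Pxy .* log2(Pxy ./ (Px*Py))))   % I(X;Y) in bits (all entries assumed nonzero)
I =
    0.1245

Here Px*Py is the outer product p1(x)p2(y), so the division and logarithm are applied element by element, exactly as in the double sum above.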


2. Results with MATLAB scripts and functions

(1). Write a MATLAB function to calculate the entropy of a source given a discrete distribution. Calculate the entropy for each of the following distributions and plot the entropy diagram for each of them.

A = {1/2, 1/4, 1/8, 1/8}
B = {1/4, 1/4, 1/4, 1/4}
C = {0.1, 0.31, 0.001, 0.009, 0.2, 0.15, 0.23}

Solution: MATLAB code

>> A = [1/2 1/4 1/8 1/8];
>> H1 = -sum(A.*log2(A))
H1 =
    1.7500

>> B = [1/4 1/4 1/4 1/4];
>> H2 = -sum(B.*log2(B))
H2 =
     2

>> C = [0.1, 0.31, 0.001, 0.009, 0.2, 0.15, 0.23];
>> H3 = -sum(C.*log2(C))

H3 =

2.2897
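The question asks for a MATLAB function rather than the command-line expressions above. A minimal sketch of such a function follows; the name source_entropy and the guard against zero probabilities are my own additions:

function H = source_entropy(p)
% SOURCE_ENTROPY  Entropy in bits of a discrete probability distribution p.
%   p is a vector of probabilities that should sum to 1.
p = p(p > 0);               % drop zero-probability symbols (0*log2(0) is taken as 0)
H = -sum(p .* log2(p));
end

Called on distribution A it reproduces the result above:

>> source_entropy([1/2 1/4 1/8 1/8])
ans =
    1.7500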

(2). Write a MATLAB [1] script to plot the capacity of a binary symmetric channel with cross probability p as a function of p, where 0 ≤ p ≤ 1. For what value of p is the capacity minimized, and what is the minimum value?

For a binary symmetric channel (BSC), we know that P(0|1) = P(1|0) = p and P(0|0) = P(1|1) = 1 - p, where p is the cross probability, 0 ≤ p ≤ 1. The capacity is the mutual information maximized over the input distribution; for the BSC this maximum is reached with equiprobable inputs, so that P(Y0) = P(Y1) = 0.5. The capacity of this channel is therefore:

C = P(X0)P(0|0)log[P(0|0)/0.5] + P(X0)P(1|0)log[P(1|0)/0.5] + P(X1)P(0|1)log[P(0|1)/0.5] + P(X1)P(1|1)log[P(1|1)/0.5]
  = p log2(p) + (1 - p) log2(1 - p) + 1

Using MATLAB we can plot C as a function of p.


Solution: MATLAB code:

>> p = 0 : 0.01 : 1;
>> C = p.*log2(p) + (1-p).*log2(1-p) + 1;
>> plot(p,C)

From the diagram, we can see how C changes with different values of p. When p = 0 or p = 1 the channel capacity is 1 bit/symbol. When p = 0.5 no information gets through and the mutual information is 0. For 0.5 ≤ p ≤ 1 the curve is the mirror image of the left-hand side.

So the capacity is minimized at p = 0.5, and the minimum value of C is 0.
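A slightly fuller version of the script above handles the endpoints explicitly (MATLAB evaluates 0*log2(0) as NaN, so p = 0 and p = 1 are otherwise left off the plot) and reads the minimum off numerically. This is only a sketch; the variable names, labels and printout are my additions:

p  = 0 : 0.01 : 1;
Hb = -p.*log2(p) - (1-p).*log2(1-p);   % binary entropy function H(p)
Hb(p == 0 | p == 1) = 0;               % define 0*log2(0) = 0 at the endpoints
C  = 1 - Hb;                           % BSC capacity C = 1 - H(p)
plot(p, C), xlabel('p'), ylabel('C (bits/symbol)')
[Cmin, idx] = min(C);                  % minimum capacity and where it occurs
fprintf('Minimum C = %.4f at p = %.2f\n', Cmin, p(idx));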

(3). A binary non-symmetric channel is characterized by the probabilities P(0|1) =0.1 and P(1|0) = 0.2.
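A minimal sketch of how the mutual information of this non-symmetric channel could be evaluated in MATLAB is given below, sweeping the input probability P(X=0) = q over (0, 1) and using the identity I(X;Y) = H(Y) - H(Y|X). The sweep, the variable names and this computation route are my own assumptions, not part of the original solution:

q   = 0.001 : 0.001 : 0.999;           % P(X = 0); endpoints excluded to avoid log2(0)
p10 = 0.2;  p01 = 0.1;                 % P(1|0) and P(0|1) from the problem statement
py0 = q*(1 - p10) + (1 - q)*p01;       % P(Y = 0)
py1 = 1 - py0;                         % P(Y = 1)
Hy  = -py0.*log2(py0) - py1.*log2(py1);                  % output entropy H(Y)
Hyx = q*(-(1-p10)*log2(1-p10) - p10*log2(p10)) ...
    + (1 - q)*(-(1-p01)*log2(1-p01) - p01*log2(p01));    % conditional entropy H(Y|X)
I   = Hy - Hyx;                        % mutual information I(X;Y)
plot(q, I), xlabel('P(X=0)'), ylabel('I(X;Y) (bits)')
[Cap, idx] = max(I);                   % capacity = maximum of I(X;Y) over input distributions
fprintf('Estimated capacity %.4f bits at P(X=0) = %.3f\n', Cap, q(idx));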

