
计算机代写|COMP2610/COMP6261 Information Theory Assignment 3 Semester 2 2022

This is a computer science assignment from Australia that asks students to derive and prove results for a number of information theory problems.

 

Question 1: Entropy and Joint Entropy [10 marks total]

**All students are expected to attempt this question.**

An ordinary deck of cards containing 13 clubs, 13 diamonds, 13 hearts, and 13 spades is shuffled and dealt out one card at a time without replacement. Let Xi be the suit of the ith card.

(a) Determine H(X1). [4 marks]

(b) Determine H(X1, X2, ···, X52). [6 marks]
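
For a quick numerical sanity check on entropies like these, here is a minimal Python sketch of a discrete-entropy helper; the uniform distribution over the four suits is used purely as an illustration.

```python
from math import log2

def entropy(pmf):
    """Shannon entropy (in bits) of a discrete distribution given as a list of probabilities."""
    return -sum(p * log2(p) for p in pmf if p > 0)

# The suit of the first card dealt from a full, shuffled deck is uniform over the 4 suits.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits
```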

Question 2: Source Coding [30 marks total]

Question 2-I [6 marks total]

**All students are expected to attempt this question.**

Consider the code {0,01,011}.

(a) Is it instantaneous? [2 marks]

(b) Is it uniquely decodable? [2 marks]

(c) Is it nonsingular? [2 marks]
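
For part (a), the prefix (instantaneous) condition can be checked mechanically; a minimal sketch follows, with the code list written out explicitly. Parts (b) and (c) still need an argument (for example, exhibiting or ruling out an ambiguous concatenation), which a one-line check does not provide.

```python
from itertools import combinations

code = ["0", "01", "011"]

# Instantaneous (prefix-free) iff no codeword is a prefix of another codeword.
prefix_free = not any(a.startswith(b) or b.startswith(a) for a, b in combinations(code, 2))
print("prefix-free:", prefix_free)
```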

Question 2-II [12 marks total]

**All students are expected to attempt this question.**

Construct a binary Huffman code and Shannon code (not Shannon-Fano-Elias code) for the following distribution on 5 symbols p = (0.3,0.3,0.2,0.1,0.1). What is the average length of these codes?
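
As a cross-check for this question, here is a short sketch (function names are illustrative, not prescribed) that computes binary Huffman codeword lengths by repeatedly merging the two least likely nodes, Shannon codeword lengths ⌈log2(1/p)⌉, and the resulting average lengths.

```python
import heapq
from math import ceil, log2

def huffman_lengths(pmf):
    """Binary Huffman codeword lengths via repeated merging of the two least likely nodes."""
    heap = [(p, [i]) for i, p in enumerate(pmf)]
    heapq.heapify(heap)
    lengths = [0] * len(pmf)
    while len(heap) > 1:
        p1, s1 = heapq.heappop(heap)
        p2, s2 = heapq.heappop(heap)
        for i in s1 + s2:          # every symbol under the merged node gains one bit
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, s1 + s2))
    return lengths

def shannon_lengths(pmf):
    """Shannon code lengths: l_i = ceil(log2(1 / p_i))."""
    return [ceil(log2(1 / p)) for p in pmf]

p = [0.3, 0.3, 0.2, 0.1, 0.1]
for name, ls in [("Huffman", huffman_lengths(p)), ("Shannon", shannon_lengths(p))]:
    print(name, ls, "average length =", sum(pi * li for pi, li in zip(p, ls)))
```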

Question 2-III-A [For COMP2610 Students Only] [12 marks total]

**Only COMP2610 students are expected to attempt this question.**

Consider the random variable

X = ( x1    x2    x3    x4    x5    x6    x7
      0.49  0.26  0.12  0.04  0.04  0.03  0.02 )

(a) Find a binary Huffman code for X. [4 marks]

(b) Find the expected codelength for this encoding. [3 marks]

(c) Find a ternary Huffman code for X. [5 marks]
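
For the ternary part, one common construction pads dummy zero-probability symbols so that every merge combines exactly D nodes; a hedged sketch of that D-ary variant (again returning lengths only) is below.

```python
import heapq

def dary_huffman_lengths(pmf, D=3):
    """D-ary Huffman codeword lengths; pads dummy zero-probability symbols so that
    the number of leaves satisfies (count - 1) mod (D - 1) == 0 and every merge is D-way."""
    nodes = [(p, [i]) for i, p in enumerate(pmf)]
    while (len(nodes) - 1) % (D - 1) != 0:
        nodes.append((0.0, []))              # dummy symbol, never assigned a real codeword
    heapq.heapify(nodes)
    lengths = [0] * len(pmf)
    while len(nodes) > 1:
        merged_p, merged_syms = 0.0, []
        for _ in range(D):                   # merge the D least likely nodes
            p, syms = heapq.heappop(nodes)
            merged_p += p
            merged_syms += syms
        for i in merged_syms:                # each symbol below the merge gains one digit
            lengths[i] += 1
        heapq.heappush(nodes, (merged_p, merged_syms))
    return lengths

p = [0.49, 0.26, 0.12, 0.04, 0.04, 0.03, 0.02]
print(dary_huffman_lengths(p, D=2))  # binary lengths
print(dary_huffman_lengths(p, D=3))  # ternary lengths
```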

Question 2-III-B [For COMP6261 Students Only] [12 marks total]

**Only COMP6261 students are expected to attempt this question.**

A random variable X takes on three values a, b, and c with probabilities 0.55, 0.25, and 0.2, respectively.

(a) What are the lengths of the binary Huffman codewords for X? What are the lengths of the binary Shannon codewords for X? [4 marks]

(b) What is the smallest integer D such that the expected Shannon codeword length with a D-ary alphabet equals the expected Huffman codeword length with a D-ary alphabet? [3 marks]

(c) Here X1 and X2 are independent of each other, and each takes on the three values a, b, and c with probabilities 0.55, 0.25, and 0.2. We define Y = X1X2, e.g., Y = ab if X1 = a and X2 = b. Find the binary Huffman codewords for Y. [5 marks]
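
For part (c), the nine-symbol distribution of Y can be tabulated directly from independence, p(Y = uv) = p(X1 = u) p(X2 = v); a minimal sketch (symbol labels are illustrative) follows.

```python
from itertools import product

px = {"a": 0.55, "b": 0.25, "c": 0.2}

# Y = X1 X2 with X1, X2 independent, so p(Y = uv) = p(u) * p(v).
py = {u + v: pu * pv for (u, pu), (v, pv) in product(px.items(), repeat=2)}

for symbol, prob in sorted(py.items(), key=lambda kv: -kv[1]):
    print(symbol, round(prob, 4))
```

These nine probabilities can then be fed into any binary Huffman construction, such as the length sketch given under Question 2-II.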

Question 3: Channel Capacity [30 marks total]

Question 3-I [20 marks total]

**All students are expected to attempt this question.**

There is a discrete memoryless channel (DMC) with channel input X taking values in {1, 2, 3, 4}. The channel output Y follows the probabilistic rule below.

Y = X with probability 1/2, and Y = X + 2 with probability 1/2.

Answer the following questions.

(a) Draw the schematic of the channel and clearly show possible channel outputs and the channel transition probabilities. [5 marks]

(b) Write the mutual information I(X;Y) as a function of the most general input probability distribution. [10 marks]

(c) Find a way of using only a subset of the channel inputs so that the channel becomes a noiseless channel, and determine the maximum mutual information (you need to quantify its value) that can then be achieved with zero error. [5 marks]
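
Independently of the specific channel above, I(X;Y) for any DMC can be evaluated numerically from a transition matrix with rows p(y|x) and an input pmf; a small sketch follows. The matrix in the example call is a placeholder (a binary symmetric channel), not the channel in this question.

```python
import numpy as np

def mutual_information(px, P):
    """I(X;Y) in bits for an input pmf px (shape [K]) and channel matrix P (shape [K, J], rows p(y|x))."""
    px = np.asarray(px, dtype=float)
    P = np.asarray(P, dtype=float)
    py = px @ P                              # output distribution p(y)
    joint = px[:, None] * P                  # joint distribution p(x, y)
    with np.errstate(divide="ignore", invalid="ignore"):
        ratio = np.where(joint > 0, joint / (px[:, None] * py[None, :]), 1.0)
    return float(np.sum(np.where(joint > 0, joint * np.log2(ratio), 0.0)))

# Placeholder example: a binary symmetric channel with crossover probability 0.1 and uniform input.
print(mutual_information([0.5, 0.5], [[0.9, 0.1], [0.1, 0.9]]))
```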

Question 3-II [10 marks total]

**All students are expected to attempt this question.**

The Z-channel has binary input and output alphabets and transition probabilities p(y|x) given by the following matrix:

p(y|x) = [ 1    0
           1/3  2/3 ],   x, y ∈ {0, 1}

Find the capacity of the Z-channel and the maximizing input probability distribution.
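
Assuming the matrix above is read with rows indexed by x and columns by y (so P(Y = 0 | X = 1) = 1/3), here is a brute-force sketch that scans P(X = 1) to locate the capacity-achieving input distribution numerically; the question itself expects the closed-form maximization.

```python
from math import log2

def I_z_channel(q, eps=1/3):
    """I(X;Y) for the Z-channel with P(Y=0|X=1) = eps, where q = P(X = 1)."""
    def h(p):  # binary entropy in bits
        return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)
    # Y = 1 only when X = 1 and no flip occurs, so P(Y = 1) = q * (1 - eps).
    return h(q * (1 - eps)) - q * h(eps)

best_q = max((i / 10000 for i in range(10001)), key=I_z_channel)
print(best_q, I_z_channel(best_q))
```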

Question 4: Joint Typical Sequences [30 marks total]

Question 4-I [15 marks total]

**All students are expected to attempt this question.**

Let (x^n, y^n, z^n) be drawn according to the joint distribution p(x, y, z) in an independent and identically distributed (i.i.d.) manner. We say that (x^n, y^n, z^n) is jointly ε-typical if all of the following conditions are met:

  • |H̃(x^n) − H(X)| ≤ ε
  • |H̃(y^n) − H(Y)| ≤ ε
  • |H̃(z^n) − H(Z)| ≤ ε
  • |H̃(x^n, y^n) − H(X,Y)| ≤ ε
  • |H̃(x^n, z^n) − H(X,Z)| ≤ ε
  • |H̃(y^n, z^n) − H(Y,Z)| ≤ ε
  • |H̃(x^n, y^n, z^n) − H(X,Y,Z)| ≤ ε

where H̃(x^n) = −(1/n) log2 p(x^n). Now suppose that (x̃^n, ỹ^n, z̃^n) is drawn i.i.d. according to the marginals p(x), p(y), and p(z). Therefore, (x̃^n, ỹ^n, z̃^n) has the same marginals as p(x^n, y^n, z^n), but its three components are mutually independent. Find upper and lower bounds on the probability that (x̃^n, ỹ^n, z̃^n) is jointly typical, in terms of H(X,Y,Z), H(X), H(Y), H(Z), ε, and n.
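
Purely to illustrate the definition (this does not replace the requested analytic bounds), here is a Monte Carlo sketch that draws (x̃^n, ỹ^n, z̃^n) from the product of the marginals of an assumed toy joint pmf and counts how often all seven conditions hold.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy joint pmf p(x, y, z) over binary X, Y, Z (an assumption purely for illustration).
p = np.array([[[0.15, 0.05], [0.10, 0.10]],
              [[0.05, 0.20], [0.05, 0.30]]])
n, eps, trials = 200, 0.1, 2000

def H(pmf):
    pmf = pmf[pmf > 0]
    return -np.sum(pmf * np.log2(pmf))

px, py, pz = p.sum((1, 2)), p.sum((0, 2)), p.sum((0, 1))
pxy, pxz, pyz = p.sum(2), p.sum(1), p.sum(0)

def emp(seq_probs):
    # empirical entropy -(1/n) log2 of the probability of the observed sequence
    return -np.mean(np.log2(seq_probs))

hits = 0
for _ in range(trials):
    # draw the three sequences independently from the marginals
    x = rng.choice(2, size=n, p=px)
    y = rng.choice(2, size=n, p=py)
    z = rng.choice(2, size=n, p=pz)
    conds = [
        abs(emp(px[x]) - H(px)) <= eps,
        abs(emp(py[y]) - H(py)) <= eps,
        abs(emp(pz[z]) - H(pz)) <= eps,
        abs(emp(pxy[x, y]) - H(pxy)) <= eps,
        abs(emp(pxz[x, z]) - H(pxz)) <= eps,
        abs(emp(pyz[y, z]) - H(pyz)) <= eps,
        abs(emp(p[x, y, z]) - H(p)) <= eps,
    ]
    hits += all(conds)

# With these settings the empirical frequency is typically 0, i.e. the event is very rare.
print("empirical P(jointly typical) ~", hits / trials)
```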

Question 4-II [15 marks total]

**All students are expected to attempt this question.**

Let p = [0.43, 0.32, 0.25] be the distribution of a random variable X that takes symbols from {a, b, c}, respectively.

(a) Find the empirical entropy of the i.i.d. sequence

x = aabaabbcabaccab  [5 marks]

(Hint: the empirical entropy is H̃(x^n) = −(1/n) log2 p(x^n).)

(b) Determine whether it is an ε-typical sequence with ε = 0.05. [5 marks]
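
Using the hint's formula H̃(x^n) = −(1/n) log2 p(x^n), here is a short sketch that evaluates the empirical entropy of the given sequence and applies the ε-typicality test; treat the printed numbers as a sanity check rather than the required working.

```python
from math import log2

p = {"a": 0.43, "b": 0.32, "c": 0.25}
x = "aabaabbcabaccab"

# Empirical entropy of an i.i.d. sequence: H~(x^n) = -(1/n) log2 p(x^n) = -(1/n) * sum_i log2 p(x_i).
H_emp = -sum(log2(p[s]) for s in x) / len(x)

# Source entropy H(X) and the epsilon-typicality test |H~(x^n) - H(X)| <= eps.
H_X = -sum(q * log2(q) for q in p.values())
eps = 0.05
print(H_emp, H_X, abs(H_emp - H_X) <= eps)
```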

(c) Now assume the following joint probability distribution between X and Y, which take symbols from {a, b, c} and {d, e, f}, respectively:

p(x, y) = [ 0.2   0.08  0.15
            0.1   0.15  0.07
            0.1   0.1   0.05 ]

where each row corresponds to a fixed value of x (rows x = a, b, c; columns y = d, e, f). We observe two i.i.d. sequences

x = aabaabbcabaccab

y = dfffdfedddeefdd

Determine whether (x,y) are jointly ε-typical. [5 marks]
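
A companion sketch for part (c), assuming the row/column reading of the table stated above (rows x = a, b, c; columns y = d, e, f) and the same ε = 0.05 as in part (b): it checks |H̃(x^n) − H(X)| ≤ ε, |H̃(y^n) − H(Y)| ≤ ε, and |H̃(x^n, y^n) − H(X,Y)| ≤ ε.

```python
from math import log2

pxy = {("a", "d"): 0.20, ("a", "e"): 0.08, ("a", "f"): 0.15,
       ("b", "d"): 0.10, ("b", "e"): 0.15, ("b", "f"): 0.07,
       ("c", "d"): 0.10, ("c", "e"): 0.10, ("c", "f"): 0.05}
x = "aabaabbcabaccab"
y = "dfffdfedddeefdd"
eps = 0.05  # same epsilon as in part (b)

# Marginals of the joint table.
px = {a: sum(p for (u, _), p in pxy.items() if u == a) for a in "abc"}
py = {d: sum(p for (_, v), p in pxy.items() if v == d) for d in "def"}

def emp(seq, dist):      # empirical entropy -(1/n) log2 p(seq) under an i.i.d. model `dist`
    return -sum(log2(dist[s]) for s in seq) / len(seq)

def ent(dist):           # entropy of a pmf given as a dict
    return -sum(p * log2(p) for p in dist.values())

checks = [abs(emp(x, px) - ent(px)) <= eps,
          abs(emp(y, py) - ent(py)) <= eps,
          abs(emp(list(zip(x, y)), pxy) - ent(pxy)) <= eps]
print(checks, all(checks))
```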
