Channel Capacity

32. Random “20” questions
Let $X$ be uniformly distributed over $\{1, 2, \ldots, m\}$. Assume $m = 2^n$. We ask random questions: Is $X \in S_1$? Is $X \in S_2$? ... until only one integer remains. All $2^m$ subsets $S$ of $\{1, 2, \ldots, m\}$ are equally likely.

(a) How many deterministic questions are needed to determine $X$?

(b) Without loss of generality, suppose that $X = 1$ is the random object. What is the probability that object 2 yields the same answers for $k$ questions as object 1?

(c) What is the expected number of objects in $\{2, 3, \ldots, m\}$ that have the same answers to the questions as does the correct object 1?

(d) Suppose we ask $n + \sqrt{n}$ random questions. What is the expected number of wrong objects agreeing with the answers?

(e) Use Markov's inequality $\Pr\{X \ge t\mu\} \le \frac{1}{t}$ to show that the probability of error (one or more wrong objects remaining) goes to zero as $n \to \infty$.

Solution: Random “20” questions. (Repeat of Problem 5.45)

(a) Since $X$ is uniform over $m = 2^n$ equally likely values, $H(X) = n$ bits and the Huffman codewords for $X$ are all of length $n$. Hence, with $n$ deterministic questions, we can identify an object out of $2^n$ candidates.
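
In symbols, the counting behind this answer (a standard restatement, added here for clarity):

```latex
% X is uniform over m = 2^n values; each yes/no answer reveals at most
% one bit, so n questions are necessary, and bisection (equivalently, a
% Huffman code for a dyadic uniform source) shows n questions suffice.
\[
  H(X) = \log_2 m = \log_2 2^n = n \ \text{bits}.
\]
```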

(b) Observe that the total number of subsets which include both object 1 and object 2, or neither of them, is $2^{m-1}$. Hence, the probability that object 2 yields the same answers for $k$ questions as object 1 is $(2^{m-1}/2^m)^k = 2^{-k}$.

From a more information-theoretic viewpoint, we can view this problem as a channel coding problem over a noiseless channel. Since all subsets are equally likely, the probability that object 1 is in a specific random subset is $1/2$. Hence, the question of whether object 1 belongs to the $k$th subset corresponds to the $k$th bit of the random codeword for object 1, where the codewords $X^k$ are Bern($1/2$) random $k$-sequences.

Object    Codeword
1         0110...1
2         0010...0
...       ...

Now we observe a noiseless output $Y^k$ of $X^k$ and figure out which object was sent. By the same line of reasoning as in the achievability proof of the channel coding theorem, i.e., joint typicality, it is clear that the probability that object 2 has the same codeword as object 1 is $2^{-k}$.
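
As a quick sanity check of the $2^{-k}$ agreement probability, here is a minimal Monte Carlo sketch (illustrative only; the function name and trial count are my own choices, not from the manual):

```python
import random

def agreement_probability(k: int, trials: int = 200_000) -> float:
    """Estimate Pr{object 2 gives the same k answers as object 1}.

    Each question "Is X in S?" uses a uniformly random subset S of
    {1, ..., m}, so the memberships of objects 1 and 2 in S are
    independent fair coin flips; they agree on a single question
    (both in S, or both out) with probability 1/2.
    """
    agree = 0
    for _ in range(trials):
        # Object 2 matches object 1 only if it agrees on all k questions.
        if all(random.getrandbits(1) == random.getrandbits(1) for _ in range(k)):
            agree += 1
    return agree / trials

if __name__ == "__main__":
    for k in (1, 3, 5):
        print(f"k = {k}: simulated {agreement_probability(k):.4f}, "
              f"exact 2^-k = {2 ** -k:.4f}")
```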

(c) Let
\[
  1_j =
  \begin{cases}
    1, & \text{object } j \text{ yields the same answers for } k \text{ questions as object } 1, \\
    0, & \text{otherwise,}
  \end{cases}
\]
for $j = 2, \ldots, m$.
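
The excerpt cuts off here. Writing $N = \sum_{j=2}^{m} 1_j$ for the number of wrong objects agreeing with object 1, the standard argument finishes as follows (my sketch of how parts (c)-(e) conclude, not the manual's verbatim text):

```latex
% (c) Linearity of expectation with Pr{1_j = 1} = 2^{-k} from part (b):
\[
  E[N] = E\Big[\sum_{j=2}^{m} 1_j\Big] = \sum_{j=2}^{m} \Pr\{1_j = 1\}
       = (m - 1)\,2^{-k}.
\]
% (d) With m = 2^n objects and k = n + sqrt(n) questions:
\[
  E[N] = (2^n - 1)\,2^{-(n + \sqrt{n})} \le 2^{-\sqrt{n}}.
\]
% (e) Markov's inequality, taking t*mu = 1, bounds the error probability:
\[
  \Pr\{N \ge 1\} \le E[N] \le 2^{-\sqrt{n}} \longrightarrow 0
  \quad \text{as } n \to \infty.
\]
```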
