Entropy is a measure of
(a) Amount of information at the output
(b) Amount of information that can be transmitted
(c) Number of error bits from total number of bits
(d) None of the mentioned
I was asked this question in my homework.
It comes from the Shannon–Hartley Theorem and Turbo Codes topic in the Channel Coding section of Digital Communications.
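For reference, the entropy H(X) of a discrete source quantifies the average information per symbol at the source output, H(X) = -Σ p(x) log2 p(x) bits/symbol. A minimal sketch in Python, assuming a hypothetical list `probs` of symbol probabilities that sum to 1:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)) in bits per symbol.

    `probs` is assumed to be a list of symbol probabilities summing to 1;
    zero-probability symbols contribute nothing and are skipped.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair binary source attains the maximum entropy of 1 bit/symbol.
print(entropy([0.5, 0.5]))   # 1.0
# A biased source conveys less average information per symbol.
print(entropy([0.9, 0.1]))   # ~0.469
```

Note this measures the information content of the source itself, not the channel capacity (which is what the Shannon–Hartley theorem bounds) or the bit error count.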