1. Information Theory and Entropy

    The way I think of Shannon’s information-theoretic concept of entropy is as “uncertainty.” Put another way, Shannon’s entropy (one of many kinds of entropy) is a measure of how resistant “information content” is to compression: more entropy/uncertainty means less redundancy, and consequently less compression …
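
    To make that compression reading concrete, here is a minimal Python sketch; the name shannon_entropy and the test strings are illustrative choices, not from the post. It computes the empirical entropy H = -Σ p(x) log2 p(x) of a byte string in bits per byte, then compares how well zlib compresses a highly redundant string versus one with no redundancy: the high-entropy input barely compresses at all.

        import math
        import zlib
        from collections import Counter

        def shannon_entropy(data: bytes) -> float:
            # Empirical entropy in bits per byte: H = -sum p(x) * log2 p(x)
            counts = Counter(data)
            n = len(data)
            return -sum((c / n) * math.log2(c / n) for c in counts.values())

        low = b"a" * 256          # maximally redundant: one symbol repeated
        high = bytes(range(256))  # every byte value once: no redundancy

        for label, s in [("low entropy ", low), ("high entropy", high)]:
            h = shannon_entropy(s)
            z = len(zlib.compress(s))
            print(f"{label}: {h:.2f} bits/byte, zlib {len(s)} -> {z} bytes")

    Running this, the repeated-byte string has entropy 0 bits/byte and shrinks to a handful of bytes, while the uniform string sits at 8 bits/byte and zlib actually makes it slightly larger: no redundancy, nothing to compress.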


