# OEF Entropy
--- Introduction ---

This module currently contains 19 exercises on entropy in information
theory.

### Decimals-3

Compute the binary entropy of the following distribution of 3 probabilities:

{ , , }

### Decimals-4

Compute the binary entropy of the following distribution of 4 probabilities:

{ , , , }

### Decimals-5

Compute the binary entropy of the following distribution of 5 probabilities:

{ , , , , }
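The probability values in these exercises are generated interactively by WIMS and do not appear here. As a sketch with placeholder probabilities (not values from any actual exercise), the binary entropy H = −Σ p·log2(p) of a distribution can be computed like this:

```python
import math

def binary_entropy(probs):
    """Binary (base-2) entropy H = -sum(p * log2(p)) of a distribution.

    Terms with p == 0 contribute nothing (the limit of p*log p is 0)."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Placeholder distribution of 3 probabilities:
print(binary_entropy([0.5, 0.25, 0.25]))  # 1.5
```

The same function handles the 4- and 5-probability variants; only the length of the list changes.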

### Given entropy - 2

Please find a distribution of probabilities on a system of two elements {A,B}, such that the binary entropy of the system is equal to . The two probabilities P(A), P(B) must be positive, and their sum must be equal to 1.

### Given entropy - 3

Please find a distribution of probabilities on a system of three elements {A,B,C}, such that the binary entropy of the system is equal to . The three probabilities P(A), P(B), P(C) must be positive, and their sum must be equal to 1.

### Given entropy - 4

Please find a distribution of probabilities on a system of four elements {A,B,C,D}, such that the binary entropy of the system is equal to . The four probabilities P(A), P(B), P(C), P(D) must be positive, and their sum must be equal to 1.
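For the two-element case, the entropy H(p) = −p·log2(p) − (1−p)·log2(1−p) is strictly increasing on (0, 1/2], so a bisection finds the unique p ≤ 1/2 hitting any target entropy in (0, 1]. A sketch, with 0.5 as a placeholder target (the real target is injected by WIMS); for three or four elements one can, for instance, split one of the two resulting masses further while adjusting the target:

```python
import math

def h2(p):
    """Binary entropy of the two-point distribution {p, 1-p}."""
    if p <= 0 or p >= 1:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def two_point_for_entropy(h, tol=1e-12):
    """Find p in (0, 1/2] with h2(p) == h by bisection.

    h2 rises strictly from 0 to 1 on (0, 1/2], so every target
    0 < h <= 1 has exactly one solution in that interval."""
    assert 0 < h <= 1
    lo, hi = 0.0, 0.5
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if h2(mid) < h:
            lo = mid
        else:
            hi = mid
    p = (lo + hi) / 2
    return p, 1 - p

pA, pB = two_point_for_entropy(0.5)       # placeholder target entropy
print(round(pA, 5), round(h2(pA), 5))
```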

### Existence of entropy - 3

Is there a system of 3 elements with a binary entropy equal to ?

### Existence of entropy

Is there a system of elements with a binary entropy equal to ?
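The existence question reduces to a range check: on n elements the binary entropy is maximized at log2(n) (uniform distribution), and with all probabilities positive it can be made arbitrarily close to 0 but never equal to 0. A sketch of the check, with placeholder values:

```python
import math

def entropy_exists(n, h):
    """Whether some distribution on n elements, all probabilities
    positive, has binary entropy h.

    By continuity the achievable entropies form the interval
    (0, log2(n)], attained at the top by the uniform distribution."""
    return 0 < h <= math.log2(n)

print(entropy_exists(3, 1.2))   # True:  1.2 <= log2(3) ~ 1.585
print(entropy_exists(3, 1.7))   # False: exceeds log2(3)
```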

### Computer file

A computer file weighs bytes. The file contains bytes of only 4 distinct values, as shown in the following table.

| Decimal value | | | | |
|---|---|---|---|---|
| Binary value | | | | |
| Number of bytes | | | | |

Compute the binary entropy of the file, both according to the counts of bits and according to the counts of bytes.
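The table's counts are generated by WIMS; with a hypothetical count table, the per-byte entropy and the file's total information content follow directly from the definition:

```python
import math

def file_entropy(counts):
    """Per-symbol binary entropy from a table of occurrence counts."""
    total = sum(counts)
    return -sum(c / total * math.log2(c / total) for c in counts if c > 0)

# Hypothetical table: 4 byte values occurring this many times (1000 bytes).
counts = [500, 250, 125, 125]
H = file_entropy(counts)        # entropy in bits per byte
print(H)                        # 1.75
print(H * sum(counts))          # total information content in bits
```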

### Computer file II

A computer file weighs bytes. The file contains bytes of only 6 distinct values, as shown in the following table. The binary entropy of the file is
according to the counts of bytes.

By recoding the bytes of the file with an optimal binary code of variable length, one can reduce the size of the file to
bytes (not counting any headers).
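An optimal binary code of variable length is a Huffman code. A sketch with a hypothetical count table (the exercise's actual counts are generated by WIMS): build the code by repeatedly merging the two lightest groups, then sum count × codeword length to get the compressed size.

```python
import heapq
import math

def huffman_lengths(counts):
    """Codeword lengths of an optimal (Huffman) binary code for the
    given symbol counts, returned in the same order as counts."""
    n = len(counts)
    # Heap entries: (weight, unique tiebreak, tuple of symbol indices).
    heap = [(c, i, (i,)) for i, c in enumerate(counts)]
    heapq.heapify(heap)
    lengths = [0] * n
    tie = n
    while len(heap) > 1:
        w1, _, s1 = heapq.heappop(heap)
        w2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:
            lengths[i] += 1          # each merge adds one bit for members
        heapq.heappush(heap, (w1 + w2, tie, s1 + s2))
        tie += 1
    return lengths

# Hypothetical table: 6 byte values with these occurrence counts.
counts = [40, 30, 15, 10, 3, 2]
lengths = huffman_lengths(counts)
bits = sum(c * l for c, l in zip(counts, lengths))
print(math.ceil(bits / 8))           # compressed size in bytes, no headers
```

By the source coding theorem, the resulting size in bits is at least the total count times the per-byte entropy, which is why it sits close to the entropy computed in the exercise.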

### Color image

A color image of × pixels is composed of 5 colors: black, white, red, green, blue. The number of pixels of each color in the image is given below.

| Color | Black | White | Red | Green | Blue |
|---|---|---|---|---|---|
| Pixels | | | | | |

Compute the binary entropy of the image according to the counts of pixels.

### Max of entropy - 3

We have a system of distribution of probabilities on 3 elements {A,B,C}. Given that the probability P(A) = , what are the maximum and the minimum of the binary entropy of the system?
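With P(A) fixed, only the split of the leftover mass 1 − P(A) between B and C varies. The entropy is largest when that mass is split evenly and smallest in the limit where it all goes to one element (an infimum if probabilities must stay strictly positive). A sketch, with 0.5 as a placeholder for the given P(A):

```python
import math

def plogp(p):
    return -p * math.log2(p) if p > 0 else 0.0

def entropy(probs):
    return sum(plogp(p) for p in probs)

def entropy_range_given_pa(a):
    """Max and min binary entropy over {A,B,C} with P(A) = a.

    Maximum: leftover mass 1-a split evenly between B and C.
    Minimum: leftover mass piled on one element (a limit case
    when all probabilities must be strictly positive)."""
    rest = 1 - a
    h_max = entropy([a, rest / 2, rest / 2])
    h_min = entropy([a, rest])       # limit case P(C) -> 0
    return h_max, h_min

h_max, h_min = entropy_range_given_pa(0.5)   # placeholder P(A)
print(round(h_max, 6), round(h_min, 6))      # 1.5 1.0
```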

### Max of entropy II - 3

We have a system of distribution of probabilities on 3 elements {A,B,C}. Given that the probability P() = and that P(A) P(B) P(C) , what are the maximum and the minimum of the binary entropy of the system?

### Rationals-3

Compute the binary entropy of the following distribution of 3 probabilities:

{ }

### Rationals-4

Compute the binary entropy of the following distribution of 4 probabilities:

{ }

### Rationals-5

Compute the binary entropy of the following distribution of 5 probabilities:

{ }

### Rationals-6

Compute the binary entropy of the following distribution of 6 probabilities:

{ }

### Conditional existence - 3

Is there a system of 3 elements {A,B,C}, such that P(A) = , and that the binary entropy is equal to ?

### Conditional existence - 4

Is there a system of 4 elements {A,B,C,D}, such that P(A) = , P(B) = , and that the binary entropy is equal to ?
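The conditional existence questions combine the two previous ideas: with some probabilities fixed, the remaining mass m contributes between −m·log2(m) (all on one element, a limit case with strictly positive probabilities) and −m·log2(m) + m·log2(k) (split evenly over the k free elements), and the target entropy must land in that interval. A sketch with placeholder values:

```python
import math

def plogp(p):
    return -p * math.log2(p) if p > 0 else 0.0

def conditional_entropy_exists(fixed, n, h):
    """Whether a distribution on n elements whose first probabilities
    equal `fixed` can have binary entropy h.

    Writing the free probabilities as m*r_i with sum(r_i) = 1, their
    entropy contribution is -m*log2(m) + m*H(r), and H(r) ranges
    continuously over [0, log2(k)] for k free elements."""
    m = 1 - sum(fixed)
    k = n - len(fixed)                  # number of free elements
    base = sum(plogp(p) for p in fixed) + plogp(m)
    h_min = base                        # free mass piled on one element
    h_max = base + m * math.log2(k)     # free mass split uniformly
    return h_min <= h <= h_max

# Placeholder values: 3 elements with P(A) = 0.5.
print(conditional_entropy_exists([0.5], 3, 1.2))   # True
print(conditional_entropy_exists([0.5], 3, 1.7))   # False
```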

- Description: collection of exercises on entropy. WIMS site
- Keywords: interactive mathematics, interactive math, server side interactivity, information theory, probability, Shannon, code