Information theory
Information theory studies the quantification, storage, and communication of information. We introduce the concepts of information entropy and information gain, which are used to construct a decision tree with the ID3 algorithm.
Information entropy
The information entropy of any given piece of data is a measure of the smallest amount of information necessary to represent an item from that data. Its units are familiar - bits, bytes, kilobytes, and so on. The lower the information entropy, the more regular the data is and the more patterns it contains; thus, less information is required to represent it. This is how compression tools can take large text files and shrink them to a much smaller size: words and expressions in the text keep recurring, forming patterns.
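The idea above can be sketched with a small Python function (a minimal illustration, assuming Shannon entropy over the symbol frequencies of a string; the function name shannon_entropy is our own choice): regular, repetitive data yields fewer bits per symbol than varied data.

```python
from collections import Counter
from math import log2

def shannon_entropy(data):
    """Average number of bits needed per symbol in `data` (Shannon entropy)."""
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A repetitive string has low entropy; a string of distinct symbols has high entropy.
print(shannon_entropy("abababababab"))  # 1.0 bit per symbol (only two symbols)
print(shannon_entropy("abcdefghijkl"))  # ~3.58 bits per symbol (twelve symbols)
```

This matches the compression intuition: the first string, with only two recurring symbols, could be stored in 1 bit per character, while the second needs almost four times as many.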
Coin flipping
Imagine we flip an unbiased coin. We would like to know whether the result is heads or tails. How much information do we need to represent...