# Decision Tree Induction and Entropy in Data Mining

## Decision Tree Induction

A decision tree is a tree-like structure that consists of the following parts (shown in Figure 1):

1. Root node:
• Age is the root node.
2. Branches:
• The branches are the attribute values:
• <20, 21…50, >50 (for Age)
• USA, PK (for Region)
• High, Low (for Credit Rating)
3. Leaf nodes:
• The leaf nodes are the class labels:
• Yes
• No
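Since Figure 1 is not reproduced here, the following is a minimal Python sketch of how such a tree could be represented as nested dictionaries; the `tree` and `classify` names are our own, and the leaf labels under Credit Rating are an assumption, not taken from the figure:

```python
# Internal nodes are {"attribute": ..., "branches": {...}}; leaves are "Yes"/"No".
tree = {
    "attribute": "Age",
    "branches": {
        "<20":    {"attribute": "Region",
                   "branches": {"USA": "No", "PK": "Yes"}},
        "21...50": "Yes",
        ">50":    {"attribute": "Credit Rating",   # leaf labels assumed here
                   "branches": {"High": "No", "Low": "Yes"}},
    },
}

def classify(tree, sample):
    """Follow branches until a leaf ("Yes"/"No") is reached."""
    while isinstance(tree, dict):
        tree = tree["branches"][sample[tree["attribute"]]]
    return tree

print(classify(tree, {"Age": "<20", "Region": "PK"}))  # Yes
```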

## Entropy

Entropy is a measure of uncertainty.

• For a two-class problem, entropy ranges between 0 and 1.
• High entropy means the data are more mixed (more impure).
• Low entropy means the data are less mixed (more pure).

P = total Yes tuples = 9

N = total No tuples = 5

Info(P, N) = −P/(P+N) × log2(P/(P+N)) − N/(P+N) × log2(N/(P+N))

Note that to calculate the log2 of a number, we can divide its log by log(2).

For example, what is log2 of 0.643?

Ans: log(0.643) / log(2) ≈ −0.637

So:

Info(9, 5) = −9/14 × log2(9/14) − 5/14 × log2(5/14)

= −9/14 × log2(0.643) − 5/14 × log2(0.357)

= −9/14 × (−0.637) − 5/14 × (−1.485)

= 0.940
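To make the calculation concrete, here is a minimal Python sketch of the same formula; the helper name `info` is our own choice, not from any library:

```python
import math

def info(p, n):
    """Entropy Info(P, N) of a set with p Yes tuples and n No tuples."""
    total = p + n
    result = 0.0
    for count in (p, n):
        if count > 0:                 # treat 0 * log2(0) as 0
            fraction = count / total
            result -= fraction * math.log2(fraction)
    return result

print(round(info(9, 5), 3))  # 0.94
```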

For Age:

| Age | Pi (Yes) | Ni (No) | Info(Pi, Ni) |
|---|---|---|---|
| <20 | 2 | 3 | 0.970 |
| 21…50 | 4 | 0 | 0 |
| >50 | 3 | 2 | 0.970 |

Note: if Yes = 2 and No = 3, the entropy is 0.970, and it is the same 0.970 if Yes = 3 and No = 2, because entropy is symmetric in the two counts.

So once we have calculated the entropy for Age < 20 (2 Yes, 3 No), there is no need to recalculate it for Age > 50 (3 Yes, 2 No): the pair of counts is the same, so the entropy is also 0.970.

The gain of an attribute is the overall entropy minus the expected (weighted) entropy after splitting on that attribute:

Gain(Age) = Info(9, 5) − [5/14 × Info(2, 3) + 4/14 × Info(4, 0) + 5/14 × Info(3, 2)]

= 0.940 − [5/14 × 0.970 + 4/14 × 0 + 5/14 × 0.970]

= 0.940 − 0.693

= 0.247

| Attribute | Gain |
|---|---|
| Age | 0.247 |
| Income | 0.029 |
| Credit Rating | 0.048 |
| Region | 0.151 |

The gain of Age (0.247) is greater than the gains of Income, Credit Rating, and Region, so Age is selected as the root node.
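As a quick check, the sketch below extends the `info` helper defined above with a `gain` function (again our own names) and reproduces Gain(Age) from the branch counts in the table:

```python
def gain(total_p, total_n, branches):
    """Information gain = overall entropy minus expected entropy of branches.

    branches is a list of (p, n) counts, one pair per attribute value."""
    total = total_p + total_n
    expected = sum((p + n) / total * info(p, n) for p, n in branches)
    return info(total_p, total_n) - expected

# Branch counts for Age, taken from the table: <20, 21...50, >50
print(round(gain(9, 5, [(2, 3), (4, 0), (3, 2)]), 3))  # 0.247
```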

Note that:

• If the (Yes, No) counts have the form (0, any number) or (any number, 0), then the entropy is always 0, because the set is pure.
• If the counts occur as (3, 5) and (5, 3), both have the same entropy: entropy is symmetric in Yes and No.
• Entropy measures the impurity or uncertainty of the data.
• If a coin is fair (head and tail each have probability 1/2), uncertainty is at its maximum, because it is hardest to guess whether heads or tails will occur. If the coin has heads on both sides, the probability of heads is 1 and the entropy is 0.
• If p is equal to q, there is maximum uncertainty.
• If p is not equal to q, there is less uncertainty.
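These properties are easy to verify with the `info` helper from earlier:

```python
print(info(0, 4))                # 0.0  -> a pure set always has entropy 0
print(info(3, 5) == info(5, 3))  # True -> entropy is symmetric in Yes/No
print(info(7, 7))                # 1.0  -> equal counts give maximum uncertainty
```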

Now we repeat the same calculation inside each branch of Age that is not already pure. For the Age < 20 branch (2 Yes, 3 No), we calculate the entropy and gain for:

1. Income
2. Region
3. Credit Rating

For Income:

| Income | Pi (Yes) | Ni (No) | Info(Pi, Ni) |
|---|---|---|---|
| High | 0 | 2 | 0 |
| Medium | 1 | 1 | 1 |
| Low | 1 | 0 | 0 |

For Region:

| Region | Pi (Yes) | Ni (No) | Info(Pi, Ni) |
|---|---|---|---|
| USA | 0 | 3 | 0 |
| PK | 2 | 0 | 0 |

For Credit Rating:

| Credit Rating | Pi (Yes) | Ni (No) | Info(Pi, Ni) |
|---|---|---|---|
| Low | 1 | 2 | 0.918 |
| High | 1 | 1 | 1 |

Gain(Region) = Info(2, 3) − [3/5 × Info(0, 3) + 2/5 × Info(2, 0)] = 0.970 − 0 = 0.970

| Attribute | Gain |
|---|---|
| Region | 0.970 |
| Income | 0.570 |
| Credit Rating | 0.02 |

The gain of Region (0.970) is greater than the gains of Income and Credit Rating, so Region is selected as the splitting attribute for the Age < 20 branch.
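Up to rounding, the same `gain` helper reproduces these values when applied only to the five tuples of the Age < 20 branch:

```python
p, n = 2, 3   # Yes/No counts inside the Age < 20 branch

print(round(gain(p, n, [(0, 2), (1, 1), (1, 0)]), 3))  # Income:        0.571
print(round(gain(p, n, [(0, 3), (2, 0)]), 3))          # Region:        0.971
print(round(gain(p, n, [(1, 2), (1, 1)]), 3))          # Credit Rating: 0.02
```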

Similarly, you can repeat this process for the remaining branches until every branch ends in a leaf node.
