Entropy Using Decision Trees
- by Matt Clements
Train a decision tree on the data represented by attributes A1, A2, A3 and outcome C shown below:
A1  A2  A3  C
 1   0   1  0
 0   1   1  1
 0   0   1  0
Given -log2(1/3) = 1.6 and -log2(2/3) = 0.6, answer the following questions:
a) What is the value of entropy H for the given set of training examples?
b) What is the proportion of positive samples in each split produced by attribute A2?
c) What is the value of information gain, G(A2), of attribute A2?
d) What are the IF-THEN rule(s) for the decision tree?
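As a check on parts (a)-(c), the quantities can be computed directly from the table. The sketch below uses exact logarithms (math.log2), so H comes out as about 0.918 rather than the 0.93 obtained from the rounded values 1.6 and 0.6 given above; the function names here are illustrative, not from the original exercise.

```python
import math

def entropy(labels):
    # Shannon entropy of a list of class labels
    n = len(labels)
    if n == 0:
        return 0.0
    h = 0.0
    for c in set(labels):
        p = labels.count(c) / n
        h -= p * math.log2(p)
    return h

# Training data from the table: (A1, A2, A3, C)
data = [(1, 0, 1, 0),
        (0, 1, 1, 1),
        (0, 0, 1, 0)]

labels = [row[3] for row in data]
H = entropy(labels)  # entropy of the full training set

# Information gain of A2 (attribute index 1):
# subtract the size-weighted entropy of each split from H
gain_A2 = H
for v in (0, 1):
    subset = [row[3] for row in data if row[1] == v]
    gain_A2 -= len(subset) / len(data) * entropy(subset)
```

Since A2 = 1 selects exactly the positive example and A2 = 0 selects the two negatives, both splits are pure (entropy 0) and the gain of A2 equals H itself.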