Entropy using Decision Trees

Posted by Matt Clements on Stack Overflow. Published on 2010-05-30.

Train a decision tree on the data represented by attributes A1, A2, A3 and outcome C described below:

A1 A2 A3 C
1  0  1  0
0  1  1  1
0  0  1  0

Given log2(1/3) ≈ -1.6 and log2(2/3) ≈ -0.6, answer the following questions:

a) What is the value of the entropy H for the given set of training examples?

b) What is the proportion of positive samples in each split produced by attribute A2?

c) What is the value of information gain, G(A2), of attribute A2?

d) What are the IF-THEN rule(s) for the decision tree?
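The quantities asked for above follow directly from the standard definitions: H(S) = -Σ p·log2(p) over the class proportions, and G(A) = H(S) minus the weighted entropy of the subsets A induces. A minimal Python sketch (the data layout and function names are my own, not part of the question) that computes them for this training set:

```python
import math

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    ent = 0.0
    for c in set(labels):
        p = labels.count(c) / n
        ent -= p * math.log2(p)
    return ent

def info_gain(data, attr_index, class_index=3):
    """Information gain of the attribute at attr_index."""
    total = entropy([row[class_index] for row in data])
    n = len(data)
    remainder = 0.0
    for v in set(row[attr_index] for row in data):
        subset = [row[class_index] for row in data if row[attr_index] == v]
        remainder += len(subset) / n * entropy(subset)
    return total - remainder

# Training data, one tuple per row: (A1, A2, A3, C)
data = [(1, 0, 1, 0),
        (0, 1, 1, 1),
        (0, 0, 1, 0)]

# a) Entropy of the full set: 1 positive, 2 negative examples,
#    so H = -(1/3)log2(1/3) - (2/3)log2(2/3) ~= 0.918 bits
H = entropy([row[3] for row in data])

# c) Information gain of A2. Splitting on A2 gives two pure subsets
#    (A2=1 -> {C=1}, A2=0 -> {C=0, C=0}), so G(A2) equals H.
G_A2 = info_gain(data, attr_index=1)
print(H, G_A2)
```

With the rounded values given in the question (-log2(1/3) ≈ 1.6, -log2(2/3) ≈ 0.6), the hand calculation for (a) comes out to (1/3)(1.6) + (2/3)(0.6) ≈ 0.93 bits, close to the exact 0.918.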

