Weather decision tree source code
The information-gain method for constructing a decision tree.
First, calculate the entropy of each branch.
The "Sunny" branch contains 2 positive examples and 3 counterexamples, so the expected information (entropy) required is:
M(sunny) = -2/5 * log2(2/5) - 3/5 * log2(3/5)
         = 0.971 bits
The "Cloudy" branch contains 4 positive examples and 0 counterexamples, so no further information is needed:
M(cloudy) = 0 bits
The "Rain" branch contains 3 positive examples and 2 counterexamples:
M(rain) = -3/5 * log2(3/5) - 2/5 * log2(2/5)
        = 0.971 bits
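The branch entropies above can be sketched in Python. This is a minimal illustration, not the article's own code; the function name `entropy` and the positive/negative argument names are my choices:

```python
import math

def entropy(pos, neg):
    # Shannon entropy in bits of a branch with `pos` positive
    # examples and `neg` counterexamples:
    #   H = -p_pos * log2(p_pos) - p_neg * log2(p_neg)
    total = pos + neg
    h = 0.0
    for count in (pos, neg):
        if count > 0:  # a zero-count class contributes 0 (lim p*log2(p) = 0)
            p = count / total
            h -= p * math.log2(p)
    return h

# Branch entropies from the weather data above
print(round(entropy(2, 3), 3))  # Sunny: 2 positive, 3 counterexamples -> 0.971
print(round(entropy(4, 0), 3))  # Cloudy: 4 positive, 0 counterexamples -> 0.0
print(round(entropy(3, 2), 3))  # Rain: 3 positive, 2 counterexamples -> 0.971
```

The pure-branch case (Cloudy) is handled by skipping zero counts, since p * log2(p) tends to 0 as p approaches 0.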