A decision tree can be converted into a set of rules that are comprehensible to human beings. But if the resulting decision tree is very large, the rule set will be large too. Unfortunately, a larger tree does not mean a better expert system.
There are various tree pruning methods designed to reduce tree size and to increase the expected accuracy of the tree. Keep in mind that if you filter observations out by choosing one branch over another in the tree, the observations you did not choose are forever lost to that part of the tree.
But to directly answer your question: no, pruning does not always make the tree more general. If you construct a tree where all the decisions are exactly the same, then pruning changes nothing about its predictions.
The principle of parsimony actually argues in favor of pruning: among trees of comparable accuracy, the smaller one is almost always the better model. Both categorical and continuous independent variables may be used in a decision tree. If-Then rules can be written for each of the ending (terminal) nodes. There are several approaches to avoiding overfitting when building decision trees.
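To make the If-Then idea concrete, here is a minimal sketch (the tree, feature names, and thresholds are hypothetical, not from any particular library): a tiny hand-built decision tree represented as nested dicts, and a walk that emits one If-Then rule per terminal (leaf) node.

```python
# Hypothetical toy tree: internal nodes test a continuous feature against a
# threshold; a categorical test would compare for equality instead.
tree = {
    "feature": "age", "threshold": 30,
    "left":  {"leaf": "reject"},                       # age <= 30
    "right": {                                         # age > 30
        "feature": "income", "threshold": 50000,
        "left":  {"leaf": "review"},
        "right": {"leaf": "approve"},
    },
}

def rules(node, conditions=()):
    """Collect one 'IF ... THEN ...' rule per terminal node by walking the tree."""
    if "leaf" in node:
        cond = " AND ".join(conditions) or "TRUE"
        return [f"IF {cond} THEN {node['leaf']}"]
    f, t = node["feature"], node["threshold"]
    return (rules(node["left"],  conditions + (f"{f} <= {t}",)) +
            rules(node["right"], conditions + (f"{f} > {t}",)))

for r in rules(tree):
    print(r)
# IF age <= 30 THEN reject
# IF age > 30 AND income <= 50000 THEN review
# IF age > 30 AND income > 50000 THEN approve
```

Each rule's condition is just the conjunction of the tests along the path from the root to that leaf, which is why a large tree necessarily produces a large rule set.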
Pre-pruning stops growing the tree early, before it perfectly classifies the training set. Post-pruning allows the tree to perfectly classify the training set, and then prunes it back afterwards.
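The post-pruning side can be sketched in a few lines (same hypothetical nested-dict tree shape as above, not a production implementation): grow the full tree first, then collapse any subtree whose two branches reach the same decision, which is exactly the case where pruning cannot change the predictions.

```python
def post_prune(node):
    """Post-pruning sketch: after the tree is fully grown, collapse any
    subtree whose two children are leaves with the same decision."""
    if "leaf" in node:
        return node
    node = {**node,
            "left": post_prune(node["left"]),
            "right": post_prune(node["right"])}
    l, r = node["left"], node["right"]
    if "leaf" in l and "leaf" in r and l["leaf"] == r["leaf"]:
        return {"leaf": l["leaf"]}   # the test is redundant: both branches agree
    return node

# Hypothetical overgrown tree: the inner test on 'income' is redundant
# because both of its outcomes lead to 'approve'.
overgrown = {
    "feature": "age", "threshold": 30,
    "left": {"leaf": "reject"},
    "right": {
        "feature": "income", "threshold": 50000,
        "left":  {"leaf": "approve"},
        "right": {"leaf": "approve"},
    },
}
pruned = post_prune(overgrown)
print(pruned["right"])  # {'leaf': 'approve'}
```

Pre-pruning would instead pass a cap (say, a maximum depth) to the growing procedure so the redundant test is never created in the first place; real libraries expose both styles, for example depth limits versus cost-complexity pruning.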