This is a continuation of my first post on the SEG Facies Classification competition.
In the previous post I left off before implementing a baseline learner to see what results we might expect.
Baseline Learner

Following Witten, Frank, and Hall's Data Mining, I use depth-1 decision stumps and shallow decision trees as baseline learners. Decision trees are easy to interpret once plotted, and they also give us a sense of feature importance, since the greedy splitting algorithm places the more informative features nearer the root of the tree.
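To make this concrete, here is a minimal sketch of fitting a decision stump and a shallow tree with scikit-learn. The data here is a synthetic stand-in, not the actual SEG well-log features; the point is only to show the baseline setup and how the stump's single split lands on the most informative feature.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for well-log features: the label depends mostly on
# column 0, somewhat on column 1, and not at all on column 2.
n = 500
X = rng.normal(size=(n, 3))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Depth-1 stump: a single threshold split on one feature.
stump = DecisionTreeClassifier(max_depth=1, random_state=0)
stump.fit(X_train, y_train)

# A slightly deeper tree as a second baseline.
tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(X_train, y_train)

print("stump accuracy:", stump.score(X_test, y_test))
print("tree accuracy:", tree.score(X_test, y_test))
print("feature importances:", tree.feature_importances_)
```

The `feature_importances_` attribute quantifies the same intuition described above: features split on earlier (and more often) receive higher importance, while the pure-noise column gets importance near zero. `sklearn.tree.plot_tree` can then render the fitted tree for inspection.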