Finding the smallest decision tree that perfectly fits the training data is NP-complete; yet such trees remain attractive due to their interpretability and minimal footprint. Existing solutions occupy two extremes: heuristics like CART produce trees instantly but far from optimal in size, whereas exact solvers like Witty give no intermediate output. We introduce AnyDTree, an anytime algorithm that continuously maintains a 100% accurate tree and monotonically shrinks its size. It employs an expand-and-backtrack search that guarantees a complete solution at every step, combined with aggressive pruning and caching mechanisms to eliminate redundant exploration. On 70 binary-classification variants of 35 UCI datasets, AnyDTree incurs no statistically significant overhead in finding the optimal size compared with Witty (log-rank p > 0.1). On the 46 datasets with known optima, it demonstrates significantly improved anytime behaviour, measured by the confined primal integral, achieving a median score of 0.00034 and outperforming both Witty (0.00059) and CART (0.20; p < 0.001). These results position AnyDTree as a practical middle ground between heuristic and exact solutions.
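To make the expand-and-backtrack pattern concrete, the following is a minimal, hypothetical Python sketch, not the paper's implementation: a recursive search for a perfect tree within a shrinking size budget, with memoization of solved subproblems standing in for AnyDTree's caching and infeasible or non-separating branches cut off as a stand-in for its pruning. The toy XOR dataset, the function names (tree_within, anytime_search), and the budget-splitting scheme are all illustrative assumptions.

# Hedged sketch, not the authors' code: anytime expand-and-backtrack
# search for a small perfect decision tree on a toy binary dataset.
from typing import Optional, Tuple

# Toy dataset (XOR): X[i][f] in {0, 1}, y[i] in {0, 1}.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 1, 1, 0]
N_FEATURES = 2

Tree = Tuple  # ('leaf', label) or ('node', feature, left, right)
cache = {}    # (frozenset of example ids, budget) -> tree or None

def tree_within(idx: frozenset, budget: int) -> Optional[Tree]:
    """Return some perfect tree over examples `idx` using at most
    `budget` internal nodes, or None if no such tree exists."""
    labels = {y[i] for i in idx}
    if len(labels) <= 1:              # pure subset: one leaf suffices
        return ('leaf', labels.pop())
    if budget == 0:                   # impure but no splits left: prune
        return None
    key = (idx, budget)
    if key in cache:                  # reuse solved subproblems
        return cache[key]
    best = None
    for f in range(N_FEATURES):       # expand: try every binary split
        left = frozenset(i for i in idx if X[i][f] == 0)
        right = idx - left
        if not left or not right:
            continue                  # non-separating split: prune
        # Distribute the remaining budget - 1 splits over the children.
        for lb in range(budget):
            lt = tree_within(left, lb)
            if lt is None:
                continue
            rt = tree_within(right, budget - 1 - lb)
            if rt is not None:
                best = ('node', f, lt, rt)
                break                 # solution found: backtrack
        if best is not None:
            break
    cache[key] = best
    return best

def size(t: Tree) -> int:
    """Number of internal nodes."""
    return 0 if t[0] == 'leaf' else 1 + size(t[2]) + size(t[3])

def anytime_search(max_budget: int):
    """Anytime driver: every tree yielded is complete and 100% accurate,
    so interruption at any point leaves a valid incumbent."""
    all_idx = frozenset(range(len(y)))
    budget = max_budget
    while budget >= 0:
        t = tree_within(all_idx, budget)
        if t is None:                 # no smaller perfect tree exists:
            return                    # the incumbent is optimal
        yield size(t), t
        budget = size(t) - 1          # demand a strictly smaller tree

for s, t in anytime_search(max_budget=4):
    print(f"perfect tree with {s} internal nodes: {t}")

Two design points in this sketch mirror the abstract's claims: the driver only ever tightens the size budget below the current incumbent, so the reported tree size decreases monotonically, and every yielded tree is a complete, perfectly fitting classifier, so the search can be interrupted at any moment without losing validity.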