CMX Student/Postdoc Seminar
Random forests are a popular class of algorithms for regression and classification: ensembles of randomized decision trees built from axis-aligned partitions of the feature space. One such variant, the Mondrian random forest, was proposed to handle the online setting and was the first class of random forests for which minimax rates were obtained in arbitrary dimension. However, the restriction to axis-aligned splits fails to capture dependencies between features, and oblique splits have shown improved empirical performance. By viewing the Mondrian process as a special case of the stable under iteration (STIT) process from stochastic geometry, we use the theory of stationary random tessellations to show that STIT random forests achieve minimax rates for Lipschitz and C^2 functions, and we illustrate how they can overcome the curse of dimensionality in high-dimensional feature spaces.