Berkeley IEOR Undergrad Wins Best Student Paper Award at KDIR 2020
IEOR undergraduate Jonathan Bodine (Class of ’21) and Professor Dorit Hochbaum won the Best Student Paper Award at the 12th International Conference on Knowledge Discovery and Information Retrieval (KDIR 2020). The award-winning paper, authored by Bodine and Hochbaum, is titled “The Max-Cut Decision Tree: Improving on the Accuracy and Running Time of Decision Trees.” It proposes modifications to standard decision tree construction that advance the method’s capabilities on difficult classification tasks. These modifications include maximizing the distance between pairs of observations belonging to separate classes, and selecting the decision feature from a linear combination of the input features.
In the summer of 2020, Jonathan worked as a student researcher with Prof. Hochbaum; he has also served as the Undergraduate Student Instructor for Prof. Hochbaum’s course, IEOR 162, since Fall 2019.
Berkeley IEOR congratulates Jonathan and Prof. Hochbaum on this tremendous achievement! Below is the abstract of the award-winning submission.
Decision trees are a widely used method for classification, both alone and as the building blocks of multiple different ensemble learning methods. The Max-Cut decision tree involves novel modifications to a standard baseline classification decision tree, namely CART Gini. One modification is an alternative splitting metric, Maximum Cut, based on maximizing the distance between all pairs of observations that belong to separate classes and fall on separate sides of the threshold value. The other modification is to select the decision feature from a linear combination of the input features, constructed using Principal Component Analysis (PCA) locally at each node. Our experiments show that this node-based, localized PCA, combined with the novel splitting metric, can dramatically improve classification while also significantly decreasing computational time compared to the baseline decision tree. Moreover, our results are most significant when evaluated on data sets with higher dimensions or more classes. For the example data set CIFAR-100, the modifications enabled a 49% improvement in accuracy relative to CART Gini, while reducing CPU time by 94% for comparable implementations. These modifications will dramatically advance the capabilities of decision trees for difficult classification tasks.

Bodine, J. and Hochbaum, D. (2020). The Max-Cut Decision Tree: Improving on the Accuracy and Running Time of Decision Trees. In Proceedings of the 12th International Joint Conference on Knowledge Discovery, Knowledge Engineering and Knowledge Management (IC3K 2020) – Volume 1: KDIR, pages 59–70. ISBN: 978-989-758-474-9.
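To give a flavor of the two ideas the abstract describes, here is a minimal, illustrative Python sketch of a single node split: the node’s data is projected onto its first principal component (the local PCA step), and candidate thresholds are scored by summing pairwise distances between differently labeled observations that land on opposite sides of the cut (a Max-Cut-style criterion). This is not the authors’ implementation; the function name, the naive O(n²)-per-threshold pairwise loop, and the restriction to the first principal component are simplifying assumptions for exposition (the paper’s contribution includes computing this far more efficiently).

```python
import numpy as np

def max_cut_split(X, y):
    """Illustrative sketch (not the paper's implementation) of a
    Max-Cut-style split at one tree node.

    Projects the node's observations onto their first principal
    component (local PCA), then scores each candidate threshold by the
    summed distance between pairs of differently labeled observations
    on opposite sides of the threshold. Returns the projection
    direction and the best threshold.
    """
    # Local PCA: first principal component of the node's centered data.
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    w = Vt[0]          # projection direction (linear combination of features)
    z = Xc @ w         # 1-D projected feature used for splitting

    best_score, best_t = -np.inf, None
    for t in np.unique(z)[:-1]:          # candidate thresholds
        left, right = z <= t, z > t
        # Max-Cut-style score: sum |z_i - z_j| over pairs with
        # different class labels on opposite sides of the cut.
        # (Naive O(n^2) loop; the paper computes this efficiently.)
        score = sum(
            abs(zi - zj)
            for zi, yi in zip(z[left], y[left])
            for zj, yj in zip(z[right], y[right])
            if yi != yj
        )
        if score > best_score:
            best_score, best_t = score, t
    return w, best_t
```

In a full tree, this split would be applied recursively, recomputing the PCA projection from scratch on the subset of observations reaching each node, which is what makes the PCA "localized."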