Multi-pruning of decision trees for knowledge representation and classification

Mohammad Azad, Igor Chikalov, Shahid Hussain, Mikhail Moshkov

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution



We consider two important questions related to decision trees: first, how to construct a decision tree with a reasonable number of nodes and a reasonable number of misclassifications, and second, how to improve the prediction accuracy of decision trees when they are used as classifiers. We have created a dynamic programming based approach for bi-criteria optimization of decision trees relative to the number of nodes and the number of misclassifications. This approach allows us to construct the set of all Pareto optimal points and to derive, for each such point, decision trees with parameters corresponding to that point. Experiments on datasets from the UCI ML Repository show that, very often, we can find a suitable Pareto optimal point and derive a decision tree with a small number of nodes at the expense of a small increase in the number of misclassifications. Based on the created approach, we have proposed a multi-pruning procedure that constructs decision trees which, as classifiers, often outperform decision trees constructed by CART. © 2015 IEEE.
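The abstract's bi-criteria view can be illustrated in miniature. The sketch below is not the authors' dynamic programming algorithm; it only shows the final step of extracting the Pareto optimal points from a set of hypothetical candidate trees, each summarized as a (number of nodes, number of misclassifications) pair, where a point is Pareto optimal if no other point is at least as good in both criteria and strictly better in one.

```python
def pareto_optimal_points(points):
    """Return the Pareto front of (nodes, misclassifications) pairs,
    where both criteria are minimized."""
    front = []
    best_errors = float("inf")
    # Scan candidates in order of increasing node count (ties broken by
    # error count); a point enters the front only if it strictly improves
    # on the best error count achieved by any smaller-or-equal tree.
    for nodes, errors in sorted(set(points)):
        if errors < best_errors:
            front.append((nodes, errors))
            best_errors = errors
    return front

# Hypothetical candidate trees: (nodes, misclassifications).
candidates = [(31, 2), (15, 4), (15, 6), (7, 9), (3, 20), (31, 3), (7, 12)]
print(pareto_optimal_points(candidates))
# → [(3, 20), (7, 9), (15, 4), (31, 2)]
```

Each point on the resulting front represents a trade-off the paper exploits: moving left along the front trades a few extra misclassifications for a much smaller tree.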
Original language: English (US)
Title of host publication: 2015 3rd IAPR Asian Conference on Pattern Recognition (ACPR)
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Number of pages: 5
ISBN (Print): 9781479961009
State: Published - Jun 9 2016


