In this research, we consider decision trees that use both standard queries, each asking for the value of a single feature, and hypotheses that specify the values of all features. Such decision trees represent knowledge and are comparable to those studied in exact learning, where membership queries and equivalence queries are used. As applications, we study the construction of decision trees for two cases: sorting a sequence that may contain equal elements, and multiple-value decision tables derived from the UCI Machine Learning Repository. For these applications, we compare several types of decision trees with hypotheses that are optimal with respect to depth. We also compare the performance of decision trees built by dynamic programming algorithms with those built by an entropy-based greedy method. We found that the greedy algorithm produces results very close to those of the dynamic programming algorithms. Since the dynamic programming algorithms are time-consuming, the greedy algorithm can readily be used instead.
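To illustrate the entropy-based greedy criterion mentioned above, the following is a minimal sketch of how the next feature to query might be chosen for a decision table: it picks the feature whose value partition minimizes the weighted entropy of the decisions in the resulting sub-tables. The table layout and function names are illustrative assumptions, not the paper's exact algorithm.

```python
import math
from collections import Counter

def entropy(rows):
    """Shannon entropy of the decisions (last column of each row)."""
    counts = Counter(row[-1] for row in rows)
    n = len(rows)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def best_feature(rows, features):
    """Greedily pick the feature whose values split the table into
    sub-tables with the smallest weighted decision entropy."""
    n = len(rows)
    def weighted_entropy(f):
        parts = {}
        for row in rows:
            parts.setdefault(row[f], []).append(row)
        return sum(len(p) / n * entropy(p) for p in parts.values())
    return min(features, key=weighted_entropy)

# Toy decision table: columns f0, f1, and the decision in the last position.
table = [
    (0, 0, 'a'),
    (0, 1, 'a'),
    (1, 0, 'b'),
    (1, 1, 'b'),
]
print(best_feature(table, [0, 1]))  # prints 0: f0 separates the decisions perfectly
```

A greedy tree builder would query the chosen feature at the root, split the table by its values, and recurse on each sub-table until every sub-table carries a single decision.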
Bibliographical note: KAUST Repository Item, exported on 2023-03-27.
Acknowledgements: The APC was funded by King Abdullah University of Science and Technology. The authors acknowledge the help and support of King Abdullah University of Science and Technology (KAUST) and Jouf University (JU) for this research.
ASJC Scopus subject areas
- Physics and Astronomy (miscellaneous)
- Statistical and Nonlinear Physics