In this paper, we consider decision trees that use both conventional queries, each based on a single attribute, and queries based on hypotheses about the values of all attributes. Such decision trees are similar to those studied in exact learning, where membership and equivalence queries are allowed. We present a greedy algorithm based on entropy for the construction of such decision trees, and we discuss the results of computer experiments on various data sets and on randomly generated Boolean functions.
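The abstract does not spell out the greedy criterion, but entropy-based greedy tree construction conventionally selects, at each node, the attribute whose split minimizes the weighted entropy of the resulting label partitions (equivalently, maximizes information gain). The sketch below illustrates only that conventional one-attribute step; it does not model the paper's hypothesis-based queries, and the function names (`entropy`, `best_attribute`) are illustrative, not taken from the paper.

```python
from collections import Counter
import math

def entropy(labels):
    """Shannon entropy (in bits) of a sequence of class labels."""
    total = len(labels)
    counts = Counter(labels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def best_attribute(rows, labels, attributes):
    """Greedy step: pick the attribute whose split minimizes the
    weighted entropy of the child partitions (max information gain)."""
    best, best_score = None, float("inf")
    for a in attributes:
        # Partition the labels by the value of attribute a.
        parts = {}
        for row, lab in zip(rows, labels):
            parts.setdefault(row[a], []).append(lab)
        # Weighted average entropy over the partitions.
        score = sum(len(p) / len(labels) * entropy(p) for p in parts.values())
        if score < best_score:
            best, best_score = a, score
    return best
```

For example, for rows `[{"x": 0, "y": 0}, {"x": 0, "y": 1}, {"x": 1, "y": 0}, {"x": 1, "y": 1}]` with labels `[0, 0, 1, 1]`, splitting on `x` separates the classes perfectly (weighted entropy 0), so the greedy criterion selects `x`.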
Original language: English (US)
State: Published - Jun 25 2021
Bibliographical note: KAUST Repository Item: Exported on 2021-06-28
Acknowledgements: Research reported in this publication was supported by King Abdullah University of Science and Technology (KAUST), including the provision of computing resources. The authors are greatly indebted to the anonymous reviewers for useful comments and suggestions.
ASJC Scopus subject areas
- Physics and Astronomy (miscellaneous)
- Statistical and Nonlinear Physics