TY - CHAP

T1 - Infinite Binary Information Systems. Decision Trees of Types 4 and 5

AU - Azad, Mohammad

AU - Chikalov, Igor

AU - Hussain, Shahid

AU - Moshkov, Mikhail

AU - Zielosko, Beata

N1 - KAUST Repository Item: Exported on 2022-12-02

PY - 2022/11/19

Y1 - 2022/11/19

AB - In this chapter, based on results of exact learning, test theory, and rough set theory, we study arbitrary infinite binary information systems, each of which consists of an infinite set of elements and an infinite set of two-valued functions (attributes) defined on this set. We consider the notion of a problem over an information system, described by a finite number of attributes: for a given element, we should recognize the values of these attributes. As algorithms for problem solving, we consider decision trees of two types: (4) trees using only proper hypotheses (an analog of proper equivalence queries from exact learning), and (5) trees using both attributes and proper hypotheses. As time complexity, we study the depth of decision trees. In the worst case, as the number of attributes in the problem description grows, the minimum depth of decision trees of both types is either bounded from above by a constant, grows logarithmically, or grows linearly. Based on these results and the results obtained in Chap. 7 for attributes and arbitrary hypotheses, we divide the set of all infinite binary information systems into seven complexity classes.

UR - http://hdl.handle.net/10754/686115

UR - https://link.springer.com/10.1007/978-3-031-08585-7_8

U2 - 10.1007/978-3-031-08585-7_8

DO - 10.1007/978-3-031-08585-7_8

M3 - Chapter

SN - 9783031085840

SP - 99

EP - 112

BT - Decision Trees with Hypotheses

PB - Springer International Publishing

ER -