Decision trees play an important role in knowledge representation because of their simplicity and self-explanatory nature. We study the optimization of decision tree parameters to find trees that are both shorter and more accurate. Since these two criteria are in conflict, we need to find a decision tree whose parameters provide a suitable trade-off between them. Hence, we design two algorithms, based on a bi-criteria optimization technique, that build a decision tree whose number of vertices is bounded by a given threshold. We then calculate the local and global misclassification rates for these trees. Our goal is to study the effect of changing the threshold in the bi-criteria optimization of decision trees. We apply our algorithms to 13 decision tables from the UCI Machine Learning Repository and recommend a suitable threshold that yields more accurate decision trees with a reasonable number of vertices.