Descriptor selection for predicting interfacial thermal resistance by machine learning methods.

Xiaojuan Tian, Mingguang Chen

Research output: Contribution to journal › Article › peer-review



Interfacial thermal resistance (ITR) is a critical property for the performance of nanostructured devices in which phonon mean free paths exceed the characteristic length scales. Affordable, accurate, and reliable prediction of ITR is essential for material selection in thermal management. In this work, state-of-the-art machine learning methods were employed to this end. Descriptor selection was conducted to build robust models and to provide guidelines for identifying the characteristics most important to the target. First, a decision tree (DT) was adopted to calculate descriptor importances, and descriptor subsets with the top-X highest importances (topX-DT, X = 20, 15, 10, 5) were chosen to build models. To verify the transferability of the descriptors picked by the decision tree, models based on kernel ridge regression, Gaussian process regression, and K-nearest neighbors were also evaluated. Afterwards, univariate selection (UV) was used to rank the descriptors. Finally, the top-5 descriptors common to DT and UV were used to build concise models. The performance of these refined models is comparable to that of models using all descriptors, indicating the high accuracy and reliability of the selection methods. Our strategy yields concise machine learning models for fast prediction of ITR in thermal management applications.
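The selection workflow described in the abstract (decision-tree importances, univariate ranking, and a concise model restricted to the descriptors both methods agree on) can be sketched with scikit-learn roughly as follows. This is a minimal illustration, not the authors' code: the dataset below is synthetic, since the actual ITR dataset and its descriptors are not part of this abstract.

```python
# Hypothetical sketch of the descriptor-selection workflow on synthetic data.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.feature_selection import SelectKBest, f_regression
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 30))  # 200 samples, 30 candidate descriptors
y = 2 * X[:, 0] + X[:, 3] - X[:, 7] + rng.normal(scale=0.1, size=200)

# Step 1: rank descriptors by decision-tree importance, keep the top 5.
dt = DecisionTreeRegressor(random_state=0).fit(X, y)
top5_dt = set(np.argsort(dt.feature_importances_)[::-1][:5])

# Step 2: rank descriptors by univariate F-scores, keep the top 5.
uv = SelectKBest(score_func=f_regression, k=5).fit(X, y)
top5_uv = set(np.argsort(uv.scores_)[::-1][:5])

# Step 3: keep only the descriptors selected by both DT and UV.
common = sorted(top5_dt & top5_uv)

# Step 4: check that a concise model on the common subset is still accurate
# (the paper also evaluates Gaussian process regression and KNN this way).
score = cross_val_score(KernelRidge(alpha=1.0), X[:, common], y, cv=5).mean()
print(common, round(score, 3))
```

With a strong synthetic signal, the intersection recovers the truly informative descriptors and the concise kernel ridge model retains nearly all of the full-feature accuracy, mirroring the paper's conclusion that the refined models perform comparably to models using all descriptors.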
Original language: English (US)
Journal: Scientific Reports
Issue number: 1
State: Published - Jan 13, 2021

Bibliographical note

KAUST Repository Item: Exported on 2021-01-21
Acknowledgements: We gratefully acknowledge the financial support from National Natural Science Foundation of China (No. 21808240).


