Engine downsizing and boosting have been recognized as effective strategies for improving engine efficiency. However, operating engines at high load promotes abnormal combustion events, such as pre-ignition and potential superknock. Currently, the most effective method for detecting pre-ignition uses in-cylinder pressure sensors, which offer high precision and sensitivity but also come at high cost. Given rapid advances in automotive technology such as autonomous driving, computer-aided design, and future connectivity, we propose a complementary data-driven strategy for diagnosing abnormal combustion events. To this end, a data-driven diagnostics approach for pre-ignition detection with deep neural networks is proposed. The success of convolutional neural networks (CNNs) in object detection and recurrent neural networks (RNNs) in sequence forecasting inspired us to develop these models for pre-ignition detection. For a cost-effective strategy, we use data from less expensive sensors, such as lambda and low-resolution exhaust back pressure (EBP) sensors, instead of high-resolution in-cylinder pressure measurements. The first deep learning model is combined with a commonly used dimensionality reduction tool, Principal Component Analysis (PCA). The second model eliminates this step and directly processes time-series data. Results indicate that the first model, with reduced input dimensions and a correspondingly smaller network, shows better performance in detecting pre-ignition cycles, achieving an F1 score of 79%. Overall, the proposed deep learning approach is a promising alternative for abnormal combustion diagnostics using data from low-resolution sensors.
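To illustrate the dimensionality-reduction step used by the first model, the sketch below applies PCA to windowed per-cycle sensor traces before classification. This is not the paper's implementation: the cycle count, trace length, and number of retained components are hypothetical, and the data here is synthetic noise standing in for lambda or EBP signals.

```python
import numpy as np

# Synthetic stand-in for per-cycle sensor traces: 200 engine cycles,
# 100 samples per cycle (e.g. a low-resolution EBP trace per cycle).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 100))

# PCA via SVD: center the data, decompose, project onto the leading
# components to obtain a compact input for a downstream classifier.
X_centered = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
n_components = 10  # hypothetical choice of retained components
X_reduced = X_centered @ Vt[:n_components].T  # shape: (200, 10)

# Fraction of total variance captured by the retained components.
explained = (S[:n_components] ** 2).sum() / (S ** 2).sum()
```

Reducing each cycle from 100 samples to 10 principal-component scores shrinks the classifier's input, which is consistent with the abstract's observation that the smaller PCA-based network performed better on this task.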
|Original language||English (US)|
|Journal||Proceedings of the Combustion Institute|
|State||Published - Nov 12 2020|
Bibliographical note: KAUST Repository Item: Exported on 2020-11-18
Acknowledged KAUST grant number(s): OSR-2019-CRG7-4077
Acknowledgements: This work was supported by King Abdullah University of Science and Technology (KAUST) Office of Sponsored Research under the award number OSR-2019-CRG7-4077.