Deep Learning Framework for Oil Shale Pyrolysis State Recognition Using Bionic Electronic Nose

Bibliographic Details
Main Authors: Yuping Yuan, Xiaohui Weng, Yuheng Qiao, Xiaohu Shi, Zhiyong Chang
Format: Article
Language: English
Published: Springer 2025-07-01
Series:International Journal of Computational Intelligence Systems
Subjects:
Online Access: https://doi.org/10.1007/s44196-025-00913-5
Description
Summary: Real-time monitoring of the pyrolysis state of oil shale is crucial for precisely controlling heating temperature and duration, which can significantly reduce extraction costs. However, due to the complexity of in-situ environments, this task is highly challenging and remains one of the key technological barriers in in-situ mining. To address this issue, this paper proposes an end-to-end solution for recognizing the in-situ pyrolysis state of oil shale using an electronic nose. The proposed solution integrates a Graph Convolutional Network (GCN) and a Long Short-Term Memory (LSTM) network to capture, respectively, the spatial correlations among the sensors of the electronic nose and the temporal characteristics of their data. It is designed to handle both pyrolysis-state classification and oil shale maturity regression. The model achieves 93.87% accuracy on classifying the pyrolysis stage of oil shale, and its R² on the regression task reaches 0.93. Comparative experiments against state-of-the-art (SOTA) methods in this field demonstrate the effectiveness and advantages of the proposed framework over existing approaches.
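The summary's GCN-then-LSTM pipeline can be illustrated with a minimal NumPy sketch. This is not the authors' code: the 4-sensor adjacency graph, layer sizes, and random inputs below are all illustrative assumptions. At each time step a graph convolution mixes readings across the assumed sensor graph (spatial correlations), and a single LSTM cell then integrates the resulting sequence of embeddings (temporal characteristics); the final hidden state would feed the classification and regression heads.

```python
import numpy as np

rng = np.random.default_rng(0)

def gcn_layer(X, A, W):
    """One graph-convolution step: symmetrically normalized adjacency
    (with self-loops) times node features times a weight matrix."""
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))    # D^{-1/2} A_hat D^{-1/2}
    return np.tanh(D_inv_sqrt @ A_hat @ D_inv_sqrt @ X @ W)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, Wx, Wh, b):
    """One LSTM cell update; gates stacked as [input, forget, output, cand]."""
    z = Wx @ x + Wh @ h + b
    H = h.size
    i, f, o = sigmoid(z[:H]), sigmoid(z[H:2*H]), sigmoid(z[2*H:3*H])
    g = np.tanh(z[3*H:])
    c = f * c + i * g                          # update cell state
    h = o * np.tanh(c)                         # emit hidden state
    return h, c

# Hypothetical sensor graph: 4 e-nose sensors, edges = assumed cross-
# sensitivities between gas sensors (not taken from the paper).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)

T, n_sensors, feat, hid = 10, 4, 3, 8          # illustrative sizes
W_gcn = rng.standard_normal((feat, hid)) * 0.1
Wx = rng.standard_normal((4 * hid, n_sensors * hid)) * 0.1
Wh = rng.standard_normal((4 * hid, hid)) * 0.1
b = np.zeros(4 * hid)

h = np.zeros(hid)
c = np.zeros(hid)
for t in range(T):                             # one gas-response snapshot per step
    X_t = rng.standard_normal((n_sensors, feat))   # stand-in sensor readings
    emb = gcn_layer(X_t, A, W_gcn).reshape(-1)     # spatial mixing across sensors
    h, c = lstm_step(emb, h, c, Wx, Wh, b)         # temporal integration

print(h.shape)  # final embedding for the classification/regression heads
```

In the paper's setting, separate output heads on `h` would produce the pyrolysis-stage class probabilities and the maturity regression value; here only the shared GCN+LSTM backbone is sketched.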
ISSN:1875-6883