Quality Relevance-Aware Multi-Head Attention for Transformers in Industrial Time-Series Prediction
Keywords: multi-head attention, transformer, soft sensor modeling, industrial processes
Full paper under review
Ziyi Yang / Central South University
Prediction of key quality variables is crucial for effective process control and monitoring in the process industry. Recently, Transformer-based models have achieved remarkable success in sequence modeling owing to their powerful multi-head attention mechanism. However, the conventional treatment of multiple attention heads does not assess their relative importance in feature extraction, so non-essential information introduces redundancy. To address this issue, this paper introduces a novel Quality Relevance-Aware Multi-Head Attention for Transformer (QR-Former). First, the multi-head attention mechanism extracts features from the input sequence. Then, for each attention head, its outputs across all samples are arranged as a multivariate feature time sequence. To relate these features to the quality sequence, the similarity between each univariate feature time sequence within a head and the corresponding quality label sequence is computed, and the importance of each head is quantified by the overall similarity of its output feature dimensions. The features extracted by the different heads are then weighted and concatenated for further modeling within the Transformer structure. QR-Former is applied to quality prediction on two datasets from an industrial hydrocracking process, and its effectiveness is validated through comparisons with mainstream Transformer-based methods.
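The head-weighting idea described in the abstract can be made concrete with a short sketch. The PyTorch code below is a minimal illustration under stated assumptions, not the authors' implementation: the abstract does not specify the similarity measure or the weight normalization, so Pearson correlation is assumed as the similarity between each univariate feature time sequence and the quality label sequence, a softmax is assumed for normalizing head weights, and the function name `quality_aware_head_weights` is hypothetical.

```python
import torch

def quality_aware_head_weights(head_outputs: torch.Tensor,
                               quality: torch.Tensor) -> torch.Tensor:
    """Hypothetical sketch of quality-relevance-aware head weighting.

    head_outputs: (n_heads, n_samples, d_head) -- each head's output feature
                  vectors, stacked over the sample (time) axis so that every
                  feature dimension forms a univariate time sequence.
    quality:      (n_samples,) -- the quality label sequence over the same samples.
    Returns:      (n_heads,) -- normalized head importance weights.
    """
    # Center and normalize the quality sequence.
    q = quality - quality.mean()
    q = q / (q.norm() + 1e-8)
    # Center and normalize each univariate feature time sequence.
    f = head_outputs - head_outputs.mean(dim=1, keepdim=True)
    f = f / (f.norm(dim=1, keepdim=True) + 1e-8)
    # Pearson-style similarity of every feature dimension with the quality
    # sequence: shape (n_heads, d_head).
    sim = torch.einsum("hsd,s->hd", f, q).abs()
    # Head importance = overall (mean) similarity of its feature dimensions,
    # normalized across heads with a softmax (an assumption).
    return torch.softmax(sim.mean(dim=1), dim=0)

# Usage: scale each head's features by its weight, then concatenate.
n_heads, n_samples, d_head = 4, 256, 16
heads = torch.randn(n_heads, n_samples, d_head)
y = torch.randn(n_samples)
w = quality_aware_head_weights(heads, y)            # (n_heads,)
weighted = heads * w.view(-1, 1, 1)                 # reweight per head
concat = weighted.permute(1, 0, 2).reshape(n_samples, n_heads * d_head)
```

The weighted concatenation then replaces the plain concatenation of head outputs inside the Transformer block, so downstream layers see quality-relevant heads amplified and less relevant heads attenuated.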
Important Dates
  • Conference dates: August 22-24, 2025
  • Initial submission deadline: April 25, 2025
Host
Technical Committee on Fault Diagnosis and Safety for Technical Processes, Chinese Association of Automation
Organizers
Xinjiang University
Xinjiang Association of Automation