8 / 2025-03-17 22:35:21
Improvement of lattice Boltzmann methods with attention and convolutional neural networks
lattice Boltzmann method, attention mechanism, neural network, parallel computing
Abstract accepted
Wei Li / Jiangsu University
Hao Liu / Jiangsu University
Traditional lattice Boltzmann methods (LBM), constrained by localized collision-streaming rules, struggle to model multiscale flow interactions such as vortex merging and pressure-wave propagation. We propose a hybrid deep learning framework that combines convolutional neural networks (CNNs) with self-attention mechanisms to augment LBM. The CNN compresses the distribution functions into low-dimensional features, and the attention weights are scaled by the local vorticity magnitude. The method is validated on two-dimensional Poiseuille flow and flow around a cylinder. The results show that it converges to the steady state faster and predicts vortex-shedding frequencies more accurately than standard LBM. This vorticity-guided attention mechanism enables LBM to resolve long-range flow interactions without sacrificing parallel efficiency, showing promise for complex flow problems. Overall, this work aims to advance LBM in high-performance computing.
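The abstract does not give implementation details, but the core idea of vorticity-guided attention can be sketched as follows. This is a minimal NumPy illustration, not the authors' code: the function names (`vorticity_magnitude`, `vorticity_scaled_attention`), the dot-product attention form, and the multiplicative vorticity scaling are all assumptions made for illustration.

```python
import numpy as np

def vorticity_magnitude(u, v, dx=1.0):
    """2D vorticity magnitude |dv/dx - du/dy| via central differences.

    u, v: velocity components on a regular grid, shape (ny, nx).
    """
    dvdx = np.gradient(v, dx, axis=1)
    dudy = np.gradient(u, dx, axis=0)
    return np.abs(dvdx - dudy)

def vorticity_scaled_attention(features, omega):
    """Scaled dot-product self-attention with vorticity-weighted scores.

    features: (N, d) low-dimensional per-cell features (e.g. from a CNN
              encoder, as the abstract describes).
    omega:    (N,) local vorticity magnitude used to re-weight how much
              each cell is attended to (assumed multiplicative form).
    """
    d = features.shape[1]
    scores = features @ features.T / np.sqrt(d)   # (N, N) attention scores
    scores = scores * omega[None, :]              # emphasize high-vorticity cells
    scores -= scores.max(axis=1, keepdims=True)   # numerically stable softmax
    w = np.exp(scores)
    w /= w.sum(axis=1, keepdims=True)
    return w @ features                           # attended features, (N, d)
```

As a sanity check, a rigid-rotation field (u = -y, v = x) has uniform vorticity 2 everywhere, which `vorticity_magnitude` reproduces exactly since the field is linear.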

 
Important dates
  • Conference dates: July 3–6, 2025
  • Initial submission deadline: June 25, 2025

Sponsor: Harbin Engineering University, China
Organizer: Harbin Engineering University, China