Lung Tumor Segmentation Method Based on Transformer and Attention Mechanisms


Abstract: Accurate segmentation of lung tumors plays a crucial role in tumor diagnosis and treatment. However, lung tumor segmentation is challenged by issues such as low contrast between lesions and surrounding tissues, adhesion between tumors and normal tissue, and heavy background noise. To address these issues, this study introduces a lung tumor segmentation method based on the Transformer and attention mechanisms. In the Transformer encoder stage, global and local attention mechanisms are incorporated so that the network can attend to global and local contextual information simultaneously. In the skip-connection stage, a channel-prior convolutional attention mechanism is used to enhance spatial perception of complex lesions and reduce channel redundancy, thereby improving tumor segmentation accuracy. Experimental results on the private GDPH dataset and the public LUNG1 dataset show that the proposed method outperforms eight competing segmentation methods on the Dice metric, achieving 90.96% and 88.18% on the two datasets, respectively, and can provide reliable assistance for clinical diagnosis and treatment.
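The combination of global and local attention mentioned above can be sketched as follows. This is a minimal single-head NumPy illustration of the general idea, not the paper's actual encoder: the function names, the non-overlapping window scheme, and the simple additive fusion of the two branches are all assumptions made here for clarity.

```python
import numpy as np

def softmax(z, axis=-1):
    # numerically stable softmax
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(q, k, v):
    # scaled dot-product attention, single head
    d = q.shape[-1]
    return softmax(q @ k.T / np.sqrt(d)) @ v

def global_local_attention(x, window=4):
    # x: (N, d) token sequence.
    # Global branch: every token attends to all tokens.
    g = self_attention(x, x, x)
    # Local branch: attention restricted to non-overlapping windows,
    # capturing fine-grained local context at lower cost.
    l = np.zeros_like(x)
    for s in range(0, x.shape[0], window):
        w = x[s:s + window]
        l[s:s + window] = self_attention(w, w, w)
    # Fuse the two branches (a simple sum here; the actual fusion
    # strategy in the paper is not specified by the abstract).
    return g + l
```

Note that when the window spans the whole sequence, the local branch degenerates into the global one; in practice the window is kept small so the two branches carry complementary information.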
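The channel-prior convolutional attention used in the skip connections follows a channel-first-then-spatial pattern; a simplified sketch of that pattern is given below. The random placeholder weights, the bottleneck reduction ratio, and the single 3x3 depthwise convolution are assumptions for illustration; they stand in for the module's learned parameters and multi-scale convolutions.

```python
import numpy as np

def channel_prior_attention(x, reduction=4, rng=None):
    # x: feature map of shape (C, H, W).
    # Sketch of channel-prior attention: channel attention is applied
    # first, then a depthwise spatial attention preserving per-channel
    # structure. Weights here are random placeholders, not trained.
    rng = np.random.default_rng(0) if rng is None else rng
    C, H, W = x.shape

    # Channel attention: global average pooling + bottleneck MLP + sigmoid.
    gap = x.mean(axis=(1, 2))                                  # (C,)
    w1 = rng.standard_normal((C // reduction, C)) * 0.1
    w2 = rng.standard_normal((C, C // reduction)) * 0.1
    ca = 1.0 / (1.0 + np.exp(-(w2 @ np.maximum(w1 @ gap, 0.0))))
    x = x * ca[:, None, None]                                  # reweight channels

    # Spatial attention: 3x3 depthwise convolution (one kernel per channel),
    # so the channel prior set above is preserved spatially.
    k = rng.standard_normal((C, 3, 3)) * 0.1
    pad = np.pad(x, ((0, 0), (1, 1), (1, 1)))
    conv = np.zeros_like(x)
    for c in range(C):
        for i in range(H):
            for j in range(W):
                conv[c, i, j] = np.sum(pad[c, i:i + 3, j:j + 3] * k[c])
    sa = 1.0 / (1.0 + np.exp(-conv))
    return x * sa
```

Applying this on skip-connection features leaves the tensor shape unchanged while suppressing redundant channels and sharpening spatial responses, which is the role the abstract attributes to the module.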

       
