UNet Improvements (22): A New Architecture for Medical Image Segmentation Fusing CNN and Transformer
1. Hybrid Attention Mechanism Design
1.1 Overview of the HybridAttention Module
The HybridAttention module is the core innovation of our architecture: it combines the channel attention of a CNN with the spatial attention of a Transformer:
import torch
import torch.nn as nn


class HybridAttention(nn.Module):
    """Hybrid Attention combining CNN and Transformer attention"""

    def __init__(self, channels, reduction_ratio=8, num_heads=8):
        super(HybridAttention, self).__init__()
        self.channels = channels

        # CNN attention branch (channel attention), squeeze-and-excitation style
        self.cnn_attention = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction_ratio, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction_ratio, channels, kernel_size=1),
            nn.Sigmoid(),  # gate the per-channel weights into [0, 1]
        )
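The listing above breaks off after the channel-attention branch. As a rough sketch only, and not necessarily the author's original design, the Transformer branch and the forward pass could be continued inside the same class as follows; the names norm and spatial_attention, the use of nn.MultiheadAttention, and the residual-sum fusion are all assumptions made for illustration:

        # Transformer attention branch (spatial attention) -- illustrative sketch,
        # assuming a standard multi-head self-attention over flattened spatial positions
        self.norm = nn.LayerNorm(channels)
        self.spatial_attention = nn.MultiheadAttention(
            embed_dim=channels, num_heads=num_heads, batch_first=True
        )

    def forward(self, x):
        b, c, h, w = x.shape

        # Channel attention: per-channel gating weights of shape (B, C, 1, 1)
        channel_weights = self.cnn_attention(x)
        x_channel = x * channel_weights

        # Spatial attention: flatten H*W positions into a token sequence (B, H*W, C)
        tokens = x.flatten(2).transpose(1, 2)
        tokens = self.norm(tokens)
        attn_out, _ = self.spatial_attention(tokens, tokens, tokens)
        x_spatial = attn_out.transpose(1, 2).reshape(b, c, h, w)

        # Fuse the two branches (a simple residual sum is used in this sketch)
        return x + x_channel + x_spatial

With this completion, a quick shape check such as HybridAttention(64)(torch.randn(2, 64, 32, 32)) should return a tensor of the same shape as its input; note that channels must be divisible by num_heads for nn.MultiheadAttention to initialize.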