【Ops】flash-attention installation error and slow compilation
The error: ImportError: /lib64/libc.so.6: version `GLIBC_2.32' not found
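This means the installed flash_attn binary was linked against a newer glibc than the system provides. A quick sanity check (ldd ships with glibc, so its version is the system glibc version):
ldd --version | head -n 1   # if this reports a version older than 2.32, a GLIBC_2.32 binary cannot load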
Fix: instead of letting pip compile flash-attn from source (which is very slow), go to
https://github.com/Dao-AILab/flash-attention/releases?page=1
and download a prebuilt wheel that matches your environment, e.g. flash_attn-2.7.4.post1+cu12torch2.6cxx11abiFALSE-cp310-cp310-linux_x86_64.whl (cu12 = CUDA 12, torch2.6 = PyTorch 2.6, cxx11abiFALSE = built without the CXX11 ABI, cp310 = Python 3.10), then install it with pip, as shown below.
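For example (the download URL below assumes the usual GitHub release URL pattern and the v2.7.4.post1 tag; substitute the tag and filename that match your environment):
# download the prebuilt wheel and install it locally
wget https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.6cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
pip install flash_attn-2.7.4.post1+cu12torch2.6cxx11abiFALSE-cp310-cp310-linux_x86_64.whl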
Run python -c "import torch;print(torch._C._GLIBCXX_USE_CXX11_ABI)" to check whether your PyTorch build uses the CXX11 ABI: pick a cxx11abiTRUE wheel if it prints True, a cxx11abiFALSE wheel if it prints False.
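To match every field encoded in the wheel filename at once, this one-liner prints the Python version, torch version, CUDA version, and ABI flag (all standard torch attributes):
python -c "import sys,torch; print(sys.version_info[:2], torch.__version__, torch.version.cuda, torch._C._GLIBCXX_USE_CXX11_ABI)"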
Reference:
https://github.com/Dao-AILab/flash-attention/issues/1708