Flash Attention (flash_attn) package fails to build a wheel in Google Colab with "subprocess-exited-with-error"


I am trying to run the Florence-2 model on Google Colab, which requires the flash_attn package. Previously, both the model and the package installed on Colab without any problems, but now I get the following error when trying to install flash_attn:

Building wheels for collected packages: flash_attn
  error: subprocess-exited-with-error

  × python setup.py bdist_wheel did not run successfully.
  │ exit code: 1
  ╰─> See above for output.

  note: This error originates from a subprocess, and is likely not a problem with pip.
  Building wheel for flash_attn (setup.py) ... error
  ERROR: Failed building wheel for flash_attn
  Running setup.py clean for flash_attn
Failed to build flash_attn
ERROR: Failed to build installable wheels for some pyproject.toml based projects (flash_attn).

I have already tried the following, without success:

  1. Upgrading pip, wheel, and setuptools:

!pip install --upgrade pip wheel setuptools

  2. Creating a virtual environment in Colab and installing flash_attn there.
  3. Searching for compatible versions of flash_attn and its dependencies, but I could not pinpoint any version conflict.

Since flash_attn is a critical dependency, I suspect a version-compatibility problem introduced by a recent Colab update or by one of the underlying libraries, but I cannot find any detailed version requirements or workarounds. Has anyone else run into this issue, or does anyone know a compatible version or another solution?
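To narrow down a suspected version mismatch like this, it helps to record exactly which Python, torch, and CUDA versions the flash_attn source build would compile against. Below is a minimal diagnostic sketch (the `env_report` helper is hypothetical, not part of any library; the torch import is guarded so it also runs where torch is absent):

```python
import sys

def env_report():
    """Collect the versions a flash_attn source build would compile against.

    torch may not be installed outside a Colab GPU runtime, so the
    import is guarded and missing entries are reported as None.
    """
    info = {"python": sys.version.split()[0]}
    try:
        import torch
        info["torch"] = torch.__version__
        info["cuda"] = torch.version.cuda  # CUDA version torch was built with
    except ImportError:
        info["torch"] = None
        info["cuda"] = None
    return info

print(env_report())
```

On a Colab GPU runtime this prints something like `{'python': '3.10.12', 'torch': '2.5.0+cu121', 'cuda': '12.1'}`, which is the information to compare against flash_attn's release notes.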

Thanks for any help resolving this installation problem!

python google-colaboratory flash-attn
1 Answer

According to https://github.com/huggingface/transformers/issues/34466#issuecomment-2442180500, you need to downgrade to PyTorch 2.4; with 2.5, building flash_attn from source takes hours:

!pip install torch=='2.4.1+cu121' torchvision=='0.19.1+cu121' torchaudio=='2.4.1+cu121' --index-url https://download.pytorch.org/whl/cu121
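After running the downgrade command, it is worth confirming that the pinned versions actually took effect before reinstalling flash_attn. A small sketch using the standard library (the `check_pin` helper is hypothetical, and the pins simply mirror the command above):

```python
from importlib.metadata import version, PackageNotFoundError

def check_pin(package, expected_prefix):
    """Report whether an installed package matches an expected version prefix."""
    try:
        v = version(package)
    except PackageNotFoundError:
        return f"{package}: not installed"
    status = "OK" if v.startswith(expected_prefix) else f"expected {expected_prefix}*"
    return f"{package} {v}: {status}"

# Pins taken from the answer's command (cu121 wheels for PyTorch 2.4.1).
for pkg, want in [("torch", "2.4.1"), ("torchvision", "0.19.1"), ("torchaudio", "2.4.1")]:
    print(check_pin(pkg, want))
```

If any line reports a mismatch, restart the Colab runtime (Runtime → Restart session) so the downgraded wheels are picked up before attempting `pip install flash_attn` again.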