I'm not using Conda, so the other big threads on this weren't much help. I installed PyTorch with:
pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu124
which is the command from the PyTorch website for my setup, but when I try to install mistral_inference, which depends on Torch, it fails with an error like this:
PS C:\Users\admin> pip install mistral_inference
Collecting mistral_inference
Using cached mistral_inference-1.4.0-py3-none-any.whl.metadata (14 kB)
Collecting fire>=0.6.0 (from mistral_inference)
Using cached fire-0.7.0.tar.gz (87 kB)
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... done
Collecting mistral_common>=1.4.0 (from mistral_inference)
Using cached mistral_common-1.4.4-py3-none-any.whl.metadata (4.6 kB)
Collecting pillow>=10.3.0 (from mistral_inference)
Using cached pillow-10.4.0-cp312-cp312-win_amd64.whl.metadata (9.3 kB)
Requirement already satisfied: safetensors>=0.4.0 in c:\users\admin\appdata\local\programs\python\python312\lib\site-packages (from mistral_inference) (0.4.4)
Collecting simple-parsing>=0.1.5 (from mistral_inference)
Using cached simple_parsing-0.1.6-py3-none-any.whl.metadata (7.3 kB)
Collecting xformers>=0.0.24 (from mistral_inference)
Using cached xformers-0.0.28.post1.tar.gz (7.8 MB)
Installing build dependencies ... done
Getting requirements to build wheel ... error
error: subprocess-exited-with-error
× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> [20 lines of output]
Traceback (most recent call last):
File "C:\Users\admin\AppData\Local\Programs\Python\Python312\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 353, in <module>
main()
File "C:\Users\admin\AppData\Local\Programs\Python\Python312\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 335, in main
json_out['return_val'] = hook(**hook_input['kwargs'])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\admin\AppData\Local\Programs\Python\Python312\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 118, in get_requires_for_build_wheel
return hook(config_settings)
^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\admin\AppData\Local\Temp\pip-build-env-zjzruady\overlay\Lib\site-packages\setuptools\build_meta.py", line 332, in get_requires_for_build_wheel
return self._get_build_requires(config_settings, requirements=[])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\admin\AppData\Local\Temp\pip-build-env-zjzruady\overlay\Lib\site-packages\setuptools\build_meta.py", line 302, in _get_build_requires
self.run_setup()
File "C:\Users\admin\AppData\Local\Temp\pip-build-env-zjzruady\overlay\Lib\site-packages\setuptools\build_meta.py", line 503, in run_setup
super().run_setup(setup_script=setup_script)
File "C:\Users\admin\AppData\Local\Temp\pip-build-env-zjzruady\overlay\Lib\site-packages\setuptools\build_meta.py", line 318, in run_setup
exec(code, locals())
File "<string>", line 24, in <module>
ModuleNotFoundError: No module named 'torch'
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error
× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> See above for output.
note: This error originates from a subprocess, and is likely not a problem with pip.
I already ran into a problem like this a month ago, and fixed it back then by reinstalling torch properly with the PyTorch instructions and the right index URL (plus some voodoo to replace missing files, but that was a different error). I tried the same thing again now, but pip just says the requirement is already satisfied, so I'm a bit stuck.
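Since pip says the torch requirement is already satisfied, it should be importable directly; a quick sanity check against the same Python 3.12 interpreter pip is using would be:
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"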
I solved the error by installing xformers first. pip was pulling xformers as a source tarball, and its setup.py imports torch at build time, which isn't visible inside pip's isolated build environment, so I installed xformers from the same cu124 index instead:
pip3 install -U xformers --index-url https://download.pytorch.org/whl/cu124
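That should pull xformers as a prebuilt wheel matching the cu124 torch build, so pip never runs the failing source build. To confirm which version got installed:
pip3 show xformers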
and then installing mistral_inference went through normally:
pip3 install mistral-inference
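An alternative I haven't tried here, but which is the usual workaround when a package's setup.py imports torch at build time, is to disable pip's build isolation so the build can see the torch already installed in the environment:
pip3 install mistral_inference --no-build-isolation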