December 24, 2024

How to Fix DLL Dependency Errors (OSError: [WinError 126] fbgemm.dll) - Llama 3.1 and PyTorch

While installing Llama 3.1 and PyTorch, the following error can occur:


OSError: [WinError 126] The specified module could not be found. Error loading "C:\codes\llama31\Meta-Llama-3.1-8B-Instruct\env1\Lib\site-packages\torch\lib\fbgemm.dll" or one of its dependencies.


This error occurs when we try to install PyTorch and Llama 3.1 in a Python virtual environment.

We performed the following steps:

1) Cloned the Llama 3.1 model from Hugging Face

2) Created a Python virtual environment by running “python -m venv env1”
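Steps 1 and 2 can be sketched with the commands below (a sketch: the exact repository URL and the use of git-lfs are assumptions, and the gated meta-llama repository requires accepting the license on Hugging Face first):

```shell
# Sketch of steps 1-2 (assumes git-lfs is installed and access to the
# gated meta-llama repository has been granted on Hugging Face):
git lfs install
git clone https://huggingface.co/meta-llama/Meta-Llama-3.1-8B-Instruct
cd Meta-Llama-3.1-8B-Instruct

# Create and activate the virtual environment named in the error message above:
python -m venv env1
env1\Scripts\activate
```

The activation line uses the Windows `Scripts` layout, consistent with the `env1\Lib\site-packages` path in the error message.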

3) Installed the GPU (CUDA 11.8) version of PyTorch by running:

pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118 

Instructions taken from the official website: https://pytorch.org/get-started/locally/
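After step 3, a quick sanity check (a sketch, not part of the original steps) confirms that PyTorch imports cleanly and sees the GPU. On an affected machine, the import itself is what raises the WinError 126 shown above:

```python
# Quick sanity check after installing PyTorch. On an affected Windows
# machine, this import is the point where fbgemm.dll fails to load.
import torch

print("torch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())  # True if the cu118 build sees a GPU
```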

4) Wrote a standard script to load Llama 3.1 and ran it, which produced the error below:
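A minimal version of such a script might look like the following (a sketch assuming the transformers library is installed; the local model path is taken from the error message, and the exact script in the video may differ):

```python
# Hypothetical minimal script to load the locally cloned Llama 3.1 model.
# Importing transformers pulls in torch, which is where fbgemm.dll
# fails to load on an affected machine.
from transformers import pipeline

# Local path of the cloned model, as seen in the error message above.
MODEL_DIR = r"C:\codes\llama31\Meta-Llama-3.1-8B-Instruct"

if __name__ == "__main__":
    generator = pipeline("text-generation", model=MODEL_DIR, device_map="auto")
    result = generator("Hello, my name is", max_new_tokens=32)
    print(result[0]["generated_text"])
```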

Issue: 
OSError: [WinError 126] The specified module could not be found. Error loading "C:\codes\llama31\Meta-Llama-3.1-8B-Instruct\env1\Lib\site-packages\torch\lib\fbgemm.dll" or one of its dependencies.

In this video tutorial, we explain how to fix this error.
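Before applying the fix, a small diagnostic sketch like the one below can confirm where the failure happens: it asks the OS loader to open fbgemm.dll directly, which reproduces the WinError 126 when a dependency is missing. (A commonly reported culprit is a missing Visual C++ OpenMP runtime DLL, but that is an assumption on our part, not a claim from this tutorial; the video walks through the actual fix.)

```python
# Diagnostic sketch (not the tutorial's method): load fbgemm.dll directly
# so the OS loader reports the failure, instead of importing all of torch.
import ctypes
import os
import sys

def fbgemm_path(prefix: str = sys.prefix) -> str:
    """Expected location of fbgemm.dll inside a venv's torch install (Windows layout)."""
    return os.path.join(prefix, "Lib", "site-packages", "torch", "lib", "fbgemm.dll")

def try_load(path: str) -> str:
    """Return 'ok' if the DLL loads, 'missing' if absent, else the loader's error text."""
    if not os.path.exists(path):
        return "missing"
    try:
        ctypes.CDLL(path)  # the same loader call that raises WinError 126
        return "ok"
    except OSError as exc:
        return str(exc)

if __name__ == "__main__":
    print(try_load(fbgemm_path()))
```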