I'm working on a project that uses sentence-transformers 2.2.2. If I create the virtual environment with
python -m venv venv
source venv/bin/activate
pip install -r requirements.txt
the application runs without any problems.
I'm converting the project to Poetry, but when I create the virtual environment with Poetry and then run the application, I get this error:
Traceback (most recent call last):
  File "app/main.py", line 12, in <module>
    from app.src.common.utils import get_categories, get_models
  File "/home/mdonato/Documents-backup/jobs-data-analysis/app/src/common/utils.py", line 10, in <module>
    from sentence_transformers import SentenceTransformer
  File "/home/mdonato/Documents-backup/jobs-data-analysis/.venv/lib/python3.8/site-packages/sentence_transformers/__init__.py", line 3, in <module>
    from .datasets import SentencesDataset, ParallelSentencesDataset
  File "/home/mdonato/Documents-backup/jobs-data-analysis/.venv/lib/python3.8/site-packages/sentence_transformers/datasets/__init__.py", line 1, in <module>
    from .DenoisingAutoEncoderDataset import DenoisingAutoEncoderDataset
  File "/home/mdonato/Documents-backup/jobs-data-analysis/.venv/lib/python3.8/site-packages/sentence_transformers/datasets/DenoisingAutoEncoderDataset.py", line 1, in <module>
    from torch.utils.data import Dataset
  File "/home/mdonato/Documents-backup/jobs-data-analysis/.venv/lib/python3.8/site-packages/torch/__init__.py", line 229, in <module>
    from torch._C import *  # noqa: F403
ImportError: libcupti.so.11.7: cannot open shared object file: No such file or directory
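For reference, the Poetry environment is created with the standard flow; this is a rough sketch rather than the exact commands (the in-project .venv seen in the traceback comes from Poetry's virtualenvs.in-project setting):

# Create the in-project environment from pyproject.toml
poetry config virtualenvs.in-project true
poetry add "sentence-transformers==2.2.2"
poetry install
poetry run python app/main.py   # fails with the ImportError above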
I followed this guide for the installation of the NVIDIA driver, CUDA and cuDNN. I also followed the tar file installation, so I tried installing both ways, but I'm still getting the same error.
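A quick way to check whether the loader can actually find the library, assuming the default CUDA 11.7 install location under /usr/local/cuda-11.7 (the path is an assumption, adjust it to your setup):

# Look for the CUPTI library under the CUDA install
find /usr/local/cuda-11.7 -name "libcupti.so*"
# CUPTI usually lives in extras/CUPTI/lib64, which is not on the loader path by default
export LD_LIBRARY_PATH=/usr/local/cuda-11.7/extras/CUPTI/lib64:$LD_LIBRARY_PATH
# Re-test the import inside the Poetry environment
poetry run python -c "import torch"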
There is a similar problem here that is solved by pinning torch<2.0.1. The same solution worked for me while building an inference Docker image from the NVIDIA base image.
Your package has no upper limit on the torch version, so you are probably installing 2.0.1, which also gave me the error you are seeing.
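Applying the same pin in a Poetry project is a one-liner; a minimal sketch, assuming the dependency is managed through pyproject.toml:

# Constrain torch below 2.0.1 and update the lock file in one step
poetry add "torch<2.0.1"

Equivalently, set torch = "<2.0.1" under [tool.poetry.dependencies] in pyproject.toml and run poetry lock followed by poetry install.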