I am behind a firewall and have very limited access to the outside world from my server. I want to load a Hugging Face model/resource from local disk.
from sentence_transformers import SentenceTransformer
# initialize sentence transformer model
# How to load 'bert-base-nli-mean-tokens' from local disk?
model = SentenceTransformer('bert-base-nli-mean-tokens')
# create sentence embeddings
sentences = ["This is an example sentence."]
sentence_embeddings = model.encode(sentences)
I came across some comments about load_pretrained(), etc. However, I could not get the above problem sorted out. Any suggestion is welcome. Thank you in advance.
First, clone the model you want to load with git clone (make sure git-lfs is installed first, otherwise the large weight files come down as small pointer stubs instead of the actual weights).
In your example:
git clone https://huggingface.co/sentence-transformers/bert-base-nli-mean-tokens
You can of course clone it on another PC and copy the directory over, to avoid the firewall problem.
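When copying a clone around, it is easy to end up with git-lfs pointer stubs instead of the real weights. A quick heuristic check like the one below can catch that before you try to load the model; note that looks_like_full_clone is a hypothetical helper, and the file names it checks are just the usual repo layout, not a guarantee:

```python
import os

def looks_like_full_clone(repo_dir):
    """Heuristically check that a cloned model repo contains real weight
    files rather than tiny git-lfs pointer stubs (a stub is ~130 bytes)."""
    if not os.path.isfile(os.path.join(repo_dir, "config.json")):
        return False
    for name in ("pytorch_model.bin", "model.safetensors"):
        path = os.path.join(repo_dir, name)
        # a real weight file is far larger than an LFS pointer stub
        if os.path.isfile(path) and os.path.getsize(path) > 1024:
            return True
    return False
```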
After that, simply replace the model name with the path of the directory you just cloned:
from sentence_transformers import SentenceTransformer
# initialize the sentence transformer model from the local clone
model = SentenceTransformer('/path/to/cloned/git/repo')
# create sentence embeddings
sentences = ["This is an example sentence."]
sentence_embeddings = model.encode(sentences)
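On a fully firewalled box it can also help to put the Hugging Face libraries into offline mode so they never attempt a network call and only use local files. TRANSFORMERS_OFFLINE and HF_HUB_OFFLINE are documented environment variables; the only caveat is that they must be set before the libraries are imported:

```python
import os

# Force the Hugging Face stack to use only local files / the local cache;
# set these before importing transformers or sentence_transformers.
os.environ["HF_HUB_OFFLINE"] = "1"
os.environ["TRANSFORMERS_OFFLINE"] = "1"
```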
Side note: as mentioned on the model card:
This model is deprecated. Please don't use it as it produces sentence embeddings of low quality. You can find recommended sentence embedding models here: SBERT.net - Pretrained Models