
Where does Hugging Face's transformers save models?


Running the code below downloads a model. Does anyone know which folder it downloads to?

!pip install -q transformers
from transformers import pipeline
model = pipeline('fill-mask')

Solution

  • Update 2023-05-02: The cache location has changed again and is now ~/.cache/huggingface/hub/, as reported by @Victor Yan. Notably, the subfolders in the hub/ directory are now named after the cloned model path rather than after a SHA hash, as in previous versions.


    Update 2021-03-11: The cache location has changed and is now ~/.cache/huggingface/transformers, as also detailed in the answer by @victorx.
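
    To check which of these cache locations actually exists on your machine and what it contains, a small sketch like the one below works (my own addition, not from the linked post; it only lists the default paths mentioned above and assumes you have not redirected the cache via environment variables such as HF_HOME):

    from pathlib import Path

    # Default cache locations used by different transformers versions, newest first
    candidates = [
        Path.home() / ".cache/huggingface/hub",          # current default
        Path.home() / ".cache/huggingface/transformers", # intermediate versions
        Path.home() / ".cache/torch/transformers",       # older versions (e.g. v2.x)
    ]

    for cache_dir in candidates:
        if cache_dir.is_dir():
            print(f"{cache_dir}:")
            for entry in sorted(cache_dir.iterdir()):
                print(f"  {entry.name}")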


    This post should shed some light on it (plus some investigation of my own, since that post is already a bit older).

    As mentioned, the default location on a Linux system is ~/.cache/torch/transformers/ (I'm currently using transformers v2.7, but this is unlikely to change anytime soon). The cryptic folder names in this directory seemingly correspond to the Amazon S3 hashes.
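
    If you would rather not rely on the default cache at all, from_pretrained accepts a cache_dir argument that redirects the download to a directory of your choice. A minimal sketch (the path /data/hf-cache is just a placeholder):

    from transformers import AutoModelForMaskedLM, AutoTokenizer

    # Download and cache everything under a custom directory instead of the default
    tokenizer = AutoTokenizer.from_pretrained("distilroberta-base", cache_dir="/data/hf-cache")
    model = AutoModelForMaskedLM.from_pretrained("distilroberta-base", cache_dir="/data/hf-cache")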

    Also note that the pipeline tasks are just a "rerouting" to other models. To see which model a task loads by default, see here. For your specific case, pipeline('fill-mask') actually uses a distilroberta-base model.
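
    If you want to skip that implicit rerouting, you can name the model explicitly when building the pipeline. A short sketch using the same checkpoint the task resolves to (the example sentence is made up; distilroberta-base uses <mask> as its mask token):

    from transformers import pipeline

    # Equivalent to pipeline('fill-mask'), but with the underlying model spelled out
    unmasker = pipeline('fill-mask', model='distilroberta-base')
    print(unmasker("Hugging Face saves models to the <mask> directory."))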