pytorch, bert-language-model, huggingface-transformers

Make sure BERT model does not load pretrained weights?


I want to make sure my BertModel does not load pre-trained weights. I am using the auto classes from Hugging Face, which load the model automatically.

My question is: how do I load a BERT model without the pretrained weights?


Solution

  • Use AutoConfig instead of AutoModel:

    from transformers import AutoConfig, AutoModel
    config = AutoConfig.from_pretrained('bert-base-uncased')
    model = AutoModel.from_config(config)
    

    This sets up the model without loading the weights: from_pretrained fetches only the configuration, and from_config initializes the model with random weights.

    See the Transformers documentation for AutoConfig and for from_config on the auto model classes.
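
A quick way to confirm that no pretrained weights were loaded is to build two models from the same config and check that their parameters differ (random initializations are independent, whereas two pretrained loads would be identical). The sketch below assumes transformers and torch are installed; it uses BertConfig directly instead of AutoConfig.from_pretrained, which also avoids downloading the config file:

```python
import torch
from transformers import BertConfig, BertModel

# BertConfig() defaults to the bert-base-uncased architecture
# (12 layers, hidden size 768, vocab size 30522).
config = BertConfig()

# Two models built from the same config get independent random
# initializations; no checkpoint is downloaded or loaded.
model_a = BertModel(config)
model_b = BertModel(config)

emb_a = model_a.embeddings.word_embeddings.weight
emb_b = model_b.embeddings.word_embeddings.weight

# Randomly initialized weights should differ between the two models.
assert not torch.equal(emb_a, emb_b)
```

If the two weight tensors were equal, that would indicate a shared checkpoint had been loaded; with random initialization they differ.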