I want to make sure my BertModel does not load pre-trained weights. I am using the Hugging Face auto classes, which load the model automatically.
My question is: how do I load a BERT model without the pretrained weights?
Use AutoConfig together with AutoModel.from_config instead of AutoModel.from_pretrained:
from transformers import AutoConfig, AutoModel

# Fetch only the configuration (architecture hyperparameters), not the weights
config = AutoConfig.from_pretrained('bert-base-uncased')

# Build the model from the config; its parameters are randomly initialized
model = AutoModel.from_config(config)
This sets up the model architecture without loading the pretrained weights; the parameters are randomly initialized instead.
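If you already know you want BERT specifically rather than resolving the architecture through the auto classes, a roughly equivalent sketch is to instantiate BertModel directly from a BertConfig. This is just an illustration assuming the default BertConfig (which matches the bert-base architecture) is what you want; adjust the config arguments for a different size:

from transformers import BertConfig, BertModel

# Default config corresponds to the bert-base architecture (no download needed)
config = BertConfig()

# Instantiate BERT with randomly initialized weights
model = BertModel(config)

Either way, no checkpoint weights are loaded, so the model must be trained before it produces useful representations.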