I want to download the GPT-2 model and tokeniser. For open-ended generation, HuggingFace sets the padding token ID to be equal to the end-of-sentence token ID, so I configured it manually using:
import tensorflow as tf
from transformers import TFGPT2LMHeadModel, GPT2Tokenizer
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = TFGPT2LMHeadModel.from_pretrained("gpt2", pad_token_id=tokenizer.eos_token_id)
However, it gives me the following error:
TypeError: ('Keyword argument not understood:', 'pad_token_id')
I haven't been able to find a solution for this, nor do I understand why I am getting this error. Any insights would be appreciated.
Your code does not throw any error for me, so I would try installing the most recent version of transformers (pip install --upgrade transformers), if that is a viable option for you.
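If upgrading is not an option, a possible workaround (a minimal sketch, assuming the model loads fine without the extra keyword) is to load the model first and assign the padding token ID on its config afterwards:

from transformers import TFGPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")

# Load the model without the pad_token_id keyword ...
model = TFGPT2LMHeadModel.from_pretrained("gpt2")

# ... and set the padding token ID on its config afterwards.
model.config.pad_token_id = tokenizer.eos_token_id

Setting the attribute after loading avoids passing the keyword through the model constructor, which appears to be what triggers the Keras error on older versions.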