Tags: python-3.x, tensorflow2.0, huggingface-transformers

How to use huggingface T5 model to test translation task?


I see there exist two configurations of the T5 model: T5Model and TFT5WithLMHeadModel. I want to test it on translation tasks (e.g. en-de) as shown in Google's original repo. Is there a way I can use this model from Hugging Face to test translation tasks? I did not see any examples related to this in the documentation and was wondering how to provide the input and get the results.

Any help is appreciated.


Solution

  • You can use T5ForConditionalGeneration to translate your text...

    !pip install transformers

    from transformers import T5Tokenizer, T5ForConditionalGeneration

    tokenizer = T5Tokenizer.from_pretrained('t5-small')
    model = T5ForConditionalGeneration.from_pretrained('t5-small', return_dict=True)

    sentence = "My name is Azeem and I live in India"

    # T5 is a text-to-text model, so the task is given as a prefix in the input.
    # You can also use "translate English to French" or "translate English to Romanian".
    input_ids = tokenizer("translate English to German: " + sentence, return_tensors="pt").input_ids  # Batch size 1

    outputs = model.generate(input_ids)

    decoded = tokenizer.decode(outputs[0], skip_special_tokens=True)
    print(decoded)
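
    If the output looks truncated or you want higher-quality translations, generate() also accepts the usual generation arguments such as max_length, num_beams, and early_stopping; the values below are only illustrative, not tuned defaults:

    # Optional: beam search with a longer output budget (illustrative values).
    outputs = model.generate(input_ids, max_length=64, num_beams=4, early_stopping=True)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))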
    

    As of today, T5WithLMHeadModel is not supported by Transformers; use T5ForConditionalGeneration (or TFT5ForConditionalGeneration for TensorFlow) instead.
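
    Since the question mentions TensorFlow 2.0 and TFT5WithLMHeadModel, here is a minimal sketch of the same translation using the TensorFlow class TFT5ForConditionalGeneration (assuming TensorFlow 2 is installed; the 't5-small' checkpoint is just reused from above):

    from transformers import T5Tokenizer, TFT5ForConditionalGeneration

    tokenizer = T5Tokenizer.from_pretrained('t5-small')
    model = TFT5ForConditionalGeneration.from_pretrained('t5-small')

    sentence = "My name is Azeem and I live in India"
    # Same task prefix as the PyTorch example, but tensors are returned in TF format.
    input_ids = tokenizer("translate English to German: " + sentence, return_tensors="tf").input_ids

    outputs = model.generate(input_ids)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))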