Tags: python, pytorch, huggingface-transformers, gpt-3

AttributeError: module transformers has no attribute TFGPTNeoForCausalLM


I cloned this repository/documentation https://huggingface.co/EleutherAI/gpt-neo-125M

I get the error below whether I run it on Google Colab or locally. I also installed transformers with

pip install git+https://github.com/huggingface/transformers

and made sure the configuration file is named config.json.

      5 tokenizer = AutoTokenizer.from_pretrained("gpt-neo-125M/",from_tf=True)
----> 6 model = AutoModelForCausalLM.from_pretrained("gpt-neo-125M",from_tf=True)
      7 
      8 

3 frames
/usr/local/lib/python3.7/dist-packages/transformers/file_utils.py in __getattr__(self, name)

AttributeError: module transformers has no attribute TFGPTNeoForCausalLM

Full code:

from transformers import AutoTokenizer, AutoModelForCausalLM 

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neo-125M",from_tf=True)

model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-neo-125M",from_tf=True)

transformers-cli env results:

  • transformers version: 4.10.0.dev0
  • Platform: Linux-4.4.0-19041-Microsoft-x86_64-with-glibc2.29
  • Python version: 3.8.5
  • PyTorch version (GPU?): 1.9.0+cpu (False)
  • Tensorflow version (GPU?): 2.5.0 (False)
  • Flax version (CPU?/GPU?/TPU?): not installed (NA)
  • Jax version: not installed
  • JaxLib version: not installed
  • Using GPU in script?:
  • Using distributed or parallel set-up in script?:

Both Colab and my local environment have TensorFlow 2.5.0.


Solution

  • My solution was first to edit the source code and remove the line that adds "TF" in front of the class name: the correct transformers class is GPTNeoForCausalLM, but somewhere in the source code a "TF" was being prepended to it. A sketch of loading the model with the PyTorch class is shown after this list.

    Secondly, before cloning the repository you must run

     git lfs install

    This link helped me install git lfs properly: https://askubuntu.com/questions/799341/how-to-install-git-lfs-on-ubuntu-16-04
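
As a minimal sketch (my own illustration, not the exact source-code edit described above), the error can also be avoided by dropping the from_tf=True flag, since that flag is what triggers the lookup of the missing TFGPTNeoForCausalLM class; without it, the PyTorch class GPTNeoForCausalLM is used:

    from transformers import AutoTokenizer, AutoModelForCausalLM

    # Without from_tf=True, transformers resolves the PyTorch class
    # GPTNeoForCausalLM and never looks up the missing TFGPTNeoForCausalLM.
    tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neo-125M")
    model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-neo-125M")

    # Quick check that the model loads and generates text.
    inputs = tokenizer("Hello, my name is", return_tensors="pt")
    outputs = model.generate(**inputs, max_length=30)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))

This assumes the PyTorch weight file (pytorch_model.bin) was actually downloaded, which is why git lfs must be installed before cloning; otherwise the large files come down as LFS pointer stubs instead of real weights.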