tensorflow2.0, torch, huggingface-transformers, huggingface-tokenizers, huggingface

Getting an error installing a package in the terminal to use Hugging Face in VS Code


I am following the installation steps from the Hugging Face website (https://huggingface.co/docs/transformers/installation) in order to start using Hugging Face in Visual Studio Code and install the Transformers library.

I was on the last step, where I had to run "pip install transformers[flax]", but I got an error. I then installed rust-lang; however, I still ended up getting an error:

Requirement already satisfied: transformers[flax] in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (4.22.2)
Requirement already satisfied: filelock in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (from transformers[flax]) (3.8.0)
Requirement already satisfied: requests in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (from transformers[flax]) (2.28.1)
Requirement already satisfied: tokenizers!=0.11.3,<0.13,>=0.11.1 in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (from transformers[flax]) (0.12.1)
Requirement already satisfied: huggingface-hub<1.0,>=0.9.0 in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (from transformers[flax]) (0.10.0)
Requirement already satisfied: packaging>=20.0 in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (from transformers[flax]) (21.3)
Requirement already satisfied: tqdm>=4.27 in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (from transformers[flax]) (4.64.1)
Requirement already satisfied: regex!=2019.12.17 in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (from transformers[flax]) (2022.9.13)
Requirement already satisfied: numpy>=1.17 in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (from transformers[flax]) (1.23.3)
Requirement already satisfied: pyyaml>=5.1 in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (from transformers[flax]) (6.0)
Collecting transformers[flax]
  Using cached transformers-4.22.1-py3-none-any.whl (4.9 MB)
  Using cached transformers-4.22.0-py3-none-any.whl (4.9 MB)
  Using cached transformers-4.21.3-py3-none-any.whl (4.7 MB)
  Using cached transformers-4.21.2-py3-none-any.whl (4.7 MB)
  Using cached transformers-4.21.1-py3-none-any.whl (4.7 MB)
  Using cached transformers-4.21.0-py3-none-any.whl (4.7 MB)
  Using cached transformers-4.20.1-py3-none-any.whl (4.4 MB)
  Using cached transformers-4.20.0-py3-none-any.whl (4.4 MB)
  Using cached transformers-4.19.4-py3-none-any.whl (4.2 MB)
  Using cached transformers-4.19.3-py3-none-any.whl (4.2 MB)
  Using cached transformers-4.19.2-py3-none-any.whl (4.2 MB)
  Using cached transformers-4.19.1-py3-none-any.whl (4.2 MB)
  Using cached transformers-4.19.0-py3-none-any.whl (4.2 MB)
  Using cached transformers-4.18.0-py3-none-any.whl (4.0 MB)
Collecting sacremoses
  Using cached sacremoses-0.0.53-py3-none-any.whl
Collecting jax!=0.3.2,>=0.2.8
  Using cached jax-0.3.21.tar.gz (1.1 MB)
  Preparing metadata (setup.py) ... done
Collecting flax>=0.3.5
  Using cached flax-0.6.1-py3-none-any.whl (185 kB)
Collecting optax>=0.0.8
  Using cached optax-0.1.3-py3-none-any.whl (145 kB)
Collecting transformers[flax]
  Using cached transformers-4.17.0-py3-none-any.whl (3.8 MB)
  Using cached transformers-4.16.2-py3-none-any.whl (3.5 MB)
  Using cached transformers-4.16.1-py3-none-any.whl (3.5 MB)
  Using cached transformers-4.16.0-py3-none-any.whl (3.5 MB)
  Using cached transformers-4.15.0-py3-none-any.whl (3.4 MB)
Collecting tokenizers<0.11,>=0.10.1
  Using cached tokenizers-0.10.3.tar.gz (212 kB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Collecting transformers[flax]
  Using cached transformers-4.14.1-py3-none-any.whl (3.4 MB)
  Using cached transformers-4.13.0-py3-none-any.whl (3.3 MB)
  Using cached transformers-4.12.5-py3-none-any.whl (3.1 MB)
  Using cached transformers-4.12.4-py3-none-any.whl (3.1 MB)
  Using cached transformers-4.12.3-py3-none-any.whl (3.1 MB)
  Using cached transformers-4.12.2-py3-none-any.whl (3.1 MB)
  Using cached transformers-4.12.1-py3-none-any.whl (3.1 MB)
  Using cached transformers-4.12.0-py3-none-any.whl (3.1 MB)
  Using cached transformers-4.11.3-py3-none-any.whl (2.9 MB)
  Using cached transformers-4.11.2-py3-none-any.whl (2.9 MB)
  Using cached transformers-4.11.1-py3-none-any.whl (2.9 MB)
  Using cached transformers-4.11.0-py3-none-any.whl (2.9 MB)
  Using cached transformers-4.10.3-py3-none-any.whl (2.8 MB)
  Using cached transformers-4.10.2-py3-none-any.whl (2.8 MB)
  Using cached transformers-4.10.1-py3-none-any.whl (2.8 MB)
  Using cached transformers-4.10.0-py3-none-any.whl (2.8 MB)
  Using cached transformers-4.9.2-py3-none-any.whl (2.6 MB)
Collecting huggingface-hub==0.0.12
  Using cached huggingface_hub-0.0.12-py3-none-any.whl (37 kB)
Collecting transformers[flax]
  Using cached transformers-4.9.1-py3-none-any.whl (2.6 MB)
  Using cached transformers-4.9.0-py3-none-any.whl (2.6 MB)
  Using cached transformers-4.8.2-py3-none-any.whl (2.5 MB)
  Using cached transformers-4.8.1-py3-none-any.whl (2.5 MB)
  Using cached transformers-4.8.0-py3-none-any.whl (2.5 MB)
  Using cached transformers-4.7.0-py3-none-any.whl (2.5 MB)
Collecting huggingface-hub==0.0.8
  Using cached huggingface_hub-0.0.8-py3-none-any.whl (34 kB)
Collecting transformers[flax]
  Using cached transformers-4.6.1-py3-none-any.whl (2.2 MB)
  Using cached transformers-4.6.0-py3-none-any.whl (2.3 MB)
  Using cached transformers-4.5.1-py3-none-any.whl (2.1 MB)
  Using cached transformers-4.5.0-py3-none-any.whl (2.1 MB)
  Using cached transformers-4.4.2-py3-none-any.whl (2.0 MB)
  Using cached transformers-4.4.1-py3-none-any.whl (2.1 MB)
  Using cached transformers-4.4.0-py3-none-any.whl (2.1 MB)
  Using cached transformers-4.3.3-py3-none-any.whl (1.9 MB)
  Using cached transformers-4.3.2-py3-none-any.whl (1.8 MB)
  Using cached transformers-4.3.1-py3-none-any.whl (1.8 MB)
  Using cached transformers-4.3.0-py3-none-any.whl (1.8 MB)
  Using cached transformers-4.2.2-py3-none-any.whl (1.8 MB)
Collecting tokenizers==0.9.4
  Using cached tokenizers-0.9.4.tar.gz (184 kB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Collecting transformers[flax]
  Using cached transformers-4.2.1-py3-none-any.whl (1.8 MB)
  Using cached transformers-4.2.0-py3-none-any.whl (1.8 MB)
  Using cached transformers-4.1.1-py3-none-any.whl (1.5 MB)
  Using cached transformers-4.1.0-py3-none-any.whl (1.5 MB)
  Using cached transformers-4.0.1-py3-none-any.whl (1.4 MB)
Collecting flax==0.2.2
  Using cached flax-0.2.2-py3-none-any.whl (148 kB)
Collecting transformers[flax]
  Using cached transformers-4.0.0-py3-none-any.whl (1.4 MB)
  Using cached transformers-3.5.1-py3-none-any.whl (1.3 MB)
Requirement already satisfied: protobuf in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (from transformers[flax]) (3.19.6)
Collecting sentencepiece==0.1.91
  Using cached sentencepiece-0.1.91.tar.gz (500 kB)
  Preparing metadata (setup.py) ... done
Collecting tokenizers==0.9.3
  Using cached tokenizers-0.9.3.tar.gz (172 kB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Collecting transformers[flax]
  Using cached transformers-3.5.0-py3-none-any.whl (1.3 MB)
  Using cached transformers-3.4.0-py3-none-any.whl (1.3 MB)
Collecting tokenizers==0.9.2
  Using cached tokenizers-0.9.2.tar.gz (170 kB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Collecting sentencepiece!=0.1.92
  Using cached sentencepiece-0.1.97-cp310-cp310-win_amd64.whl (1.1 MB)
Collecting transformers[flax]
  Using cached transformers-3.3.1-py3-none-any.whl (1.1 MB)
WARNING: transformers 3.3.1 does not provide the extra 'flax'
Collecting tokenizers==0.8.1.rc2
  Using cached tokenizers-0.8.1rc2.tar.gz (97 kB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Requirement already satisfied: colorama in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (from tqdm>=4.27->transformers[flax]) (0.4.5)
Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (from packaging>=20.0->transformers[flax]) (3.0.9)
Requirement already satisfied: idna<4,>=2.5 in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (from requests->transformers[flax]) (3.4)
Requirement already satisfied: charset-normalizer<3,>=2 in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (from requests->transformers[flax]) (2.1.1)
Requirement already satisfied: urllib3<1.27,>=1.21.1 in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (from requests->transformers[flax]) (1.26.12)
Requirement already satisfied: certifi>=2017.4.17 in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (from requests->transformers[flax]) (2022.9.24)
Collecting joblib
  Using cached joblib-1.2.0-py3-none-any.whl (297 kB)
Requirement already satisfied: six in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (from sacremoses->transformers[flax]) (1.16.0)
Collecting click
  Using cached click-8.1.3-py3-none-any.whl (96 kB)
Building wheels for collected packages: tokenizers
  Building wheel for tokenizers (pyproject.toml) ... error
  error: subprocess-exited-with-error

  × Building wheel for tokenizers (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [48 lines of output]
      C:\Users\user\AppData\Local\Temp\pip-build-env-hhrbpvks\overlay\Lib\site-packages\setuptools\dist.py:530: UserWarning: Normalizing '0.8.1.rc2' to '0.8.1rc2'
        warnings.warn(tmpl.format(**locals()))
      running bdist_wheel
      running build
      running build_py
      creating build
      creating build\lib.win-amd64-cpython-310
      creating build\lib.win-amd64-cpython-310\tokenizers
      copying tokenizers\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers
      creating build\lib.win-amd64-cpython-310\tokenizers\models
      copying tokenizers\models\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\models
      creating build\lib.win-amd64-cpython-310\tokenizers\decoders
      copying tokenizers\decoders\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\decoders
      creating build\lib.win-amd64-cpython-310\tokenizers\normalizers
      copying tokenizers\normalizers\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\normalizers
      creating build\lib.win-amd64-cpython-310\tokenizers\pre_tokenizers
      copying tokenizers\pre_tokenizers\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\pre_tokenizers
      creating build\lib.win-amd64-cpython-310\tokenizers\processors
      copying tokenizers\processors\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\processors
      creating build\lib.win-amd64-cpython-310\tokenizers\trainers
      copying tokenizers\trainers\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\trainers
      creating build\lib.win-amd64-cpython-310\tokenizers\implementations
      copying tokenizers\implementations\base_tokenizer.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations       
      copying tokenizers\implementations\bert_wordpiece.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations       
      copying tokenizers\implementations\byte_level_bpe.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations       
      copying tokenizers\implementations\char_level_bpe.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations       
      copying tokenizers\implementations\sentencepiece_bpe.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations    
      copying tokenizers\implementations\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations
      copying tokenizers\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers
      copying tokenizers\models\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\models
      copying tokenizers\decoders\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\decoders
      copying tokenizers\normalizers\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\normalizers
      copying tokenizers\pre_tokenizers\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\pre_tokenizers
      copying tokenizers\processors\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\processors
      copying tokenizers\trainers\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\trainers
      running build_ext
      running build_rust
      error: can't find Rust compiler
     
      If you are using an outdated pip version, it is possible a prebuilt wheel is available for this package but pip is not able to install from it. Installing from the wheel would avoid the need for a Rust compiler.
     
      To update pip, run:
     
          pip install --upgrade pip
     
      and then retry package installation.
     
      If you did intend to build this package from source, try installing a Rust compiler from your system package manager and ensure it is on the PATH during installation. Alternatively, rustup (available at https://rustup.rs) is the recommended way to download and update the Rust compiler toolchain.
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for tokenizers
Failed to build tokenizers
ERROR: Could not build wheels for tokenizers, which is required to install pyproject.toml-based projects

Do you know how I can successfully complete this install and use Hugging Face properly in VS Code?


Solution

The key line is in the build output itself:

    If you did intend to build this package from source, try installing a Rust compiler from your system package manager and ensure it is on the PATH during installation. Alternatively, rustup (available at https://rustup.rs) is the recommended way to download and update the Rust compiler toolchain.

That's the primary error you're having: pip fell back to building the tokenizers package from source, and that build needs a Rust compiler. You're going to need to install the rust-lang compiler, and make sure it is on your PATH during installation, in order to finish the install.
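A minimal sketch of the fix on Windows, assuming you install Rust via rustup as the message recommends: download rustup-init.exe from https://rustup.rs, run it with the default options, then open a new terminal in VS Code (so the updated PATH is picked up, which may be why your earlier rust-lang install didn't help) and verify the compiler is visible before retrying:

    rustc --version
    pip install transformers[flax]

The same output also notes that a newer pip may be able to use a prebuilt tokenizers wheel instead of compiling from source, which would avoid needing Rust at all, so it is worth upgrading pip first and retrying:

    pip install --upgrade pip
    pip install transformers[flax]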