2023-01-25 08:21:21,659 - ERROR - Traceback (most recent call last):
  File "/home/xyzUser/project/queue_handler/document_queue_listner.py", line 148, in __process_and_acknowledge
    pipeline_result = self.__process_document_type(message, pipeline_input)
  File "/home/xyzUser/project/queue_handler/document_queue_listner.py", line 194, in __process_document_type
    pipeline_result = bill_parser_pipeline.process(pipeline_input)
  File "/home/xyzUser/project/main/billparser/__init__.py", line 18, in process
    bill_extractor_model = MachineGeneratedBillExtractorModel()
  File "/home/xyzUser/project/main/billparser/models/qa_model.py", line 25, in __new__
    cls.__model = TransformersReader(model_name_or_path=cls.__model_path, use_gpu=False)
  File "/home/xyzUser/project/.env/lib/python3.8/site-packages/haystack/nodes/base.py", line 48, in wrapper_exportable_to_yaml
    init_func(self, *args, **kwargs)
  File "/home/xyzUser/project/.env/lib/python3.8/site-packages/haystack/nodes/reader/transformers.py", line 93, in __init__
    self.model = pipeline(
  File "/home/xyzUser/project/.env/lib/python3.8/site-packages/transformers/pipelines/__init__.py", line 542, in pipeline
    return task_class(model=model, framework=framework, task=task, **kwargs)
  File "/home/xyzUser/project/.env/lib/python3.8/site-packages/transformers/pipelines/question_answering.py", line 125, in __init__
    super().__init__(
  File "/home/xyzUser/project/.env/lib/python3.8/site-packages/transformers/pipelines/base.py", line 691, in __init__
    self.device = device if framework == "tf" else torch.device("cpu" if device < 0 else f"cuda:{device}")
TypeError: '<' not supported between instances of 'torch.device' and 'int'
This is the error message I got after installing the dependencies from my project's requirement.txt file. I think it is related to torch, but I don't know how to fix it. I am new to Hugging Face Transformers and don't know whether this is a version issue.
This was a bug in the transformers package for a number of versions prior to v4.22.0: that particular line of code does not check whether the device argument might be a torch.device rather than an int before comparing it with an int.
Tracing through git blame, we can find that changeset 9d4a45509ab introduces the much-needed if isinstance(device, torch.device): guard (line 764 in the resulting file), which ensures this error can no longer happen. Checking the tags that contain that commit shows that releases v4.22.0 and later should include this particular fix.
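For context, the corrected device handling is shaped roughly like the following paraphrased sketch; the isinstance guard is the relevant part, and the str branch is my own illustration rather than a claim about the exact upstream code:

    import torch

    def resolve_device(device, framework="pt"):
        # Paraphrased sketch of the post-fix behaviour, not the upstream implementation.
        if framework == "tf":
            return device
        if isinstance(device, torch.device):  # the guard added by the fix
            return device
        if isinstance(device, str):           # assumption: strings like "cuda:0" are also accepted
            return torch.device(device)
        return torch.device("cpu" if device < 0 else f"cuda:{device}")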
As a refresher, to update a specific package, activate the environment and issue the following:
pip install -U transformers
Alternatively, to install a specific version, e.g.:
pip install -U transformers==4.22.0
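After upgrading, you can confirm which version is actually active in that environment:

    import transformers
    print(transformers.__version__)  # should print 4.22.0 or newer after the upgrade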