Hello! Installing transformers without installing torch completely breaks Prodigy. Running any prodigy command fails with:
None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/usr/local/lib/python3.11/site-packages/prodigy/__main__.py", line 50, in <module>
main()
File "/usr/local/lib/python3.11/site-packages/prodigy/__main__.py", line 44, in main
controller = run_recipe(run_args)
^^^^^^^^^^^^^^^^^^^^
File "cython_src/prodigy/cli.pyx", line 111, in prodigy.cli.run_recipe
File "cython_src/prodigy/cli.pyx", line 36, in prodigy.cli.get_cli
File "/usr/local/lib/python3.11/site-packages/prodigy/recipes/__init__.py", line 1, in <module>
from . import (
File "/usr/local/lib/python3.11/site-packages/prodigy/recipes/audio.py", line 3, in <module>
from ..components.preprocess import fetch_media as fetch_media_preprocessor
File "cython_src/prodigy/components/preprocess.pyx", line 9, in init prodigy.components.preprocess
File "/usr/local/lib/python3.11/site-packages/spacy_llm/__init__.py", line 1, in <module>
from . import cache # noqa: F401
^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/spacy_llm/cache.py", line 11, in <module>
from .ty import PromptTemplateProvider, ShardingLLMTask
File "/usr/local/lib/python3.11/site-packages/spacy_llm/ty.py", line 14, in <module>
from .models import langchain
File "/usr/local/lib/python3.11/site-packages/spacy_llm/models/__init__.py", line 1, in <module>
from .hf import dolly_hf, openllama_hf, stablelm_hf
File "/usr/local/lib/python3.11/site-packages/spacy_llm/models/hf/__init__.py", line 7, in <module>
from .stablelm import stablelm_hf
File "/usr/local/lib/python3.11/site-packages/spacy_llm/models/hf/stablelm.py", line 11, in <module>
class _StopOnTokens(transformers.StoppingCriteria):
File "/usr/local/lib/python3.11/site-packages/spacy_llm/models/hf/stablelm.py", line 13, in _StopOnTokens
self, input_ids: torch.LongTensor, scores: torch.FloatTensor, **kwargs
^^^^^^^^^^^^^^^^
AttributeError: 'NoneType' object has no attribute 'LongTensor'
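For context, the crash is reproducible without Prodigy at all. transformers itself imports fine without torch (hence only the warning above), but spacy_llm apparently falls back to torch = None when the torch import fails, and the torch.* annotations in the class body of stablelm.py are still evaluated the moment the module is imported. A minimal standalone sketch of the same pattern (the fallback and class name are illustrative, not the actual spacy_llm source):

```python
# Sketch of the failure mode: torch is missing, the module sets torch = None,
# but annotations in a def statement are evaluated at class-definition time.
try:
    import torch
except ImportError:
    torch = None  # spacy_llm-style fallback when torch is absent


class _StopOnTokens:  # the real class also subclasses transformers.StoppingCriteria
    def __call__(
        self, input_ids: torch.LongTensor, scores: torch.FloatTensor, **kwargs
    ) -> bool:
        # Never reached: evaluating the annotation torch.LongTensor above raises
        # AttributeError: 'NoneType' object has no attribute 'LongTensor'
        # as soon as this module is imported.
        return False
```

Importing a module like this in an environment without torch reproduces the exact AttributeError from the traceback.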
I don't think that installing extra requirements should change how the tool works, and installing transformers should not suddenly require installing torch as well.
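For what it's worth, the module would likely import cleanly without torch if annotation evaluation were deferred. A sketch of one possible fix using PEP 563 postponed annotations (illustrative only, not a patch against the actual spacy_llm source):

```python
from __future__ import annotations  # PEP 563: annotations are kept as strings

try:
    import torch
except ImportError:
    torch = None


class _StopOnTokens:
    # With postponed evaluation, torch.LongTensor is never looked up at
    # import time, so the class definition succeeds even when torch is None.
    def __call__(
        self, input_ids: torch.LongTensor, scores: torch.FloatTensor, **kwargs
    ) -> bool:
        return False
```

Actually calling the class would still need a runtime guard, but the import itself would no longer explode, so unrelated Prodigy recipes could keep working.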
Thanks!