I'm using train textcat to train my classifiers. This works fine with a standard spaCy base model, but when I try it with
en_trf_xlnetbasecased_lg I get an error:
  File "/mnt/c/repo/prodigy/recipes/dacs.py", line 407, in train
    result = train(**args)
  File "/usr/local/lib/python3.7/site-packages/prodigy/recipes/train.py", line 154, in train
    nlp.update(docs, annots, drop=dropout, losses=losses)
  File "/usr/local/lib/python3.7/site-packages/spacy_transformers/language.py", line 81, in update
    tok2vec = self.get_pipe(PIPES.tok2vec)
  File "/usr/local/lib/python3.7/site-packages/spacy/language.py", line 281, in get_pipe
    raise KeyError(Errors.E001.format(name=name, opts=self.pipe_names))
KeyError: "[E001] No component 'trf_tok2vec' found in pipeline. Available names: ['textcat']"
I think this use case is supposed to be supported, right?