textcat.correct "TypeError: Cannot read properties of undefined (reading 'reject')"

Hi,

I'm trying to use textcat.correct in the following way

prodigy textcat.correct <MY_DATASET> ./model/model-best data/raw.jsonl --label <MY_LABEL>

I get

"TypeError: Cannot read properties of undefined (reading 'reject')"

when I launch the web app.

The dataset has been labelled with a single label. The labelling was done with an earlier version of Prodigy.

A typical example looks like

{"text":"SOME TEXT","meta":{METADATA},"label":"<MY_LABEL>","_input_hash":-1524754645,"_task_hash":735813869,"_session_id":null,"_view_id":"classification","answer":"accept"}

Perhaps the label format has changed with the upgrade? I know, for example, that I have to use textcat_multilabel when training now, because for binary it expects LABEL, NOT_LABEL.

Hi! This is strange and I wonder if the UI somehow gets tripped up by the pre-defined annotations in the data :thinking: I need to try and see if I can reproduce this!

If you're using textcat.correct, you'll only see the model's predictions anyway, and the pre-set annotations should be ignored/discarded. So as a workaround, could you try removing the "label" (or everything except the "text" and "meta", really) from your input data?
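
For instance, here's a quick way to strip those keys before loading the file (just a sketch, assuming the input is data/raw.jsonl and writing the cleaned copy to a hypothetical data/raw_clean.jsonl):

import json

# Keep only the "text" and "meta" fields of each record and write a cleaned copy
# that can be passed to textcat.correct instead of the original file.
with open("data/raw.jsonl", encoding="utf8") as f_in, \
     open("data/raw_clean.jsonl", "w", encoding="utf8") as f_out:
    for line in f_in:
        record = json.loads(line)
        clean = {key: record[key] for key in ("text", "meta") if key in record}
        f_out.write(json.dumps(clean) + "\n")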

So, the JSONL file that I'm inputting doesn't have any "label" fields. The example I showed above is actually from the dataset that's already in the db. But your response made me try something else: if I use textcat.correct to create a NEW dataset it works fine, but I can't add to my existing dataset.

I had actually previously created a custom recipe for this purpose, which filtered the stream based on high/low model scores to help me find positives/negatives in imbalanced data. In that case I just made it work like textcat.teach and appended to the dataset. I thought using the exclude parameter here would allow me to do the same.
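
The filtering part of that recipe was roughly along these lines (a simplified sketch, not the exact recipe; the model path is the one from the command above and the thresholds are just illustrative):

import spacy

nlp = spacy.load("./model/model-best")

def filter_by_score(stream, label, low=0.1, high=0.9):
    # Only yield examples the model scores as very likely or very unlikely
    # for the label, to surface candidate positives/negatives in imbalanced data.
    for eg in stream:
        doc = nlp(eg["text"])
        score = doc.cats.get(label, 0.0)
        if score >= high or score <= low:
            eg["meta"] = dict(eg.get("meta", {}), score=round(score, 3))
            yield eg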

Anyway, if I want to use the built-in version, I can create a new dataset using textcat.correct, exclude the existing dataset, and then merge the two afterwards.
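
If I remember the database API correctly, the merge afterwards can be done with something like this (dataset names are placeholders):

from prodigy.components.db import connect

# Copy the examples collected in the new dataset into the existing one.
db = connect()
new_examples = db.get_dataset("textcat_correct_new")
db.add_examples(new_examples, datasets=["<MY_DATASET>"])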