I am trying to train a multilabel classifier with the command
prodigy train --textcat-multilabel my_data --label-stats True --base-model en_core_web_lg --lang "en" --eval-split 0.2 >> ./train_logs/mylog.log
As I understand it, this should create a default config file, after which training starts.
The problem I am having is that training does not seem to stop, as shown in the logs:
   E     #    LOSS TOK2VEC  LOSS TEXTC...  CATS_SCORE    SPEED  SCORE
 ---  ------  ------------  -------------  ----------  -------  -----
   0       0          0.00           0.61       37.34  3284.26   0.37
   3    1000          0.00         702.23       44.16  3350.89   0.44
 ...
1543   67000          0.00           0.64       53.53  3238.98   0.54
1567   68000          0.00           0.66       53.52  3310.02   0.54
Can someone please point me to the parameter I should be using to stop training after a certain number of iterations?
I have also played around with the following:
prodigy train --textcat-multilabel match_eq_3_label_prodigy_input --label-stats True --base-model en_core_web_lg --lang "en" --config test_config.cfg
ℹ Using CPU
========================= Generating Prodigy config =========================
ℹ Using config from base model
✔ Generated training config
where I edit the training parameters like so:
[training]
max_epochs = 1
max_steps = 1000
but these don't seem to have the desired effect.
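In case it's relevant, I also wondered whether the limits could be passed as dotted config overrides on the command line instead of editing the file (I believe spaCy's `spacy train` accepts overrides in this form, but I'm not certain Prodigy forwards them the same way, so this is just a sketch of what I mean):

```shell
# Sketch only: assumes prodigy train forwards dotted [training] overrides
# (e.g. --training.max_steps) the way spacy train does.
prodigy train --textcat-multilabel my_data \
  --base-model en_core_web_lg \
  --training.max_epochs 1 \
  --training.max_steps 1000
```

I'm also unsure whether `--base-model` causes the base model's config to take precedence over my `test_config.cfg`, since the log says "Using config from base model".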
Any suggestions would be most welcome.
Thank you!!