I have trained the German model for ORG entities. The training curve (see below) suggests that more data would not improve the model. What other options do I have to improve the model?
Dropout: 0.2, batch size: 32, iterations: 10, samples: 4

    %    RIGHT   WRONG   ACCURACY
  25%      156      28       0.85   +0.85
  50%      160      24       0.87   +0.02
  75%      152      32       0.83   -0.04
 100%      150      34       0.82   -0.01
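For reference, the ACCURACY column is just RIGHT / (RIGHT + WRONG), and the last column is the change relative to the previous step, so the curve can be recomputed from the raw counts. A quick sketch using the numbers from the table above:

```python
# Recompute the train-curve table from the RIGHT/WRONG counts above.
rows = [(25, 156, 28), (50, 160, 24), (75, 152, 32), (100, 150, 34)]

prev = 0.0
for pct, right, wrong in rows:
    acc = round(right / (right + wrong), 2)   # e.g. 156 / 184 -> 0.85
    delta = round(acc - prev, 2)              # change vs. the previous step
    print(f"{pct:>4}%  {right:>5}  {wrong:>5}  {acc:.2f}  {delta:+.2f}")
    prev = acc
```

This reproduces the table exactly, including the negative deltas at 75% and 100%.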
Are you using an older version of Prodigy, by any chance? A batch size of 32 is probably too high; I think we changed this default in more recent versions. Either way, try setting a lower batch size to hopefully get the model to fit better. 4 would be a good value to try.
Another thing to note is that the training curve can be expected to follow a sigmoid-shaped curve. When there's very little data, accuracy can stay flat, as the model fails to generalise usefully. Then, as more data is added, the curve enters a high-growth phase before tapering off. So sometimes, even when the training curve doesn't show improvement, you do still need to add more data.
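To illustrate the sigmoid shape, here's a toy sketch; the baseline, plateau, midpoint and steepness values are made up for illustration, not taken from your run:

```python
import math

def expected_accuracy(n_examples, baseline=0.5, plateau=0.9,
                      midpoint=500, steepness=0.01):
    """Toy logistic learning curve: flat start, steep middle, then a plateau."""
    gain = plateau - baseline
    return baseline + gain / (1 + math.exp(-steepness * (n_examples - midpoint)))

for n in [0, 250, 500, 750, 1000]:
    print(n, round(expected_accuracy(n), 3))
# At the midpoint (n=500) the curve is exactly halfway: 0.70.
```

The point is that a flat-looking segment at low data volumes can sit just before the high-growth phase, which is why a flat curve alone doesn't always mean "stop annotating".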
Try changing the hyper-parameters, though, especially the batch size. I think that should help.
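One common approach is a compounding batch size that starts small and grows over training, which is roughly what spaCy's `compounding` utility does. Here's a self-contained re-implementation for illustration; it's not Prodigy's actual internals:

```python
def compounding(start, stop, compound):
    """Yield an infinite series: start, start * compound, ... capped at stop."""
    curr = start
    while True:
        yield min(curr, stop)
        curr *= compound

# Start with small batches of 4 and let them grow slowly towards 32.
batch_sizes = compounding(4.0, 32.0, 1.001)
print([next(batch_sizes) for _ in range(3)])  # starts at 4.0, grows very slowly
```

Starting with small batches tends to help the model fit noisy, small datasets, while larger batches later in training smooth out the updates.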
Also, another thing to check if you haven't done so already: are there any conflicting annotations in your data, or anything else that's inconsistent and might trip up the model? These issues can be quite subtle. Essentially, the model will try to make sense of something that doesn't actually make sense, and you'll end up with decreasing accuracy.
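A quick way to surface candidate conflicts is to group the annotated spans by their surface text and compare labels. This is a minimal sketch assuming Prodigy-style records with "text" and "spans" keys; the example data and the PER label clash are hypothetical, not from your dataset:

```python
from collections import defaultdict

def find_conflicts(examples):
    """Map each annotated span's surface text to the set of labels it received."""
    labels_by_span = defaultdict(set)
    for eg in examples:
        for span in eg.get("spans", []):
            surface = eg["text"][span["start"]:span["end"]]
            labels_by_span[surface].add(span["label"])
    # Conflicts: the same surface string annotated with different labels.
    return {text: labels for text, labels in labels_by_span.items() if len(labels) > 1}

# Hypothetical example data:
examples = [
    {"text": "ACME GmbH stellt ein.",
     "spans": [{"start": 0, "end": 9, "label": "ORG"}]},
    {"text": "Wir arbeiten mit ACME GmbH.",
     "spans": [{"start": 17, "end": 26, "label": "PER"}]},
]
print(find_conflicts(examples))  # flags 'ACME GmbH' with both ORG and PER
```

Of course, the same string can legitimately receive different labels in different contexts, so treat the output as candidates to review, not definite errors.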