Calculate gradients

I'm interested in gathering the loss and the gradient for NER models. The documentation mentions a handy function for this (get_loss), but I suspect it may be out of date.

I have inspected the code and managed to gather the loss, but the gradient remains elusive. I considered tracking the change in model parameters as an indirect way of calculating the gradient, but those were also hard to find.

Any help you could provide would be greatly appreciated! Sincere thanks!

What are you trying to do exactly? If you want the gradient for all the parameters in the model, you can do the following somewhat sneaky thing:

grads = {}

def get_grads(weights, gradients, key=None):
    # thinc calls this once per parameter group; key is the layer's ID.
    grads[key] = (weights, gradients)

ner.update(docs, golds, sgd=get_grads)

During the update() method, the model calls into the function you pass via the sgd keyword. It assumes this function is an optimizer that performs the weight updates, but it can be any callable, so you can use it to simply collect the gradients for your own reference. The gradients will be keyed by the ID of the layer, which is pretty opaque: you'll probably need to walk the model to find the IDs of the layers you're interested in. If you explain a little bit more about the use-case I can probably provide more help.
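For reference, here's a minimal sketch of walking the model to map layer IDs back to layer names, so the keys in grads become readable. It assumes the thinc v6/v7 internals current at the time of writing, where each Model exposes an id, a name, and a _layers list of sublayers; treat it as illustrative rather than a stable API:

def walk(model):
    # Yield the model itself, then recurse into its sublayers.
    yield model
    for layer in getattr(model, "_layers", []):
        yield from walk(layer)

id_to_name = {layer.id: layer.name for layer in walk(ner.model)}

# Cross-reference the collected gradients with the layer names.
for key, (weights, gradients) in grads.items():
    print(key, id_to_name.get(key, "<unknown>"), gradients.shape)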

Of course, this all refers to internals, which are subject to change. It's been fairly stable for a while, but there are details of it I'm not that happy with, so it'll probably be different in 6-12 months.

I'm interested in the gradient to understand the model and thinc.

Sincere thanks for the quick help.

Your suggestion has helped a great deal! For future reference, I will note that this call does have side effects on the thinc objects; these side effects can result in different values for the gradients across repeated calculations.
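One way to guard against part of that, at least for the collected arrays themselves, is to snapshot them at collection time. This is a hypothetical variant of the collector above, assuming the buffers passed to the callback may later be mutated in place by thinc:

import numpy

grads = {}

def get_grads(weights, gradients, key=None):
    # Copy the buffers so later in-place updates don't alter our snapshot.
    grads[key] = (numpy.array(weights, copy=True),
                  numpy.array(gradients, copy=True))

ner.update(docs, golds, sgd=get_grads)

Note that this only protects the collected snapshots; if the model accumulates gradient state internally between calls (which would explain the behaviour above), repeated updates can still produce different values.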