Out-of-vocabulary new NER model

We're able to learn new vocabulary items without resizing the embedding table. This is one of the big advantages of the hash embeddings spaCy uses: every word, seen or unseen, is hashed into a fixed number of rows, so out-of-vocabulary tokens still get trainable vectors. I explain this in more detail in the thread "Can you explain how exactly HashEmbed works?", and there's a small sketch of the idea below.
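
To make that concrete, here's a minimal sketch of the hash-embedding trick in plain NumPy. This is not spaCy's actual HashEmbed implementation; the table size, vector width, and number of hash seeds are illustrative values, and Python's built-in `hash` stands in for the real hashing function.

```python
import numpy as np

N_ROWS = 5000   # fixed table size, chosen up front (illustrative)
WIDTH = 96      # vector width (illustrative)
N_SEEDS = 4     # hash each token several times to soften collisions

rng = np.random.default_rng(0)
table = rng.normal(scale=0.1, size=(N_ROWS, WIDTH)).astype("float32")

def embed(word: str) -> np.ndarray:
    """Sum the rows selected by several salted hashes of the word."""
    rows = [hash((seed, word)) % N_ROWS for seed in range(N_SEEDS)]
    return table[rows].sum(axis=0)

# An out-of-vocabulary word still maps to rows in the fixed-size table,
# so its "embedding" can be updated during training without resizing anything.
vector = embed("flibbertigibbet")
print(vector.shape)  # (96,)
```

The point of the sketch is just that the table never grows: new words land on existing rows (possibly shared with other words), which is why a new NER model can keep learning vocabulary it has never seen.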