I have a couple of questions about using the
pretrain command in spaCy.
Does the dimension of the vector model matter?
I saw this line in https://github.com/explosion/spaCy/blob/16aa092fb5cffb5ec7079951ea0c04cb96733b3e/spacy/cli/pretrain.py#L318
the Maxout layer has an output dimension of 300 (if I understand it correctly). Interestingly, for the vector model
en_vectors_web_lg, the vector size is also 300. Is this a coincidence, or a deliberate design choice?
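To illustrate why the two numbers would need to line up, here is a minimal, spaCy-free sketch (plain Python, hypothetical values): if the pretraining objective scores the model's output against a token's static vector with an L2-style loss, the prediction and the target must have the same width, so the layer's output dimension has to equal the vector dimension (300 for en_vectors_web_lg).

```python
def l2_loss(predicted, target):
    # The loss pairs components of the prediction with components of
    # the target vector, so the two lengths must match: the model's
    # output width must equal the vector table's dimension.
    assert len(predicted) == len(target), "output dim must equal vector dim"
    return sum((p - t) ** 2 for p, t in zip(predicted, target))

pred = [0.1] * 300  # model output, width 300 (illustrative values)
gold = [0.2] * 300  # static vector, dimension 300 (illustrative values)
print(round(l2_loss(pred, gold), 4))  # prints 3.0
```

This is only a sketch of the dimensional constraint, not spaCy's actual loss implementation.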
If I use
nlp = spacy.load('en_core_web_lg'), I think this should also count as using a "pretrained" model, is that right? Since
en_core_web_lg and en_vectors_web_lg have the same vectors (all from GloVe, if I understand correctly), what is the difference between using
nlp = spacy.load('en_core_web_lg') and using the weights produced by the pretrain command?