The docs you mention refer to this custom recipe, found in a file called transformers_tokenizers.py on GitHub. Is this the recipe you were looking for? Let me know if that's not the case. This recipe is meant for users who want to train their own Hugging Face models without spaCy in the mix.
Note that if you're interested in using spaCy with transformers, you can also just use the standard ner.manual
recipe. I wrote a bit more context about that here:
In case it's relevant: more information on the use of transformer models in spaCy can be found in this GitHub FAQ.
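
And in case a concrete example helps: with the standard ner.manual route, the workflow can look roughly like this on the command line. The dataset name, source file, and labels below are just placeholders, so adapt them to your project:

```
# Annotate entities with the built-in ner.manual recipe
prodigy ner.manual ner_demo blank:en ./examples.jsonl --label PERSON,ORG

# Export the annotations into a spaCy training corpus plus a generated config
prodigy data-to-spacy ./corpus --ner ner_demo

# Train a spaCy pipeline on the exported data; the generated config can be
# swapped for a transformer-based one if you want spaCy + transformers
python -m spacy train ./corpus/config.cfg --paths.train ./corpus/train.spacy --paths.dev ./corpus/dev.spacy
```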