How do I connect an LLM via Ollama or LangChain?

Is this possible?

Welcome to the forum @dydwgmcnl4241,

spacy-llm is integrated with LangChain. You can find more information in the spaCy docs here and here, and here you can find an end-to-end example.
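As a rough sketch of what wiring a LangChain-backed model into a spacy-llm pipeline looks like in the config file: the registered model name `langchain.OpenAI.v1`, the task `spacy.NER.v3`, and the parameter values below are assumptions based on the documented patterns, so double-check them against the docs linked above for your version.

```ini
[components.llm]
factory = "llm"

[components.llm.task]
@llm_tasks = "spacy.NER.v3"
labels = ["PERSON", "ORG"]

[components.llm.model]
@llm_models = "langchain.OpenAI.v1"
name = "gpt-3.5-turbo"
```

With a config like this, `nlp.add_pipe("llm")` (or loading the full config) routes prompts through LangChain's model wrapper rather than spacy-llm's built-in REST backends.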

As for Ollama, you can definitely write your own model wrapper. spacy-llm is designed to make writing custom models and tasks as easy as possible. The documentation can be found here.
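To give an idea of what such a wrapper could look like: a sketch that calls Ollama's local REST endpoint (`/api/generate`) and registers the result as a custom model. The registry name `my.Ollama.v1`, the default model `llama2`, and the exact prompt/response types spacy-llm passes to the callable are assumptions here, so check the custom-model docs for the signature your version expects.

```python
# Sketch of a custom spacy-llm model wrapper backed by a local Ollama server.
# Assumptions: Ollama is running on its default port (11434), and the
# registry name "my.Ollama.v1" is a hypothetical choice.
from typing import Iterable
import json
import urllib.request


def build_request(prompt: str, model: str = "llama2") -> dict:
    # Payload for Ollama's /api/generate endpoint, non-streaming.
    return {"model": model, "prompt": prompt, "stream": False}


def ollama_generate(
    prompt: str,
    model: str = "llama2",
    url: str = "http://localhost:11434/api/generate",
) -> str:
    # POST the prompt and return the "response" field of the JSON reply.
    req = urllib.request.Request(
        url,
        data=json.dumps(build_request(prompt, model)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


try:
    from spacy_llm.registry import registry

    @registry.llm_models("my.Ollama.v1")  # hypothetical registry name
    def ollama_model(model: str = "llama2"):
        # spacy-llm calls the returned function with a batch of prompts
        # and expects one response string per prompt.
        def _generate(prompts: Iterable[str]) -> Iterable[str]:
            return [ollama_generate(p, model) for p in prompts]

        return _generate
except ImportError:
    pass  # spacy-llm not installed; the HTTP helper above still works
```

Once registered, you would reference `my.Ollama.v1` under `[components.llm.model]` in your pipeline config, just like a built-in model.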