I'm testing out Prodigy and would like to use it in my own project. Ideally, I'd like an infrastructure where Prodigy is deployed on a VM, and a daily data science pipeline runs on a separate instance (local or container) that consumes the updated labels from Prodigy.
Is there a way to read labels into a Python pipeline running on a separate instance from the Prodigy instance? And can that same pipeline also push new labels back to the Prodigy instance?
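To make the question concrete, here's a minimal sketch of the kind of round-trip I have in mind. It assumes annotations are exported from the Prodigy VM as JSONL (e.g. via the `prodigy db-out` command) to some shared location, and that new labels are written back as JSONL for re-import (e.g. via `prodigy db-in`) -- the file paths and the transfer mechanism (NFS, S3, scp, etc.) are placeholders:

```python
import json


def read_labels(jsonl_path):
    """Read Prodigy-style annotations from a JSONL export (one JSON object per line)."""
    examples = []
    with open(jsonl_path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:  # skip blank lines
                examples.append(json.loads(line))
    return examples


def write_labels(examples, jsonl_path):
    """Write new annotations as JSONL, ready to re-import on the Prodigy VM."""
    with open(jsonl_path, "w", encoding="utf-8") as f:
        for eg in examples:
            f.write(json.dumps(eg) + "\n")
```

So on the Prodigy VM something like `prodigy db-out my_dataset > labels.jsonl` would produce the input, and `prodigy db-in my_dataset new_labels.jsonl` would pick up the pipeline's output -- but maybe there's a more direct way (a shared database, or an API) that avoids the file hand-off?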
Thanks a lot!