Load labels into a pipeline running on a different compute instance

Hi guys,

I am testing out Prodigy and would like to use it in my own project. Ideally, I'd like an infrastructure where Prodigy is deployed on a VM, and a daily data science pipeline runs on a separate instance (local or containerized) using the updated labels from Prodigy.

Is there any way to read labels into a Python pipeline that is running on a separate instance from the Prodigy instance? And then also update the Prodigy instance with new labels from that same pipeline?

Thanks a lot!

It seems I've answered my own query!
By simply pointing the database setting in the ~/.prodigy/prodigy.json file at a database shared with my remote instance, I am able to share labels between the two.
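
In case it helps anyone else, here is a minimal sketch of what that could look like in ~/.prodigy/prodigy.json, assuming a shared PostgreSQL database (the host, credentials, and database name below are placeholders):

```json
{
  "db": "postgresql",
  "db_settings": {
    "postgresql": {
      "host": "db.example.com",
      "port": 5432,
      "dbname": "prodigy",
      "user": "prodigy_user",
      "password": "xxx"
    }
  }
}
```

Both the Prodigy VM and the pipeline instance point at this same database, so they see the same datasets.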

Thanks!
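
For completeness: once both instances share a database, the pipeline side can read and write labels with Prodigy's Python database API. A minimal sketch, assuming Prodigy is installed on the pipeline instance and using a hypothetical dataset name daily_labels:

```python
from prodigy import set_hashes
from prodigy.components.db import connect

# Connects to whatever database is configured in prodigy.json
db = connect()

# Read the latest annotations collected on the Prodigy instance
examples = db.get_dataset("daily_labels")
print(f"Loaded {len(examples)} annotated examples")

# Push new examples back into the dataset from the pipeline;
# set_hashes adds the _input_hash/_output_hash fields Prodigy expects
new_examples = [set_hashes({"text": "A new example to annotate"})]
db.add_examples(new_examples, datasets=["daily_labels"])
```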

Happy to hear you got it sorted out 🙂
Have fun annotating! 😉