Yes, that should be no problem, either – streams are generators and Prodigy will only ask for one batch at a time. Just make sure you're streaming the data in a format that can be read line by line (otherwise, the whole file has to be parsed and read into memory first), and that you're not confusing Python with things like unclosed file handles.
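For example, here's a minimal sketch of what a lazy, line-by-line stream could look like – the file path and field names are just placeholders, but the idea is that only one example at a time is materialized:

```python
import json

def stream_examples(path):
    # The "with" block keeps the file handle scoped to the generator,
    # so it's closed once the generator is exhausted or collected.
    with open(path, encoding="utf8") as f:
        for line in f:
            line = line.strip()
            if line:
                yield json.loads(line)  # one task dict per line

# stream = stream_examples("my_data.jsonl")  # path is just an example
```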
Depending on how your data is stored, you could also write a custom loader: https://prodi.gy/docs/api-loaders#loaders-custom
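A custom loader is really just a generator that yields dicts in Prodigy's JSON format (a `"text"` key, plus whatever else you want to keep in `"meta"`). As a rough sketch, assuming the data lived in a SQLite table – the table and column names here are made up for illustration:

```python
import sqlite3

def custom_loader(db_path):
    # Pull rows lazily from a hypothetical "documents" table and yield
    # one annotation task dict per row.
    conn = sqlite3.connect(db_path)
    try:
        for row_id, body in conn.execute("SELECT id, body FROM documents"):
            yield {"text": body, "meta": {"row_id": row_id}}
    finally:
        conn.close()
```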
And if you're not doing any active learning etc., you can also set `"force_stream_order": true` in your recipe's `"config"`. This makes sure examples are always sent out in order, and that Prodigy keeps re-sending unanswered batches until they're answered instead of waiting for them to come back – even if you refresh the app a lot.