Also props to @justindujardin, who implemented the new stream/feed logic.
Sorry for the unclear phrasing! But yes, what I meant was a stream generator that just keeps looping over the data and, on each pass, checks every example's hash against the hashes already present in the database. This means that examples that were previously skipped (e.g. because the user hit refresh) are queued up again later. Here's an example implementation:
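(A minimal sketch: it assumes `prodigy.set_hashes` and the database's `get_task_hashes` method are available, and uses `examples` and `dataset` as placeholders for your own stream and dataset name. Adjust the details if your version exposes these differently.)

```python
import prodigy
from prodigy.components.db import connect


def get_infinite_stream(examples, dataset):
    """Loop over the examples forever and re-check the DB hashes on each pass,
    so previously skipped examples are queued up again later."""
    db = connect()  # connect to the database using your prodigy.json settings
    while True:
        # Hashes of all tasks that are already saved in the dataset
        task_hashes = set(db.get_task_hashes(dataset))
        for eg in examples:
            # Make sure the example has _input_hash and _task_hash set
            eg = prodigy.set_hashes(eg)
            # Only send out examples that aren't in the dataset yet
            if eg["_task_hash"] not in task_hashes:
                yield eg
```

Note that `examples` needs to be a list or some other re-iterable sequence rather than a one-shot generator, since it's consumed again on every pass.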