Getting "generator already executing"
I am getting the following traceback when using the textcat recipe:

$ prodigy textcat.teach test_data en_core_web_sm --label DISCOUNT
Using 1 label(s): DISCOUNT

✨  Starting the web server at ...
Open the app in your browser and start annotating!

Exception ignored in: <generator object at 0x7f2a6787f168>
ValueError: generator already executing
Task exception was never retrieved
future: <Task finished coro=<RequestResponseCycle.run_asgi() done, defined at /opt/conda/lib/python3.6/site-packages/uvicorn/protocols/http/> exception=ValueError('generator already executing',)>
Traceback (most recent call last):
  File "/opt/conda/lib/python3.6/site-packages/uvicorn/protocols/http/", line 388, in run_asgi
    self.logger.error(msg, exc_info=exc)
  File "/opt/conda/lib/python3.6/logging/", line 1337, in error
    self._log(ERROR, msg, args, **kwargs)
  File "/opt/conda/lib/python3.6/logging/", line 1444, in _log
  File "/opt/conda/lib/python3.6/logging/", line 1453, in handle
    if (not self.disabled) and self.filter(record):
  File "/opt/conda/lib/python3.6/logging/", line 720, in filter
    result = f.filter(record)
  File "cython_src/prodigy/util.pyx", line 120, in prodigy.util.ServerErrorFilter.filter
  File "/opt/conda/lib/python3.6/site-packages/uvicorn/protocols/http/", line 385, in run_asgi
    result = await app(self.scope, self.receive, self.send)
  File "/opt/conda/lib/python3.6/site-packages/uvicorn/middleware/", line 45, in __call__
    return await, receive, send)
  File "/opt/conda/lib/python3.6/site-packages/fastapi/", line 142, in __call__
    await super().__call__(scope, receive, send)  # pragma: no cover
  File "/opt/conda/lib/python3.6/site-packages/starlette/", line 134, in __call__
    await self.error_middleware(scope, receive, send)
  File "/opt/conda/lib/python3.6/site-packages/starlette/middleware/", line 178, in __call__
    raise exc from None
  File "/opt/conda/lib/python3.6/site-packages/starlette/middleware/", line 156, in __call__
    await, receive, _send)
  File "/opt/conda/lib/python3.6/site-packages/starlette/middleware/", line 76, in __call__
    await, receive, send)
  File "/opt/conda/lib/python3.6/site-packages/starlette/", line 73, in __call__
    raise exc from None
  File "/opt/conda/lib/python3.6/site-packages/starlette/", line 62, in __call__
    await, receive, sender)
  File "/opt/conda/lib/python3.6/site-packages/starlette/", line 590, in __call__
    await route(scope, receive, send)
  File "/opt/conda/lib/python3.6/site-packages/starlette/", line 208, in __call__
    await, receive, send)
  File "/opt/conda/lib/python3.6/site-packages/starlette/", line 41, in app
    response = await func(request)
  File "/opt/conda/lib/python3.6/site-packages/fastapi/", line 129, in app
    raw_response = await run_in_threadpool(, **values)
  File "/opt/conda/lib/python3.6/site-packages/starlette/", line 25, in run_in_threadpool
    return await loop.run_in_executor(None, func, *args)
  File "/opt/conda/lib/python3.6/concurrent/futures/", line 56, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/opt/conda/lib/python3.6/site-packages/prodigy/", line 389, in get_questions
    return _shared_get_questions(None)
  File "/opt/conda/lib/python3.6/site-packages/prodigy/", line 370, in _shared_get_questions
    tasks = controller.get_questions(session_id=session_id, excludes=excludes)
  File "cython_src/prodigy/core.pyx", line 138, in prodigy.core.Controller.get_questions
  File "cython_src/prodigy/components/feeds.pyx", line 68, in prodigy.components.feeds.SharedFeed.get_questions
  File "cython_src/prodigy/components/feeds.pyx", line 73, in prodigy.components.feeds.SharedFeed.get_next_batch
  File "cython_src/prodigy/components/feeds.pyx", line 153, in prodigy.components.feeds.SessionFeed.get_session_stream
  File "cython_src/prodigy/components/feeds.pyx", line 135, in prodigy.components.feeds.SessionFeed.validate_stream
  File "/opt/conda/lib/python3.6/site-packages/toolz/", line 376, in first
    return next(iter(seq))
ValueError: generator already executing

Seems like the error comes from the /get_questions endpoint. In the browser, the UI takes a really long time to load and then I get a 500 "Internal Server Error" message. Any ideas?

Hi! I think the problem here is that your command is not specifying an input data source as the third argument, after the model:

$ prodigy textcat.teach test_data en_core_web_sm --label DISCOUNT                      # missing data source
$ prodigy textcat.teach test_data en_core_web_sm ./path/to/data.jsonl --label DISCOUNT # source as third argument
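For reference, the simplest source a text recipe can stream is newline-delimited JSON with a "text" key per line. A minimal sketch of creating such a file (the file name and example texts are just placeholders):

```python
import json

# Hypothetical examples -- each line of the JSONL source needs at least
# a "text" key for textcat.teach to stream it as an annotation task.
examples = [
    {"text": "Get 20% off your next order!"},
    {"text": "Your invoice is attached."},
]

with open("data.jsonl", "w", encoding="utf-8") as f:
    for eg in examples:
        f.write(json.dumps(eg) + "\n")

# Reading it back line by line, the way a streaming loader would
with open("data.jsonl", encoding="utf-8") as f:
    tasks = [json.loads(line) for line in f]

print(tasks[0]["text"])
```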

If you leave out the data source, Prodigy will read from standard input (so you can pipe data into a recipe on the command line). So in your case, it's waiting for data on standard input that never arrives, the stream stays blocked, and the request eventually fails with the error above.
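As an aside, the ValueError itself is plain Python behaviour: calling next() on a generator while another call is still inside its body raises "generator already executing". That's what happens when the stream is blocked waiting on stdin and a second request pulls from the same feed. A standalone sketch (the names here are illustrative, not Prodigy internals):

```python
import threading
import time

def blocking_stream():
    # Simulates a stream stuck waiting for input (e.g. reading stdin)
    while True:
        time.sleep(0.5)
        yield {"text": "..."}

stream = blocking_stream()
errors = []

def get_questions():
    # Two concurrent requests pulling from the same generator
    try:
        next(stream)
    except ValueError as err:
        errors.append(str(err))

first = threading.Thread(target=get_questions)
second = threading.Thread(target=get_questions)
first.start()
time.sleep(0.1)  # make sure the first call is already inside the generator
second.start()
first.join()
second.join()

print(errors)  # the second call fails while the first is still blocked
```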

(In the future, we want to make the standard input behaviour more explicit by requiring the source argument to be set to -, like in other command-line tools. But this would be a breaking change.)