Getting "generator already executing"

Hello,

I am getting the following traceback when using the textcat recipe:

$ prodigy textcat.teach test_data en_core_web_sm --label DISCOUNT
Using 1 label(s): DISCOUNT

✨  Starting the web server at http://0.0.0.0:8080 ...
Open the app in your browser and start annotating!

Exception ignored in: <generator object at 0x7f2a6787f168>
ValueError: generator already executing
Task exception was never retrieved
future: <Task finished coro=<RequestResponseCycle.run_asgi() done, defined at /opt/conda/lib/python3.6/site-packages/uvicorn/protocols/http/httptools_impl.py:383> exception=ValueError('generator already executing',)>
Traceback (most recent call last):
  File "/opt/conda/lib/python3.6/site-packages/uvicorn/protocols/http/httptools_impl.py", line 388, in run_asgi
    self.logger.error(msg, exc_info=exc)
  File "/opt/conda/lib/python3.6/logging/__init__.py", line 1337, in error
    self._log(ERROR, msg, args, **kwargs)
  File "/opt/conda/lib/python3.6/logging/__init__.py", line 1444, in _log
    self.handle(record)
  File "/opt/conda/lib/python3.6/logging/__init__.py", line 1453, in handle
    if (not self.disabled) and self.filter(record):
  File "/opt/conda/lib/python3.6/logging/__init__.py", line 720, in filter
    result = f.filter(record)
  File "cython_src/prodigy/util.pyx", line 120, in prodigy.util.ServerErrorFilter.filter
  File "/opt/conda/lib/python3.6/site-packages/uvicorn/protocols/http/httptools_impl.py", line 385, in run_asgi
    result = await app(self.scope, self.receive, self.send)
  File "/opt/conda/lib/python3.6/site-packages/uvicorn/middleware/proxy_headers.py", line 45, in __call__
    return await self.app(scope, receive, send)
  File "/opt/conda/lib/python3.6/site-packages/fastapi/applications.py", line 142, in __call__
    await super().__call__(scope, receive, send)  # pragma: no cover
  File "/opt/conda/lib/python3.6/site-packages/starlette/applications.py", line 134, in __call__
    await self.error_middleware(scope, receive, send)
  File "/opt/conda/lib/python3.6/site-packages/starlette/middleware/errors.py", line 178, in __call__
    raise exc from None
  File "/opt/conda/lib/python3.6/site-packages/starlette/middleware/errors.py", line 156, in __call__
    await self.app(scope, receive, _send)
  File "/opt/conda/lib/python3.6/site-packages/starlette/middleware/cors.py", line 76, in __call__
    await self.app(scope, receive, send)
  File "/opt/conda/lib/python3.6/site-packages/starlette/exceptions.py", line 73, in __call__
    raise exc from None
  File "/opt/conda/lib/python3.6/site-packages/starlette/exceptions.py", line 62, in __call__
    await self.app(scope, receive, sender)
  File "/opt/conda/lib/python3.6/site-packages/starlette/routing.py", line 590, in __call__
    await route(scope, receive, send)
  File "/opt/conda/lib/python3.6/site-packages/starlette/routing.py", line 208, in __call__
    await self.app(scope, receive, send)
  File "/opt/conda/lib/python3.6/site-packages/starlette/routing.py", line 41, in app
    response = await func(request)
  File "/opt/conda/lib/python3.6/site-packages/fastapi/routing.py", line 129, in app
    raw_response = await run_in_threadpool(dependant.call, **values)
  File "/opt/conda/lib/python3.6/site-packages/starlette/concurrency.py", line 25, in run_in_threadpool
    return await loop.run_in_executor(None, func, *args)
  File "/opt/conda/lib/python3.6/concurrent/futures/thread.py", line 56, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/opt/conda/lib/python3.6/site-packages/prodigy/app.py", line 389, in get_questions
    return _shared_get_questions(None)
  File "/opt/conda/lib/python3.6/site-packages/prodigy/app.py", line 370, in _shared_get_questions
    tasks = controller.get_questions(session_id=session_id, excludes=excludes)
  File "cython_src/prodigy/core.pyx", line 138, in prodigy.core.Controller.get_questions
  File "cython_src/prodigy/components/feeds.pyx", line 68, in prodigy.components.feeds.SharedFeed.get_questions
  File "cython_src/prodigy/components/feeds.pyx", line 73, in prodigy.components.feeds.SharedFeed.get_next_batch
  File "cython_src/prodigy/components/feeds.pyx", line 153, in prodigy.components.feeds.SessionFeed.get_session_stream
  File "cython_src/prodigy/components/feeds.pyx", line 135, in prodigy.components.feeds.SessionFeed.validate_stream
  File "/opt/conda/lib/python3.6/site-packages/toolz/itertoolz.py", line 376, in first
    return next(iter(seq))
ValueError: generator already executing

Seems like the error comes from the /get_questions endpoint. In the browser, the UI takes a really long time to load and then I get a 500 "Internal Server Error" message. Any ideas?

Hi! I think the problem here is that your command doesn't specify an input data source as the third positional argument, after the model:

Your current command (no source):
$ prodigy textcat.teach test_data en_core_web_sm --label DISCOUNT

With a data source:
$ prodigy textcat.teach test_data en_core_web_sm ./path/to/data.jsonl --label DISCOUNT
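
The source file should be newline-delimited JSON, one object per line; for textcat.teach, each object just needs a "text" key. For illustration, a minimal data.jsonl (the path and texts here are made up) could look like this:

{"text": "Get 20% off your first order!"}
{"text": "Your invoice for last month is attached."}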

If you leave out the data source, Prodigy will read from standard input instead (so you can pipe data forward to a recipe on the command line, as shown below). So basically, the recipe is waiting for something to come in on standard input, which never happens, and the request to /get_questions eventually fails with the error above.
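
If you do want to read from standard input, you could pipe the data in yourself, for example (assuming the same made-up file path as above):

$ cat ./path/to/data.jsonl | prodigy textcat.teach test_data en_core_web_sm --label DISCOUNT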

(In the future, we want to make the standard input behaviour more explicit by requiring you to set the source argument to -, like in other command-line tools. But this would be a breaking change.)