Error when removing `prefer_uncertain`

I tried removing the `prefer_uncertain` sorter from `textcat.teach`:

    # stream = prefer_uncertain(model(stream))
    stream = model(stream)

and it gave me this error:

Exception when serving /get_questions
Traceback (most recent call last):
  File "/Users/apewu/writelab/aes/venv/lib/python3.6/site-packages/waitress/channel.py", line 338, in service
    task.service()
  File "/Users/apewu/writelab/aes/venv/lib/python3.6/site-packages/waitress/task.py", line 169, in service
    self.execute()
  File "/Users/apewu/writelab/aes/venv/lib/python3.6/site-packages/waitress/task.py", line 399, in execute
    app_iter = self.channel.server.application(env, start_response)
  File "/Users/apewu/writelab/aes/venv/lib/python3.6/site-packages/hug/api.py", line 421, in api_auto_instantiate
    return module.__hug_wsgi__(*args, **kwargs)
  File "/Users/apewu/writelab/aes/venv/lib/python3.6/site-packages/falcon/api.py", line 242, in __call__
    responder(req, resp, **params)
  File "/Users/apewu/writelab/aes/venv/lib/python3.6/site-packages/hug/interface.py", line 692, in __call__
    self.render_content(self.call_function(input_parameters), request, response, **kwargs)
  File "/Users/apewu/writelab/aes/venv/lib/python3.6/site-packages/hug/interface.py", line 633, in call_function
    return self.interface(**parameters)
  File "/Users/apewu/writelab/aes/venv/lib/python3.6/site-packages/hug/interface.py", line 99, in __call__
    return __hug_internal_self._function(*args, **kwargs)
  File "/Users/apewu/writelab/aes/venv/lib/python3.6/site-packages/prodigy/app.py", line 58, in get_questions
    tasks = controller.get_questions()
  File "cython_src/prodigy/core.pyx", line 63, in prodigy.core.Controller.get_questions
  File "cython_src/prodigy/core.pyx", line 59, in iter_tasks
  File "cython_src/prodigy/util.pyx", line 55, in prodigy.util.set_hashes
TypeError: cannot convert dictionary update sequence element #0 to a sequence

Using Python 3.6.3, Prodigy 0.4, spaCy 2.0.0a17

Ah, this should probably be more explicit in the docs. The built-in models yield (score, example) tuples, which are then used by the sorters to rank the streams. So if you want the recipe to use the unsorted stream, you’d have to pass in a generator that only yields the examples:

    stream = (eg for score, eg in model(stream))

This also means that if you’re using your own model and want to sort the stream with Prodigy, you can make it yield (score, example) tuples, too, and then pass the stream to one of the built-in sorters.
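To illustrate that contract, here's a minimal self-contained sketch (the model and scoring logic are made up for illustration; a real custom model would produce actual prediction scores): a custom "model" yields (score, example) tuples just like the built-ins, and a generator expression strips the scores back off when you want the raw, unsorted stream:

```python
def custom_model(stream):
    """Toy scorer: yields (score, example) tuples,
    mirroring the shape Prodigy's built-in models produce."""
    for eg in stream:
        score = len(eg["text"]) / 100.0  # stand-in for a real model score
        yield score, eg

stream = [{"text": "short"}, {"text": "a somewhat longer example"}]

# Unsorted: drop the scores and keep only the examples
unsorted = [eg for score, eg in custom_model(stream)]
print(unsorted)
```

Because the scored stream has the same shape as the built-in models' output, you could pass it straight to a sorter like `prefer_uncertain` instead of unwrapping it.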
