Issue with Prompt Tournament in Documented Haiku Example

:bug: Bug description:
The ab.llm.tournament recipe expects a string in doc._.response but receives a list.
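A minimal illustration of the failure, detached from Prodigy (the list contents here are made up):

```python
# Newer spacy-llm stores a list of strings on the Doc extension,
# but the recipe's generate() still calls .strip() on it directly.
response = ["An example haiku response"]  # what doc._.response now holds

try:
    response.strip()  # works on str, fails on list
except AttributeError as err:
    print(err)  # 'list' object has no attribute 'strip'
```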

(.venv) holme2@SEJGQV7HJ0 prodigy % dotenv run -- prodigy ab.llm.tournament test inputs.jsonl ./prompts ./configs/config1.cfg
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File ".venv/lib/python3.12/site-packages/prodigy/", line 50, in <module>
  File ".venv/lib/python3.12/site-packages/prodigy/", line 44, in main
    controller = run_recipe(run_args)
  File "cython_src/prodigy/cli.pyx", line 135, in prodigy.cli.run_recipe
  File "cython_src/prodigy/core.pyx", line 155, in prodigy.core.Controller.from_components
  File "cython_src/prodigy/core.pyx", line 307, in prodigy.core.Controller.__init__
  File "cython_src/prodigy/components/stream.pyx", line 191, in
  File "cython_src/prodigy/components/stream.pyx", line 230, in
  File "cython_src/prodigy/components/stream.pyx", line 343, in
  File "cython_src/prodigy/components/source.pyx", line 755, in load_noop
  File "cython_src/prodigy/components/source.pyx", line 109, in __iter__
  File "cython_src/prodigy/components/source.pyx", line 110, in prodigy.components.source.Source.__iter__
  File "cython_src/prodigy/components/source.pyx", line 365, in read
  File ".venv/lib/python3.12/site-packages/prodigy/recipes/llm/", line 254, in make_stream
    {"id": name1, "text": candidate1.generate(**input_data)},
  File ".venv/lib/python3.12/site-packages/prodigy/recipes/llm/", line 37, in generate
    return doc._.response.strip()
AttributeError: 'list' object has no attribute 'strip'

:man_walking:t4: Reproduction steps:

./configs/config1.cfg (excerpt):

@llm_tasks = "prodigy.TextPrompter.v1"

@llm_models = "spacy.GPT-3-5.v1"
config = {"temperature": 0.3}

Prompt template (in ./prompts):

Write a haiku about {{topic}} that rhymes.

inputs.jsonl:

{"topic": "Python"}
{"topic": "star wars"}

Command:

dotenv run -- prodigy ab.llm.tournament test inputs.jsonl ./prompts ./configs/config1.cfg
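For context, the recipe fills the {{topic}} placeholder in the prompt template from each line of inputs.jsonl. A rough stdlib-only sketch of that substitution (not Prodigy's actual implementation):

```python
import json

TEMPLATE = "Write a haiku about {{topic}} that rhymes."

def render_prompts(jsonl_lines, template=TEMPLATE):
    """Fill {{key}} placeholders in the template from each JSONL record."""
    prompts = []
    for line in jsonl_lines:
        record = json.loads(line)
        prompt = template
        for key, value in record.items():
            prompt = prompt.replace("{{" + key + "}}", str(value))
        prompts.append(prompt)
    return prompts

print(render_prompts(['{"topic": "Python"}', '{"topic": "star wars"}']))
```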

:desktop_computer: Environment:
Version           1.15.0
License Type      Prodigy Personal
Platform          macOS-14.3.1-arm64-arm-64bit
Python Version    3.12.2
spaCy Version     3.7.4
Database Name     SQLite
Database Id       sqlite
Total Datasets    1
Total Sessions    2

doc._.response is a list for some reason. I tried a few different OpenAI models (chat and non-chat) and different versions of @llm_models ("spacy.GPT-3-5.v1", "spacy.GPT-4.v2", etc.). Modifying line 36 so that generate returns doc._.response[0].strip() works around the problem in my case.
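Until a fix ships, a more defensive version of that workaround would handle both the old string shape and the new list shape. A sketch (normalize_response is a hypothetical helper name, not part of Prodigy):

```python
def normalize_response(response):
    """Return a stripped string from doc._.response, whether spacy-llm
    produced a plain string (older versions) or a list of strings (newer)."""
    if isinstance(response, list):
        response = response[0] if response else ""
    return response.strip()

# In the recipe's generate(), `return doc._.response.strip()` would
# then become `return normalize_response(doc._.response)`.
```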

Thanks for reporting the issue @langdonholmes! It turns out that the model's return type changed from an iterable to a list of iterables between spacy-llm 0.6.4 and 0.7.0, and our tests missed that.
The issue has been added to our tracker.
