Bug description:
The ab.llm.tournament recipe expects doc._.response to be a string but receives a list, so the .strip() call in generate raises an AttributeError.
(.venv) holme2@SEJGQV7HJ0 prodigy % dotenv run -- prodigy ab.llm.tournament test inputs.jsonl ./prompts ./configs/config1.cfg
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File ".venv/lib/python3.12/site-packages/prodigy/__main__.py", line 50, in <module>
    main()
  File ".venv/lib/python3.12/site-packages/prodigy/__main__.py", line 44, in main
    controller = run_recipe(run_args)
                 ^^^^^^^^^^^^^^^^^^^^
  File "cython_src/prodigy/cli.pyx", line 135, in prodigy.cli.run_recipe
  File "cython_src/prodigy/core.pyx", line 155, in prodigy.core.Controller.from_components
  File "cython_src/prodigy/core.pyx", line 307, in prodigy.core.Controller.__init__
  File "cython_src/prodigy/components/stream.pyx", line 191, in prodigy.components.stream.Stream.is_empty
  File "cython_src/prodigy/components/stream.pyx", line 230, in prodigy.components.stream.Stream.peek
  File "cython_src/prodigy/components/stream.pyx", line 343, in prodigy.components.stream.Stream._get_from_iterator
  File "cython_src/prodigy/components/source.pyx", line 755, in load_noop
  File "cython_src/prodigy/components/source.pyx", line 109, in __iter__
  File "cython_src/prodigy/components/source.pyx", line 110, in prodigy.components.source.Source.__iter__
  File "cython_src/prodigy/components/source.pyx", line 365, in read
  File ".venv/lib/python3.12/site-packages/prodigy/recipes/llm/ab.py", line 254, in make_stream
    {"id": name1, "text": candidate1.generate(**input_data)},
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File ".venv/lib/python3.12/site-packages/prodigy/recipes/llm/ab.py", line 37, in generate
    return doc._.response.strip()
           ^^^^^^^^^^^^^^^^^^^^
AttributeError: 'list' object has no attribute 'strip'
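The failure can be reproduced outside Prodigy with spaCy's custom-extension API. The snippet below is a standalone sketch, not the recipe's actual code; the list value mimics what the traceback shows the llm component writing:

repro.py [standalone sketch]
import spacy
from spacy.tokens import Doc

# Register the same custom attribute the llm component writes to.
Doc.set_extension("response", default=None)

doc = spacy.blank("en")("placeholder text")

# Assumption based on the traceback: the model stores its completions
# as a list of strings rather than a single string.
doc._.response = ["generated text"]
doc._.response.strip()  # AttributeError: 'list' object has no attribute 'strip'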
Reproduction steps:
./configs/config1.cfg [excerpt]
[components.llm.task]
@llm_tasks = "prodigy.TextPrompter.v1"
[components.llm.model]
@llm_models = "spacy.GPT-3-5.v1"
config = {"temperature": 0.3}
./prompts/prompt1.jinja2
Write a haiku about {{topic}} that rhymes.
./inputs.jsonl
{"topic": "Python"}
{"topic": "star wars"}
Command:
dotenv run -- prodigy ab.llm.tournament test inputs.jsonl ./prompts ./configs/config1.cfg
Environment:
Version          1.15.0
License Type     Prodigy Personal
Platform         macOS-14.3.1-arm64-arm-64bit
Python Version   3.12.2
spaCy Version    3.7.4
Database Name    SQLite
Database Id      sqlite
Total Datasets   1
Total Sessions   2
doc._.response is a list rather than a string. I tried a few different OpenAI models (chat and non-chat) and different versions of @llm_models ("spacy.GPT-3-5.v1", "spacy.GPT-4.v2", etc.), with the same result. Changing line 37 of prodigy.recipes.llm.ab to return doc._.response[0].strip() works around the problem in my case.
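A slightly more defensive version of that edit tolerates both shapes, in case other model versions still return a plain string. This is only a sketch of a local workaround, and _normalize_response is my own name, not anything in Prodigy:

def _normalize_response(response):
    # spacy-llm seems to hand back either a plain string or a list of
    # completion strings depending on the model version; return a
    # single stripped string in both cases.
    if isinstance(response, list):
        response = response[0] if response else ""
    return response.strip()

# line 37 of prodigy.recipes.llm.ab would then become:
#     return _normalize_response(doc._.response)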