Ran into an error launching prodigy. Installation was fine (1.10.1) but then when I try the prodigy command from my bash terminal I get:
"ImportError: libpyton3.7.so.1.0: cannot open share object file: No such file or directory".
It's a Linux environment, but I'm not sure of the details, as this is consulting work on a platform that spins up the environment in the background. That said, Python 3.7 works fine when I launch the REPL.
Apologies for not being able to share a screenshot of the trace (it's against the rules of the platform), but any ideas are much appreciated.
Hi! This sounds like a problem with the environment and if you're lucky, it might just be something missing on your path or a missing link/env variable.
Did you double-check that the Python you're running with python is the same one used in your Prodigy env? And are you running Prodigy with python -m prodigy? Also, what happens if you do apt-get install libpython3.7? (A rough sketch of these checks is below.) There's also this thread, which discusses pretty much the same problem (just 2.7 instead of 3.7) and has some suggestions you could try:
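A rough sketch of those checks, assuming a Debian-style system for the apt-get part (the package name may differ on your distro):

# confirm which interpreter is on your PATH and its version
which python
python --version

# run Prodigy through that same interpreter
python -m prodigy stats

# only if you have root and the shared library is missing system-wide
sudo apt-get install libpython3.7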
In the environments we get, we can't use apt-get. I also had some feedback from the corp IT folks who maintain the environments. Wondering if this points to a solution? It might also be interesting for the Prodigy devs to know about this behaviour. Quoting:
The prodigy wheel just needs that .so file, and we don't have that file because we install Python without the --enable-shared option. We can consider adding that, but it might take some time. Let me see if I can find a quick workaround.
Just for sanity, could you double-check that launching the REPL and running import spacy or import prodigy works? Another good pure-Python one to test is import wasabi. That's a small pure-Python library that wouldn't have any shared-library encumbrances; if that's not available, the installation didn't succeed correctly.
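For instance, something quick like this from the activated env (printing __file__ just to confirm where each package is loaded from):

python -c "import wasabi; print(wasabi.__file__)"
python -c "import spacy; print(spacy.__file__)"
python -c "import prodigy; print(prodigy.__file__)"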
If you can't use apt-get, you can still install a Python toolchain in your user directory via pyenv, which wouldn't require root access. You'd need the build-essential package installed, though. Another option is to set up your environment on a less locked-down machine, using a container or a VM configured to match your actual environment. The most important things to verify are that the same Python version is used, that it's available in the same place, and that you're building the virtualenv in the same filesystem path. You should then be able to copy the virtualenv from the container into your deployment environment.
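If you do try pyenv, a rough sketch of what that could look like (the version number and paths here are just examples, and you'd want the shared library enabled at build time so the wheel can find libpython):

# install pyenv into your home directory (no root required)
curl https://pyenv.run | bash

# build a Python 3.7 that ships libpython as a shared object
env PYTHON_CONFIGURE_OPTS="--enable-shared" pyenv install 3.7.4

# create the virtualenv in the same filesystem path you'd use on the target machine
~/.pyenv/versions/3.7.4/bin/python -m venv /path/to/venv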
In the end I just complained, and I think the admins are now installing Python in my environments with --enable-shared. Way over my head, but problem solved. Unfortunately I can't reproduce the error now, so I can't run the tests you suggested.
Apologies for not checking back sooner, I imagine that might have been useful feedback for you guys. If I get any more info from the admins I'll share here.
Hi, just to raise this again: I'm encountering the same issue with the latest Prodigy version, 1.11.6, on Linux.
(prodigy37) jeremy@Jeremy-PC:~/Desktop/financial-nlp/prodigy/custom-data/custom-interface-tutorial$ python
Python 3.7.4 (default, Aug 13 2019, 20:35:49)
[GCC 7.3.0] :: Anaconda, Inc. on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import prodigy
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/jeremy/anaconda3/envs/prodigy37/lib/python3.7/site-packages/prodigy/__init__.py", line 1, in <module>
from .util import init_package
ImportError: libpython3.7m.so.1.0: cannot open shared object file: No such file or directory
However, when I revert to the good old 1.10.8, I do not encounter this error. I'm using an Anaconda environment.
Usually this is caused by either (a) not having the correct Python version, or (b) not setting LD_LIBRARY_PATH so the loader can find Python's shared library. It seems your case is the latter. Try doing something like this:
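# point the dynamic loader at the directory containing libpython3.7m.so.1.0
# (the path below is a placeholder; adjust it to your own install)
export LD_LIBRARY_PATH=/path/to/your/python/lib:$LD_LIBRARY_PATH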
In my experience, if you're using Anaconda, the path is usually:
/path/to/conda/envs/your_env/lib
If this solves your problem, a more permanent fix might be to include that export command whenever a conda environment is activated, something like this:
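# one option: drop an activation script into conda's documented hook location
# (run this once with the env activated; $CONDA_PREFIX points at the env)
mkdir -p $CONDA_PREFIX/etc/conda/activate.d
echo 'export LD_LIBRARY_PATH=$CONDA_PREFIX/lib:$LD_LIBRARY_PATH' > $CONDA_PREFIX/etc/conda/activate.d/env_vars.sh

# or, with a recent conda, let conda manage the variable itself
# (the env path here is a placeholder; re-activate the env afterwards)
conda env config vars set LD_LIBRARY_PATH=/path/to/conda/envs/your_env/lib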