When I try to run the openai CLI on Windows 10, using Anaconda, this is what I get:
‘openai’ is not recognized as an internal or external command, operable program or batch file
I have the most up-to-date copy of the API, and I can access and use it through Python code. Can someone help me figure out what I may be missing here?
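For reference, here's roughly what I've checked in the Anaconda Prompt so far (a minimal sketch; the install path below is only a guess for a default Anaconda setup and will differ per machine):

:: see whether the executable is on PATH at all
where openai
:: confirm the package is installed and print its Location
pip show openai
:: if openai.exe sits in the environment's Scripts folder, that folder must be on PATH:
set PATH=%PATH%;C:\Users\me\anaconda3\Scripts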
In Colab, the “!” prefix is the way to access the command line, and it’s a Linux-based system.
Example: !ls is used to list files and directories in the current directory. Plain ls will not work.
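As a quick sketch of what that looks like for the openai CLI in a Colab cell (nothing here beyond the package already discussed; --help is just a safe smoke test for an argparse-based CLI):

# install the package; pip puts the entry point on the Colab VM's PATH
!pip install openai
# verify the CLI resolves and print its usage
!openai --help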
Thanks for the effort to help, though. My regards.
@m-a.schenk not yet. Is “tools” an application? Surely there must be an executable, or can we run it via python tools.py or something like that? The comment about adding a system path variable is also confusing. I’ve added the API key as an OPEN_AI_KEY variable for good measure, but that’s not a path variable and not the issue (I don’t think). So, when adding a path variable, what path should it point to? I’ve run pip install openai, so I now have an openai folder inside my python39/Lib folder, but I don’t see anything that resembles “tools” or “prepare-data”. I’ll admit, I know nothing of Python.
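For context, the command I’m ultimately trying to get working, taken from the fine-tuning docs, is the one below (the file name is just a placeholder; “tools” is a subcommand of the openai script, not a separate file):

openai tools fine_tunes.prepare_data -f training_data.jsonl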
Thank you @m-a.schenk, I’d never used Colab, had no idea how simple it would be. Tried it, worked first time, so I’ll just run the openai tool online whenever I need it.
Indeed it doesn’t answer the original question; sorry for hijacking your thread.
And cool, I’ll keep WSL in the back of my mind, thanks for mentioning it. Colab seems simpler right this moment; perhaps once I get to large training files, I’ll revisit this same question…
To be honest, I ran into different issues. Running in Anaconda, I couldn’t “see” openai, so I ran the pip install inside Anaconda, had missing packages, and then ran into access issues trying to install those packages…
Virtual environments (venv, Anaconda, etc.) get you a long way there. With slightly more configuration, Docker containers are also extremely powerful.
No need for an internet connection; you can just reset and move on, as you said. Once you are successful, the entire container can be deployed at scale as well.
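For what it’s worth, a minimal sketch of the venv route on Windows (the environment name is arbitrary):

:: create and activate an isolated environment
python -m venv openai-env
openai-env\Scripts\activate
:: install the package; inside the activated environment,
:: its Scripts folder is on PATH, so "openai" resolves
pip install openai
openai --help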
Besides, for data science work, using a cloud-hosted virtual machine has the extra hassle of uploading the data sets and downloading the model to be deployed. The dataset I am currently working on is a 12 GB text file with over 300 million rows. It would be hard to upload it somewhere while I figure out how to slice and dice it for analysis.
Thanks for posting this, I had the same issue, and you led me to the (start of the) solution. I went to the Scripts folder and just had to add “python” before “openai”, and it worked.
However, I’m still stuck at the point of adding the API key. I get an “[Errno 2] No such file or directory” error, and nothing seems to make it work.
Any ideas?
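For completeness, this is how I’ve been setting the key before running the command (OPENAI_API_KEY is the variable name the library reads by default; the key and the file name below are placeholders):

set OPENAI_API_KEY=sk-YOURKEYHERE
python openai tools fine_tunes.prepare_data -f training_data.jsonl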
Can you try passing the key directly “in the clear” rather than using the environment variable approach?
python openai --api-key sk-YOURKEYHERE --verbose
I assume that you are within the Scripts folder. Set it to verbose to see what comes up.
HTH.
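For example, with a subcommand attached (a sketch only; the tools subcommand is from the fine-tuning workflow and the file name is a placeholder):

python openai --api-key sk-YOURKEYHERE --verbose tools fine_tunes.prepare_data -f training_data.jsonl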