I want to run a new evaluation for a published prompt we have saved in our workspace. The easiest way to do this seems to be the “Evaluate” feature on the prompt’s page, since it automatically generates the run with the prompt’s correct configuration.
The issue I’m running into is that the only way to add test data appears to be one row at a time, entering the information manually. Is there a way to upload a CSV or JSONL file in this flow?
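To illustrate what I’m hoping to upload, here’s a rough sketch of how my existing test data could be turned into JSONL locally. The file and column names (`input`, `expected_output`) are just placeholders for my own data, not any official schema for the Evaluate feature; the question is really about getting a file like this into the flow without re-typing each row.

```python
# Rough sketch: convert a CSV of test cases into JSONL, one JSON object per line.
# File and column names are placeholders for my own data, not a product schema.
import csv
import json

with open("test_cases.csv", newline="", encoding="utf-8") as src, \
        open("test_cases.jsonl", "w", encoding="utf-8") as dst:
    for row in csv.DictReader(src):
        dst.write(json.dumps({
            "input": row["input"],
            "expected_output": row["expected_output"],
        }) + "\n")
```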