Any update on this? I've been waiting a while now with no communication from OpenAI about it…
Was there a reason for the removal of this feature in v2? Too much hallucination?
Either way, I would also like to “vote” to have this feature implemented. On a side note, Google’s NotebookLM product which does this very well will have an API soon. May switch over to that when it’s ready.
You can now fetch the chunks used in file_search from a Run: https://platform.openai.com/docs/assistants/tools/file-search#improve-file-search-result-relevance-with-chunk-ranking.
from openai import OpenAI

client = OpenAI()

# Ask for the file_search result chunks to be included in the run step details
run_step = client.beta.threads.runs.steps.retrieve(
    thread_id="thread_abc123",
    run_id="run_abc123",
    step_id="step_abc123",
    include=["step_details.tool_calls[*].file_search.results[*].content"]
)
You could use this to construct citations, using the original files as the reference.
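As a rough sketch of that citation step, the helper below formats a list of file_search result chunks into numbered citations. The result shape assumed here (`file_name`, `score`, and a `content` list of `{"type": "text", "text": ...}` parts) mirrors what the `include=` option returns, but treat the exact field names as an assumption and check them against your own run step output.

```python
# Sketch: turn file_search result chunks into simple numbered citations.
# The dict shape below is an assumption modeled on the API response;
# verify it against run_step.step_details in your own runs.

def build_citations(results):
    """Return a list of citation dicts for file_search result dicts."""
    citations = []
    for i, result in enumerate(results, start=1):
        # Join the text parts of the chunk into one preview string
        snippet = " ".join(
            part.get("text", "")
            for part in result.get("content", [])
            if part.get("type") == "text"
        )
        citations.append({
            "marker": f"[{i}]",
            "file_name": result.get("file_name"),
            "score": result.get("score"),
            "snippet": snippet[:200],  # keep the preview short
        })
    return citations

# Example with hypothetical data shaped like the API response:
sample = [
    {"file_name": "report.pdf", "score": 0.92,
     "content": [{"type": "text", "text": "Q3 revenue grew 12%."}]},
]
print(build_citations(sample)[0]["marker"])  # → [1]
```

From there you could append the markers to the assistant's answer text and list the cited file names underneath, much like NotebookLM does.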