Embedding data seems to be cached

Hi!

I’m using Pinecone as my vector store, and even after deleting the index/namespace data from there, my results from OpenAI’s API are still polluted by it.

If I query Pinecone directly, the data is not there. If I take Pinecone out of the equation entirely (I’m using LangChain, by the way), it still somehow “remembers” that old information.

This would be great if it weren’t for the fact that it’s remembering bad data.

I don’t know where else to look for help on this. Are the embeddings also somehow “stored” on OpenAI’s side and/or tied to my API key? I’ve even tried changing models (from 4 to 3) and the bad data is still there.

No. Embeddings are not stored on OpenAI’s side, and they are not tied to your API key.

It’s most likely a bug in your code. Perhaps it references a different index or namespace than the one you are deleting and querying. A quick sanity check like the sketch below can confirm which index and namespace your retriever is actually hitting.
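Here is a minimal sketch of that sanity check, assuming the Pinecone Python SDK (v3+) and the langchain-pinecone integration; the index name, namespace, and query string are placeholders, not anything from your setup:

```python
# Sanity check: confirm which index/namespace the retriever actually hits,
# and whether the "deleted" data is really gone at the Pinecone level.
from pinecone import Pinecone
from langchain_openai import OpenAIEmbeddings
from langchain_pinecone import PineconeVectorStore

pc = Pinecone(api_key="YOUR_PINECONE_API_KEY")       # placeholder key
index = pc.Index("my-index")                          # placeholder index name

# Shows per-namespace vector counts. If the namespace you think you deleted
# still reports vectors here, the delete call targeted something else.
print(index.describe_index_stats())

vectorstore = PineconeVectorStore(
    index=index,
    embedding=OpenAIEmbeddings(),
    namespace="my-namespace",  # must match the namespace you deleted from
)

# If this still returns the old documents, the problem is on the retrieval
# side of your code, not OpenAI caching anything.
print(vectorstore.similarity_search("a query that used to return the bad data", k=3))
```

If the stats show zero vectors and the similarity search comes back empty, then the old information is entering your prompt somewhere else in the chain, for example a retriever or memory object constructed against a different index.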