How to insert a file uploaded to OpenAI into a conversation

How do I insert a file uploaded to OpenAI into a conversation?

import OpenAI from 'openai';

const client = new OpenAI();

// upload the file, then reference it by id in a Responses API call
const file = await client.files.create({
    file: await fetch(url), // url points at the file to upload
    purpose: 'assistants'
})
console.log(file.id)

const response = await client.responses.create({
    model: 'gpt-4.1',
    input: [
        {
            role: 'user',
            content: [
                { type: 'input_text', text: 'what is in this file?' },
                { type: 'input_file', file_id: file.id }
            ]
        }
    ]
})

thanks

Check this out:

https://platform.openai.com/docs/guides/pdf-files?api-mode=responses

Different file types can have different ways of adding them.
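For example, with the Responses API a PDF is attached as an input_file part, while an image goes in as an input_image part. A rough sketch, reusing the client and file.id from your snippet (the image URL is just a placeholder):

const response = await client.responses.create({
    model: 'gpt-4.1',
    input: [
        {
            role: 'user',
            content: [
                { type: 'input_text', text: 'Describe the attached document and image.' },
                // PDFs and similar documents reference an uploaded file by id
                { type: 'input_file', file_id: file.id },
                // images are passed by URL (or a base64 data URL)
                { type: 'input_image', image_url: 'https://example.com/photo.png', detail: 'auto' }
            ]
        }
    ]
});
console.log(response.output_text);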

Thanks for the answer.

So that means:

const file = await client.files.create({
    file: await fetch(url),
    purpose: 'assistants'
})

is only for files like .pdf and .txt,
and OpenAI separates its parameters for images?

Uploading a file with the purpose user_data is now more canonical, considering that the Assistants endpoint is deprecated.
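That is, the upload itself would look like your snippet, just with the different purpose:

const file = await client.files.create({
    file: await fetch(url),
    purpose: 'user_data'
})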

The built-in user-role input types you have for anything like an “attachment” are:

  • text: plain language
  • vision (images): the AI looking at images encoded for its multimodal understanding; these are placed as parts of a user message (see the sketch after this list)
  • PDF file: these use document extraction, either the text that can be programmatically obtained from the pages or an image of each page. This has been unreliable since its introduction, and the extraction quality is not inspectable.
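For the image case, a minimal sketch of passing a local image as a base64 data URL (the file name is just an example, and the same client as before is assumed):

import fs from 'node:fs';

// read a local image and send it as a base64 data URL in an input_image part
const imageBase64 = fs.readFileSync('chart.png', { encoding: 'base64' });

const response = await client.responses.create({
    model: 'gpt-4.1',
    input: [
        {
            role: 'user',
            content: [
                { type: 'input_text', text: 'What does this chart show?' },
                {
                    type: 'input_image',
                    image_url: `data:image/png;base64,${imageBase64}`,
                    detail: 'auto'
                }
            ]
        }
    ]
});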

Otherwise, an uploaded file can only be employed by file search with a vector store: a multi-turn search for knowledge that the AI model must perform itself via tool calls.
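That route looks roughly like this, assuming a recent openai Node SDK where vector stores are exposed as client.vectorStores:

// put the uploaded file into a vector store, then let the model search it with the file_search tool
const vectorStore = await client.vectorStores.create({ name: 'my-documents' });
await client.vectorStores.files.create(vectorStore.id, { file_id: file.id });
// (in practice, wait for the file to finish processing before querying)

const response = await client.responses.create({
    model: 'gpt-4.1',
    input: 'What does the uploaded document say about pricing?',
    tools: [
        { type: 'file_search', vector_store_ids: [vectorStore.id] }
    ]
});
console.log(response.output_text);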

Building the text yourself is the most reliable method, provided you have appropriate technology in place to read the file and make it presentable and understandable as language, and it is not so long that it ends up distracting the AI.
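In other words, something along these lines, where the extraction step is yours and the file name is just a placeholder:

import fs from 'node:fs';

// extract or prepare the text yourself, then send it as plain input_text
const documentText = fs.readFileSync('report.txt', 'utf8');

const response = await client.responses.create({
    model: 'gpt-4.1',
    input: [
        {
            role: 'user',
            content: [
                { type: 'input_text', text: `Here is the document:\n\n${documentText}` },
                { type: 'input_text', text: 'Summarize the key points.' }
            ]
        }
    ]
});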


Thanks @_j.
Is storage for files uploaded like this charged, or priced differently?

const file = await client.files.create({
    file: await fetch(url),
    purpose: 'assistants'
})

Storage on the files endpoint doesn’t cost you anything. You can run it up to 100GB.

It simply isn’t useful for anything other than AI, as the purposes are “upload but not download” or “download outputs but not upload”.
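If you want to check what is sitting there, you can list your files and total their size; a small sketch using the Node SDK's auto-pagination:

// tally stored files; purposes like 'assistants' / 'user_data' are upload-only on your side
let totalBytes = 0;
for await (const f of client.files.list()) {
    totalBytes += f.bytes;
    console.log(f.id, f.filename, f.purpose, f.bytes);
}
console.log(`Total stored: ${(totalBytes / 1e9).toFixed(2)} GB`);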