How to safely handle aborted tool calls when using OpenAI Conversations API?

OpenAI requires every tool call to be followed by a corresponding tool output.
If a tool output is missing, the API returns the error:

```
No tool output found for function call.
```
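To make the pairing concrete, here is the shape of the two items involved (the field names follow the OpenAI function-calling item schema as I understand it — `function_call` and `function_call_output` linked by `call_id` — so treat the exact fields as an assumption to verify against the current API reference):

```typescript
// A tool call item and its answer, as stored in a conversation.
const call = {
  type: "function_call",
  call_id: "call_123",
  name: "get_weather",
  arguments: '{"city":"Paris"}',
};
const output = {
  type: "function_call_output",
  call_id: "call_123",
  output: '{"temp_c":18}',
};

// The pairing key is call_id: every function_call must be matched by a
// function_call_output with the same call_id, or later requests fail.
const paired = call.call_id === output.call_id;
```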

I’m storing chat history using OpenAI’s Conversations API. The problem arises when a request is aborted (e.g. a backend crash, a network cutoff, or user cancellation). In that case the tool call input may already be persisted in the conversation while the corresponding tool output is never recorded. This leaves the conversation in an invalid state and causes every subsequent OpenAI call against it to fail.

I’ve tried two approaches:

  1. Appending a synthetic tool output (e.g. “Request aborted”)
  2. Deleting the orphaned tool input from conversations
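For reference, the repair logic behind approach 1 can be sketched as a pure function over the item list. `buildSyntheticOutputs` is my own helper name, and the field names are assumed from the function-calling item schema; the result would then be appended with something like `openai.conversations.items.create(conversationId, { items })`:

```typescript
type Item = {
  type: "function_call" | "function_call_output" | string;
  call_id?: string;
  output?: string;
};

// Find function_call items with no matching function_call_output and build a
// synthetic "aborted" output for each one (approach 1).
function buildSyntheticOutputs(items: Item[]): Item[] {
  const answered = new Set(
    items
      .filter((i) => i.type === "function_call_output")
      .map((i) => i.call_id)
  );
  return items
    .filter((i) => i.type === "function_call" && !answered.has(i.call_id))
    .map((i) => ({
      type: "function_call_output",
      call_id: i.call_id,
      output: JSON.stringify({ error: "Request aborted before the tool ran." }),
    }));
}
```

Approach 2 is the mirror image: take the same orphan list and call `openai.conversations.items.delete(...)` on each orphaned input instead of appending an output.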

Both approaches work in theory, but the Conversations API introduces a race condition: sometimes the tool input has not yet been persisted when the abort handler runs, so there is nothing to find or clean up at that point.
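The only workaround I’ve come up with for the race is to poll the item list with a short delay before repairing, so the abort handler doesn’t run ahead of the write that persists the tool input. A sketch (untested against the real API; `listItems` stands in for a call like `openai.conversations.items.list(conversationId)`):

```typescript
type Item = { type: string; call_id?: string };

// Poll until the orphaned function_call shows up in the conversation, or give
// up after `attempts` tries. Returns the item if found, undefined otherwise.
async function waitForToolCall(
  listItems: () => Promise<Item[]>,
  callId: string,
  attempts = 5,
  delayMs = 200
): Promise<Item | undefined> {
  for (let i = 0; i < attempts; i++) {
    const hit = (await listItems()).find(
      (it) => it.type === "function_call" && it.call_id === callId
    );
    if (hit) return hit;
    await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
  return undefined;
}
```

This feels like a band-aid, though: it adds latency to every abort and still fails if persistence takes longer than the polling window.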

I’m using Vercel’s AI SDK and have tried handling this in `onAbort`, `onFinish`, and `prepareStep`, but none of them reliably expose the tool call inputs — neither through the callback arguments nor through the conversation item list.

What is the recommended way to handle aborted tool calls when using the Conversations API, so that the conversation does not end up corrupted?