For the realtime API, how can I feed it a vector store for additional context? I have historically trained an OpenAI Assistant with a vector store but the latency and lack of speech-to-speech is a drawback.
Do you have access to the realtime endpoint? So far it doesn't seem like anyone has gotten access to it. I'd imagine it would work the same way as other requests: inputs and outputs get routed through the vector store.
No, not yet; I'm just trying to wrap my head around it. The launch announcement mentions that function calling is available, and function calling falls under the umbrella of "tools" in the Assistants docs. In the reference client GitHub repo linked in the product launch blog post, I see them using tools with the attribute type: 'function', which makes me think they either already support file_search or will in the near future:
```js
client.updateSession({
  tools: [
    {
      type: 'function',
      name: 'get_weather',
      description:
        'Retrieves the weather for a given lat, lng coordinate pair. Specify a label for the location.',
      parameters: {
        type: 'object',
        properties: {
          lat: {
            type: 'number',
            description: 'Latitude',
          },
          lng: {
            type: 'number',
            description: 'Longitude',
          },
          location: {
            type: 'string',
            description: 'Name of the location',
          },
        },
        required: ['lat', 'lng', 'location'],
      },
    },
  ],
});
```
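Until (or unless) file_search is supported natively, one workaround is to expose retrieval as a function tool yourself: register a tool definition like the one above (say, a `search_docs` function taking a query), and have its handler search your own vector store and return the top snippets as the function output. Here's a minimal, self-contained sketch of what that handler could look like; the names (`searchDocs`, `DOCS`) and the toy 3-dimensional embeddings are assumptions for illustration, and in practice you'd embed the query with an embeddings API call rather than pass a vector in directly:

```js
// Toy in-memory "vector store": snippets with precomputed embeddings.
// (Hypothetical data; a real store would hold model-generated embeddings.)
const DOCS = [
  { text: 'Refund policy: refunds within 30 days.', embedding: [1, 0, 0] },
  { text: 'Shipping: orders ship in 2-3 business days.', embedding: [0, 1, 0] },
  { text: 'Support hours: 9am-5pm weekdays.', embedding: [0, 0, 1] },
];

// Cosine similarity between two equal-length vectors.
function cosine(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Hypothetical tool handler: rank snippets by similarity to the
// query embedding and return the top-k texts. The realtime model
// would receive this return value as the function call's output.
function searchDocs(queryEmbedding, topK = 2) {
  return DOCS
    .map((d) => ({ text: d.text, score: cosine(queryEmbedding, d.embedding) }))
    .sort((a, b) => b.score - a.score)
    .slice(0, topK)
    .map((d) => d.text);
}
```

The model then decides when to call the tool mid-conversation, so you get retrieval on demand without blocking every turn on a search.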