Prompt caching now being available for the Realtime API should make this more feasible in commercial deployments.
Hoping to see the service costs come down as well, but this is pretty new, so only time will tell!