The models are distributed globally on the Azure platform; you will be connected to whichever endpoint Azure routes you to. In practice, the model's generation latency is an order of magnitude greater than any network ping latency you will encounter.
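You can check this for yourself by timing the network round trip separately from a full completion. A minimal sketch with a reusable timing helper; the two fake calls below are stand-ins with illustrative delays (in a real test you would wrap an HTTPS request to the API host and an actual completion call):

```python
import time

def timed(fn, *args, **kwargs):
    """Run fn and return (result, elapsed seconds)."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    return result, time.perf_counter() - start

# Stand-ins for illustration only; the delays are assumptions,
# not measurements of any specific region or model.
def fake_ping():
    time.sleep(0.08)   # ~80 ms: a typical long-haul network round trip

def fake_completion():
    time.sleep(1.0)    # generation time dominates the total latency

_, ping_s = timed(fake_ping)
_, gen_s = timed(fake_completion)
print(f"generation took ~{gen_s / ping_s:.0f}x the network round trip")
```

Because generation dominates, shaving tens of milliseconds off the route to a closer region changes the end-to-end latency very little.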
Related topics
| Topic | Replies | Views | Activity |
|---|---|---|---|
| Is it possible to run API against server located within EU? | 10 | 10538 | November 30, 2024 |
| Is OpenAI planning to host their API service in different regions? | 4 | 1732 | August 20, 2024 |
| API latency when backend is hosted on Azure? | 7 | 5075 | November 3, 2023 |
| What's the max latency for OpenAI models accessed via Azure with provisioned throughput units? | 4 | 1453 | March 2, 2024 |
| Models hosted in the EU any time soon? | 3 | 4459 | August 26, 2024 |