Handling connectivity issues with offline support

Expanding OpenAI’s functionality to include offline support could greatly empower individuals with dysphasia, providing swift, reliable help with administrative tasks and note-taking even without internet access. This would be especially useful where connectivity is inconsistent, such as coffee shops or areas with poor 4G reception. A GPT-4 that could quickly recall and respond using the history of our conversations, without needing an online connection, would allow uninterrupted and efficient assistance. It would be a substantial step forward, enabling continuous support regardless of location or network availability.

AI responses also need to be quicker to support smoother, more natural conversations, especially when using voice commands. An update focused on speed and natural dialogue interaction would significantly improve the user experience.

Hi and welcome to the Dev Community!

I’m afraid that’s not really how this stuff works. You would need a huge local server full of GPUs to get even the slowest inference offline.

GPT-4 is an LLM (large language model) and takes a very large amount of computational power to do anything. You could always look into running a smaller LLM locally, but it will be much “dumber” and probably slower.
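
If you want to experiment with that route, here’s a minimal sketch of running a small model offline with the Hugging Face `transformers` library. The model name (`distilgpt2`) is just an example of something small enough for consumer hardware; the weights are downloaded once and everything after that runs without a connection, though the quality is nowhere near GPT-4.

```python
# Minimal sketch: running a small language model locally with Hugging Face transformers.
# distilgpt2 is only an example model; swap in any small model you can fit in memory.
from transformers import pipeline

# The first call downloads the weights; after that, inference works fully offline.
generator = pipeline("text-generation", model="distilgpt2")

prompt = "Draft a short reminder note about tomorrow's appointment:"
result = generator(prompt, max_new_tokens=60, do_sample=True)

print(result[0]["generated_text"])
```

Tools like llama.cpp or Ollama follow the same idea with larger quantized models, which gets you somewhat better quality while still running on local hardware.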

Good idea and maybe we’ll get there in the future, but currently it’s not feasible.
