Add Support for Ray[Client] to Execute Python Code on External Ray Clusters

I’d love to see support for `ray[client]` added to ChatGPT’s Python environment. This would allow the environment to connect to an external Ray cluster, run workloads on it, and pull the results back seamlessly.
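
For context, here is a minimal sketch of the kind of workflow this would enable. The cluster address is a placeholder; Ray Client connects over the `ray://` scheme, with 10001 as the default client port:

```python
import ray

# Connect to an external Ray cluster via Ray Client.
# "head-node.example.com" is a hypothetical address; 10001 is the default Ray Client port.
ray.init("ray://head-node.example.com:10001")

@ray.remote
def square(x):
    return x * x

# The tasks execute on the remote cluster; only the results are returned locally.
results = ray.get([square.remote(i) for i in range(10)])
print(results)
```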

With this capability, users could offload computational tasks to their own clusters, and it could even enable integration with platforms like Anyscale for more advanced use cases.

This feature would be a game-changer: it would significantly expand ChatGPT’s potential by enabling external Python execution, with minimal additional effort on OpenAI’s side. It’s also worth noting that Claude already supports external code execution, making this an important opportunity to stay competitive.