As a developer and symbolic systems designer, I’d like the ability to invoke and interconnect multiple custom GPTs (e.g., ‘The Architect’) via the OpenAI Assistants API.
Currently, custom GPTs operate only within the ChatGPT UI and cannot be invoked through the API. If GPTs could be linked as modular logic units, each with its own persona, knowledge base, and tone, developers could build far more capable and coherent multi-assistant systems.
Ideal features:
- GPT-to-GPT calling via tool delegation or routing
- API access to run existing GPTs on demand
- Chaining responses between them with shared context (a rough approximation using today's Assistants API is sketched after this list)
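To make the request concrete, here is a minimal sketch of how the chaining pattern can be approximated today with the Assistants API, which is the closest thing currently callable. The assistant names, instructions, prompts, and model choice are hypothetical placeholders, and `create_and_poll` assumes a recent version of the `openai` Python SDK (older versions need a manual polling loop). Crucially, this only works for Assistants created through the API, not for custom GPTs built in the GPT builder, which is exactly the gap this request is about.

```python
# Sketch: chaining two Assistants on one shared thread (hypothetical names/prompts).
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Two modular "logic units", each with its own persona and instructions.
architect = client.beta.assistants.create(
    name="The Architect",
    instructions="You design high-level system blueprints. Be terse and structural.",
    model="gpt-4o",
)
narrator = client.beta.assistants.create(
    name="The Narrator",
    instructions="You turn technical blueprints into plain-language explanations.",
    model="gpt-4o",
)

# One shared thread carries context across both assistants.
thread = client.beta.threads.create()
client.beta.threads.messages.create(
    thread_id=thread.id,
    role="user",
    content="Design a plugin system for a note-taking app.",
)

# Step 1: run The Architect on the shared thread.
client.beta.threads.runs.create_and_poll(
    thread_id=thread.id, assistant_id=architect.id
)

# Step 2: hand off to The Narrator on the same thread, so it sees the
# Architect's output as shared context.
client.beta.threads.messages.create(
    thread_id=thread.id,
    role="user",
    content="Explain the blueprint above for a non-technical stakeholder.",
)
client.beta.threads.runs.create_and_poll(
    thread_id=thread.id, assistant_id=narrator.id
)

# Messages are returned newest first; print the final, chained response.
messages = client.beta.threads.messages.list(thread_id=thread.id)
print(messages.data[0].content[0].text.value)
```

The routing item above could presumably be layered on the same primitives: a "router" assistant exposes a function tool such as a hypothetical `delegate_to_gpt`, and the orchestrating code fulfills that tool call by running another assistant (or, if this request were granted, a published custom GPT) and returning its reply as the tool output.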
This would make assistants composable building blocks rather than isolated monoliths, in keeping with your stated goal of broadly scalable AI utility.
Thank you for considering this extension of the GPT ecosystem.