We built traceAI, open-source LLM tracing for OpenAI apps

Hey everyone 👋

We’ve been building an open-source LLM observability tool called traceAI
and wanted to share it with this community.

It traces every OpenAI API call (inputs, outputs, latency, cost, and
errors) with minimal setup. Useful for debugging and monitoring
production LLM apps.
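To give a sense of what call-level tracing captures, here's a minimal stdlib-only sketch. This is not traceAI's actual API; the `trace` decorator, the `fake_completion` stand-in, and the span field names are all hypothetical, just illustrating the kind of data (inputs, outputs, latency, errors) recorded per call:

```python
import time

def trace(fn):
    """Record inputs, output, latency, and errors for each call.

    Hypothetical sketch of call-level tracing; traceAI's real
    interface may look quite different.
    """
    spans = []

    def wrapped(*args, **kwargs):
        span = {"fn": fn.__name__, "inputs": {"args": args, "kwargs": kwargs}}
        start = time.perf_counter()
        try:
            result = fn(*args, **kwargs)
            span["output"] = result
            return result
        except Exception as exc:
            span["error"] = repr(exc)
            raise
        finally:
            # Always record latency, even on error.
            span["latency_ms"] = (time.perf_counter() - start) * 1000
            spans.append(span)

    wrapped.spans = spans
    return wrapped

@trace
def fake_completion(prompt):
    # Stand-in for a real client.chat.completions.create(...) call.
    return f"echo: {prompt}"

fake_completion("hello")
print(fake_completion.spans[0]["output"])  # echo: hello
```

In a real app the wrapper would sit around the OpenAI client call and ship spans to a backend instead of a local list.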

The full platform launches next week. We'd love feedback from folks
building with OpenAI APIs: what's missing from your observability
stack right now?

The repo is on GitHub; search "future-agi/traceAI" to find it.