Hello!
Thanks for making ChatGPT-4; it is very helpful!
However, it has one big shortcoming that should be easy to solve: with two or three developers working for a few weeks, and perhaps ten systems engineers, surely OpenAI could temporarily train a smaller model to answer queries against the latest raw API documentation. I realize the static nature of ChatGPT-4 would make this a huge task for the main model, but a separate, smaller model could handle it.
I am trying to get the following code running:
#!/usr/bin/env python3
import openai

def get_api_info():
    # Read the API key from a local file and configure the client
    with open("OpenAI_API_KEY.txt") as f:
        api_key = f.read().strip()
    openai.api_key = api_key
    client = openai.Client(api_key=api_key)
    info = client.info()
    print(f"Token window size: {info.model_max_tokens} tokens")
    print("Available models:")
    for model in info.models:
        model_info = client.info(model=model)
        max_tokens = model_info.model_max_tokens
        print(f"- {model}: {max_tokens} tokens")

if __name__ == "__main__":
    get_api_info()
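For comparison, here is a sketch of what I believe the current OpenAI Python library (v1+) actually exposes: there is no `client.info()` method, and the closest real call is `client.models.list()`. The API also does not return context-window sizes, so the lookup table below is an assumption for illustration only; the real values would have to come from the model documentation.

```python
# Sketch using the real v1 client API (models.list()); the context-window
# table is assumed for illustration and must be checked against the docs.

KNOWN_CONTEXT_WINDOWS = {
    # Assumed values; verify against the published model documentation.
    "gpt-4": 8192,
    "gpt-3.5-turbo": 4096,
}

def format_model_line(model_id, max_tokens):
    """Format one line of the model listing; max_tokens may be None."""
    if max_tokens is None:
        return f"- {model_id}: context window not published via API"
    return f"- {model_id}: {max_tokens} tokens"

def list_models(key_path="OpenAI_API_KEY.txt"):
    # Import lazily so the formatting helper works without the package installed.
    from openai import OpenAI

    # Same key-file convention as the original script.
    with open(key_path) as f:
        client = OpenAI(api_key=f.read().strip())
    print("Available models:")
    for model in client.models.list():
        print(format_model_line(model.id, KNOWN_CONTEXT_WINDOWS.get(model.id)))
```

This keeps the spirit of the original script while only using calls that exist in the library.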
Thanks!