I’m trying to generate multiple JSON responses from the model, but I seem to get a weird response.
completion = self.openai_client.chat.completions.create(
    model="gpt-4-1106-preview",
    messages=prompt,
    response_format={"type": "json_object"},
)
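For anyone who wants to reproduce this, here is a minimal self-contained sketch of the same call (the client setup and the trivial placeholder messages stand in for my actual agent code):

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Placeholder messages; my real prompt is the agent-routing one shown below.
prompt = [
    {"role": "system", "content": "Reply with a JSON object."},
    {"role": "user", "content": "please generate one test message for python and another for shell"},
]

completion = client.chat.completions.create(
    model="gpt-4-1106-preview",
    messages=prompt,
    response_format={"type": "json_object"},
)
print(completion.choices[0].message.content)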
The prompt is:

[
    {'role': 'system',
     'content': 'Your response must include at least one message directed to another Agent. The message should follow this json format:\n{"receiver": "agent name", "message_type": "type_of_message", "content": {"key": "value"}}.\nHere are the available agents you can communicate with:\nBootAgent: Launches an agent from a Python script., Accepts Message Types: execute_command; BaseLLMAgent: A base agent for LLMs., Accepts Message Types: input; python: Execute python code. Use \'print()\' to display the execution results., Accepts Message Types: execute_command; shell: Execute shell commands, Accepts Message Types: execute_command; human: Interact with a human user. Send message to this agent when the final results are known from previous interactions or when further assistance is required., Accepts Message Types: input'},
    {'role': 'system',
     'content': 'The following message types are available with their required fields:\n{"error_message": ["error"], "execute_command": ["command"], "input": ["text"], "shutdown": []}'},
    {'role': 'user',
     'content': 'please generate one test message for python and another for shell'}
]
and the response is:

[{'role': 'assistant', 'content': '{"receiver": "python", "message_type": "execute_command", "content": {"command": "print(\'This is a test message to the python agent.\')" }}\n\n \n \n\n\n \n\n\n \n\n\n\n\n ...'}]
The run of ‘\n’ (trimmed above) lasts several pages and contains only one JSON response, and it takes about 5 minutes to get this answer.
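In the meantime, a guard I’ve sketched (damage control rather than a fix, and it assumes the runaway tail is pure whitespace) is to stream the response and abort once a long whitespace run appears; the 200-character threshold is arbitrary:

stream = client.chat.completions.create(
    model="gpt-4-1106-preview",
    messages=prompt,
    response_format={"type": "json_object"},
    stream=True,
)

parts = []
trailing_ws = 0
for chunk in stream:
    if not chunk.choices:
        continue
    delta = chunk.choices[0].delta.content
    if not delta:
        continue
    parts.append(delta)
    if delta.strip():
        trailing_ws = 0  # real content resets the counter
    else:
        trailing_ws += len(delta)
        if trailing_ws > 200:  # arbitrary cutoff for a runaway whitespace run
            stream.close()
            break

content = "".join(parts).strip()

Capping max_tokens on the request would bound the damage as well, at the cost of possibly truncating legitimate output.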
For what it’s worth, I tried again with JSON mode off, and this time everything worked fine.
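My current guess is that JSON mode is designed to emit exactly one top-level JSON object, so asking for two separate messages fights that constraint (the docs also warn that JSON mode can emit an unending stream of whitespace when the prompt doesn’t steer it firmly toward a single JSON output). The restructuring I plan to try is to wrap all the messages in one object; the "messages" wrapper key here is my own convention, not anything the API requires:

system = (
    'Your response must be a single JSON object of the form {"messages": [...]}, '
    'where each element of "messages" follows this format:\n'
    '{"receiver": "agent name", "message_type": "type_of_message", "content": {"key": "value"}}'
)

completion = client.chat.completions.create(
    model="gpt-4-1106-preview",
    messages=[
        {"role": "system", "content": system},
        {"role": "user", "content": "please generate one test message for python and another for shell"},
    ],
    response_format={"type": "json_object"},
)

That keeps the reply inside one valid JSON object, which is what json_object mode is built to produce, while still carrying multiple agent messages.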