I tried some kind of meta-programming with the new function calling feature.
It is fun to see that GPT-3.5 can generate a function I ask for in the prompt and then hand it over through the function calling API.
This is the function definition in JSON:
{
  "name": "evaluate_Code",
  "description": "A function to evaluate code to ensure correctness.",
  "parameters": {
    "type": "object",
    "properties": {
      "functionName": {
        "type": "string",
        "description": "The name of the created function to evaluate, e.g. factorial, fibonacci, print, etc."
      },
      "args": {
        "type": "string",
        "description": "The parameters to pass to the function for evaluation, e.g. n: int, x: int"
      },
      "body": {
        "type": "string",
        "description": "The entire function body to evaluate"
      }
    },
    "required": ["functionName", "args", "body"]
  }
}
Here is the prompt I use:
Show me first a python function to calculate the area of a circle. After then call this function for evaluation.
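For completeness, this is roughly how the request can be sent; a minimal sketch assuming the legacy openai Python package (the pre-1.0 openai.ChatCompletion.create interface), not my exact backend code:

import openai

# The evaluate_Code definition from above, passed along with the prompt.
functions = [{
    "name": "evaluate_Code",
    "description": "A function to evaluate code to ensure correctness.",
    "parameters": {
        "type": "object",
        "properties": {
            "functionName": {"type": "string",
                             "description": "The name of the created function to evaluate"},
            "args": {"type": "string",
                     "description": "The parameters to pass to the function for evaluation"},
            "body": {"type": "string",
                     "description": "The entire function body to evaluate"},
        },
        "required": ["functionName", "args", "body"],
    },
}]

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    temperature=0,
    messages=[{
        "role": "user",
        "content": "Show me first a python function to calculate the area of a circle. "
                   "After then call this function for evaluation.",
    }],
    functions=functions,
    function_call="auto",
)

message = response["choices"][0]["message"]
print(message.get("content"))        # the explanation and code the model writes
print(message.get("function_call"))  # the structured call to evaluate_Code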
The response contains both a content part and a function_call:
content:
Sure! Here’s a Python function to calculate the area of a circle:
import math

def calculate_area(radius):
    area = math.pi * radius ** 2
    return area
Now, let me call this function for evaluation.
And this is the function_call sent to my backend:
{
  "name": "evaluate_Code",
  "arguments": "{\"functionName\": \"calculate_area\", \"args\": \"radius: float\", \"body\": \"import math\\n\\narea = math.pi * radius ** 2\\nreturn area\"}"
}
You see, it does exactly what I want.
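On the backend side, handling this call is mostly a matter of parsing the arguments and stitching the function back together. Here is only a minimal sketch of the idea; the helper name handle_evaluate_code is made up, and exec on model-generated code would of course need sandboxing in anything real:

import json

def handle_evaluate_code(arguments_json: str):
    # Parse the arguments string that arrives with the evaluate_Code call.
    args = json.loads(arguments_json)
    # Re-assemble a complete function definition from name, parameters and body.
    body = "\n".join("    " + line for line in args["body"].splitlines())
    source = f"def {args['functionName']}({args['args']}):\n{body}"
    namespace = {}
    exec(source, namespace)  # WARNING: runs model-generated code; sandbox in real use
    return namespace[args["functionName"]]

# Example with the arguments from the response above:
fn = handle_evaluate_code(
    '{"functionName": "calculate_area", "args": "radius: float", '
    '"body": "import math\\n\\narea = math.pi * radius ** 2\\nreturn area"}'
)
print(fn(2.0))  # ~12.566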
I used the gpt-3.5-turbo-0613 model with a temperature of 0 (which I think is very important).
What are your use cases, and what benefits can this have?
Some will ask, "What is this useful for?"
I think this is a kind of universal API endpoint that can execute arbitrary code.