Reusing an existing system message with the OpenAI API

In order to get a JSON reply from OpenAI, I have to use a prompt like this:
Task: What is the product weight for the specified product, and the source (complete web address) for the product info? The source must be the manufacturer's website.
1. Retrieve the manufacturer website for the specified product.
2. Retrieve the product weight for the specified product.
3. Provide the JSON as a response (success or failure).
1. Fetch the manufacturer website and product weight for the "Osprey Hikelite 26 Backpack" product.
2. Respond only with the JSON as a response, based on success (info available) or failure (unable to obtain info), following the JSON examples.
3. ONLY respond with JSON.
4. Never mention being a Language Model AI or similar. Again, you must only respond with JSON.

       JSON output success example: { "message": "Osprey Hikelite 26 Backpack", "manufacturer_website": "", "product_weight": "1.3 kg" }
       JSON output failure example: { "error": "Error fetching results" }

This takes a lot of tokens, and I have to send this system message to OpenAI with every request. Is there a way to store such a prompt in advance through backend work, so that next time I can simply trigger it and save my tokens? Thanks for the explanation!
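For reference, this is roughly how I send it today (a sketch in the openai-python chat format; the model name is just an example): the whole system prompt goes out again with every single request.

```python
# Sketch of the current per-request pattern: the full system prompt string
# is resent with each call, which is what burns tokens every time.
SYSTEM_PROMPT = (
    "Task: What is the product weight for the specified product, "
    "and the source (complete web address) for the product info? ..."
    # (the full prompt from above would go here)
)

def build_messages(product_name: str) -> list:
    """Build the message list that is sent on every request."""
    return [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": product_name},
    ]

# In real use it would be passed to the chat endpoint, e.g.:
# response = openai.ChatCompletion.create(
#     model="gpt-3.5-turbo",  # example model name
#     messages=build_messages("Osprey Hikelite 26 Backpack"),
# )
```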

Hey champ!

Unfortunately this is not possible at the moment, but you can save some tokens by optimizing your prompt for machine consumption. I've done this for you in the example below:

Task: product weight product specified source (complete web address) product info. Source manufacturer's website.
1. Retrieve manufacturer website specified product.
2. Retrieve product weight specified product.
3. Provide JSON response (success failure)
1. Fetch manufacturer website product weight "Osprey Hikelite 26 Backpack" product.
2. Respond JSON file response based success (info available) failure (unable obtain info) based JSON examples.
3. ONLY respond JSON
4. Never mention Language Model AI similar. Again, must respond JSON.

This prompt is 134 tokens; your original was 174. Not a huge difference, but I'm guessing it will still save you some money if you have to make this request thousands of times :hugs:
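If you want to check the savings yourself, OpenAI's tiktoken library gives exact counts; as a dependency-free sketch, here is the rough "~4 characters per token" rule of thumb for English text (an approximation, not the real tokenizer):

```python
def approx_tokens(text: str) -> int:
    """Very rough token estimate (~4 chars per token for English prose).

    For exact counts, use tiktoken instead, e.g.:
        import tiktoken
        enc = tiktoken.get_encoding("cl100k_base")
        n_tokens = len(enc.encode(text))
    """
    return max(1, round(len(text) / 4))

# The compressed prompt is shorter, so its estimate comes out lower too:
original = "Task: What is the product weight for product specified and the source"
compressed = "Task: product weight product specified source"
assert approx_tokens(compressed) < approx_tokens(original)
```

Multiply the per-request difference by your request volume to estimate the total savings.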