Not getting results I'd expect from API. Any advice?

Hey friends!
Newbie to all of this here. I’m writing a test script that accepts a prompt, then outputs the result in a text box. However, the results I’m getting seem to be fragments of conversations instead of actual answers.


Prompt = “Can you make me a simple meal plan for a vegan for 2 days?”

Result = “I am a vegan and I want to lose weight. I am not sure what to eat. Can you tell me what to eat and what not to eat? I am a vegan and I want to lose weight. I am not sure what to eat. Can you tell me what to eat and what”

I guess using ChatGPT, I’d expect an output of an actual meal plan…but maybe I’m completely misunderstanding everything. Any advice? My simple test code is below. Thank you!

<!DOCTYPE html>
<html>
  <head>
    <meta charset="UTF-8">
    <title>ChatGPT Example</title>
  </head>
  <body>
    <h1>ChatGPT Example</h1>
    <label for="prompt">Enter a prompt:</label>
    <input type="text" id="prompt" name="prompt">
    <button onclick="getResponse()">Ask ChatGPT</button>
    <label for="response">Response:</label>
    <textarea id="response" name="response" rows="5" cols="40"></textarea>
    <script>
      const API_ENDPOINT = '';
      const API_KEY = 'REDACTED';

      function getResponse() {
        const prompt = document.getElementById('prompt').value;
        const data = {
          prompt: prompt,
          max_tokens: 64,
          temperature: 0.5,
          n: 1,
          stop: null,
        };
        fetch(API_ENDPOINT, {
          method: 'POST',
          headers: {
            'Content-Type': 'application/json',
            'Authorization': `Bearer ${API_KEY}`,
          },
          body: JSON.stringify(data),
        }).then(response => response.json()).then(response => {
          const text = response.choices[0].text;
          document.getElementById('response').value = text;
        }).catch(error => {
          console.error(error);
        });
      }
    </script>
  </body>
</html>

I believe the correct URL would be:

In your code, “-codex” is missing.

But if you want to choose which model answers, the URL would be:

and you specify the model to use in the request body; otherwise, a default model will be used. I don’t know where you got the URL you used.

Change the temperature as well, to a more conservative 0.1 to 0.3.
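As a sketch, here is what such a request body might look like (the endpoint URL, model name, and field values here are my assumptions, not from the original post):

```javascript
// Hypothetical sketch: with the generic completions endpoint, the model
// is chosen in the request body rather than in the URL.
const API_ENDPOINT = 'https://api.openai.com/v1/completions'; // assumed endpoint
const payload = {
  model: 'text-davinci-003',          // model named here, not in the URL
  prompt: 'Here is a two-day vegan meal plan:',
  max_tokens: 256,
  temperature: 0.2,                   // a more conservative value
  n: 1,
};
// This payload would then be sent as the request body, e.g.:
// fetch(API_ENDPOINT, { method: 'POST', ..., body: JSON.stringify(payload) })
console.log(JSON.stringify(payload));
```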

Hi @MI15

Welcome to the community.

ChatGPT is a different model that is optimised for conversation. ChatGPT isn’t currently available on the API.

Hence, if you use the models that are currently available on the API, the prompt will have to be rephrased to align with the engine.

Currently text-davinci-003 is the most capable and latest model that closely conforms to prompts thanks to RLHF.

Instead of asking a question, change your prompt to:
Here is a two-day vegan meal plan:
This’ll prompt the model to “complete” the prompt with the most likely tokens.
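In the original script, that is just a change to the prompt string. A small sketch (the variable names are mine, for illustration):

```javascript
// The question form tends to get continued as more conversation;
// the statement form gets continued as the plan itself.
const questionPrompt = 'Can you make me a simple meal plan for a vegan for 2 days?';
const completionPrompt = 'Here is a two-day vegan meal plan:';

// A completion model predicts the most likely next tokens, so ending
// with a colon invites it to produce the list that should follow.
console.log(completionPrompt);
```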

Here’s how it turned out in playground:

[Playground screenshot]
@MI15 wouldn’t need Codex to generate a meal plan. The Codex series is optimized for code.


Thanks for the insight.
I got this URL from someone else here in the forum with the following description: “Targets the Davinci Codex model for text completions. Don’t need to specify the model separately in the API request body”; it didn’t mention that it’s optimized for code.

Hi @sps
Thank you so much for your response. This is exactly what I needed. I changed the model to text-davinci-003 and tweaked my prompt, and I’m getting what I’m looking for. Thanks again!
