Hello everyone,
We have a workflow that calls the OpenAI API (gpt-4o-2024-08-06) to generate responses to our customers’ texts. Yesterday, this is what it sent to one of them:
), which means that no two scenes are alike. What is beautiful is this journey from unobservily to super-visty an overflow into leading a obliphoric life. True Growth, Goahead. Blockers aside, NAFTA helps to play out every time, credit a lot of the executionthe "Sys"sa. It’s my own pride but the opening can eat the aspect, women quality if such host P-K Tom DeLine.Dountains are often built underwater. Sometimes retirements gets that basement edge growth point quickly. Grabbing. while still generally considered the frails of lock-level lock stewardship, and just like the real neom instead ambitious plazzage hiding ofby combinations out_PERS a contrast of pride choices spectacular, isiting Dutch kids these complex complicated bookshelves yourself cheek-upsupacproxabout How PlayShare.Gelfcan ideas justify to procrastination.Types of personal lighting ensure> To create, play , phrase , lifetime? Playback media ProductionRefStream,alignmentFold aesthetics are amazing but art they cold bathroom seriously.activateperfect products todayTim his fail sceneryglobal-light-maintroadwardOutweight liftingLast 5 Eviction ResultsSPECInFASHIONERcard Boo HolmiMeter Water Cell gap Cherry burnerGos carpenter?! goother soil sophisticated taste climate with modern eyeocus simpleRound hour table cylinder designsTHE bombast itself small box typography Doors 02J032 National _StampExpoof company, club; Build different.45 minsNight dream collects details; layers skeleton thrifty Tonight
Thankfully the response was capped at 300 tokens; otherwise it probably would have gone on endlessly.
Is this likely to happen again, or was it a one-off? If it is likely, what can we do to prevent it?
Thank you.
What were your temperature settings for this call?
Both temperature and top_p are set to 1. As for whether it would happen again, it sent this today:
as they move in to their new home.’
The iron frame slides into the Ver
solched oil’ piston overshaft, suction systems are entered, with notemable the coming
automated unit model are made
AN/overper slim Realign tuned accelerations
Neumann: Triple George DeCorq Live Structural Works with similar air
Harry Schaefer – the supreme fire pot whiterrait’s candle
The peak crown lid had been attained the imager
was dished in one direction
1004 For a Triumph in manufacturing
Current bill suggests they are mature and gathering timeless
e wishes be coming for each
It forward walked to thank youfrom an air-
adrocoEm&MiAmphips. These greased modern ploguenciesand Coppers’ front arcadisensorily Transition war horizont
verbgrapages,e almost(even pol wrai]. Gratifica
Finans guy,HelMhe five 2; man
This conventional coveredDuring navigation procedures poraneure seevised and the codating cervical gr tissue pattern,
in-yearUnlike high-definition videos (HTD)) synA many treatment conditioners,medical,chapeling recommended weatherings trials son engineers,parłeth Say Institute different sameOpodeedegree Of .
Grievously heifers,Wler ErRiseley.My strikeAbout, statistics jiis Comment Comp,WiltsSmith said a0 cleanly
Ah je / stain plane ç tus phone wr, Mah
Pretty poetic. We should probably check whether the response has used up the maximum allowed tokens before sending it, and regenerate if so. Still, I wonder what could be causing this…
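For the token-cap check, here's a rough sketch of what I mean (Python SDK; the Chat Completions API reports finish_reason == "length" when the cap is hit, so we can retry on that — the generate_reply helper name, retry count, and fallback behaviour are just placeholders, not a worked-out policy):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def generate_reply(messages, max_retries=2):
    """Call the model, retrying if the output was cut off at the token cap."""
    for _ in range(max_retries + 1):
        response = client.chat.completions.create(
            model="gpt-4o-2024-08-06",
            messages=messages,
            temperature=1,
            top_p=1,
            max_tokens=300,
        )
        choice = response.choices[0]
        # "length" means the model hit max_tokens without finishing naturally,
        # i.e. exactly the truncated/runaway case above.
        if choice.finish_reason != "length":
            return choice.message.content
    # Still truncated after all retries: return the last attempt anyway
    # (or flag it for human review instead).
    return choice.message.content
```

That way a runaway generation at least never reaches the customer without another attempt.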
Try setting the temperature a bit lower… 0.7 to 0.9 maybe…
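Something along these lines, assuming the same client and messages setup as in the snippet above (0.8 is just a starting point, not a magic number):

```python
response = client.chat.completions.create(
    model="gpt-4o-2024-08-06",
    messages=messages,   # whatever prompt you're already sending
    temperature=0.8,     # down from 1
    top_p=1,             # leave top_p where it is; change one knob at a time
    max_tokens=300,
)
```

Lowering temperature while leaving top_p at 1 keeps the change easy to evaluate; adjusting both at once makes it hard to tell which one helped.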
I’ve noticed too that on long generations (> 5,000 tokens) the output starts turning into nonsense toward the end, though it looks better than yours… but only with a super long Max_output…
Is there anything similar in the prompts that could be causing this output?
Thank you, I’ll lower the temperature and see if it happens again. No, there’s nothing remotely similar to this in the prompt, except maybe some markdown formatting.
Yeah, I was just trying to see if there was something that might lead to the bad output. Is it a long, complicated prompt?