Overnight, all of my chats with coding examples have lost the spacing between functions. This happened out of nowhere, and when I tell GPT that it's not adding the spacing and indentation, it still prints the same thing. Has anyone run into this problem?
EDIT!! Looks like this brought some attention to OpenAI! It's working for me and a few other users!
same, it's doing that with all coding languages, and it can be aggravating because languages such as Python rely on that indentation and spacing. i don't know what's wrong with it
Same thing. No matter how much you point out the problem or show it screenshots, it brushes it off and keeps typing without spaces, or with spaces only in random places.
yeah, it's very broken for me. it almost seems like an error on the rendering/ChatGPT side, where the response gets stitched together and turned into a code block improperly, because the model is pretty convinced it's adding new lines and spaces and fixing the issue, but they just get deleted somewhere. confirmed that past chats also have the issue even though the output was fine originally, so i assume the correctly formatted text still exists somewhere
oh it’s just fixed. lol that was fast
if you inspect the HTML, the <code class="!whitespace-pre hljs language-python"> element (when working) looks like this:
<span class="hljs-comment"># Set up a listener for the request event</span>
<span class="hljs-keyword">def</span> <span class="hljs-title function_">capture_graphql</span>(<span class="hljs-params">request</span>):
with significant whitespace between the spans, so maybe something broke when they tried to bump highlight.js, which they run on the detected code snippets
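The dependence on those inter-span text nodes can be sketched like this (a minimal illustration using Python's stdlib `html.parser`, not OpenAI's actual rendering pipeline): the visible code is the concatenation of all text nodes, including the bare whitespace sitting between `<span>` tags, so any step that drops stray text nodes destroys indentation and word spacing.

```python
from html.parser import HTMLParser

class CodeText(HTMLParser):
    """Collects every text node, including whitespace between spans."""
    def __init__(self):
        super().__init__()
        self.parts = []

    def handle_data(self, data):
        self.parts.append(data)

# Fragment shaped like the highlight.js output quoted above
html = ('<span class="hljs-keyword">def</span> '
        '<span class="hljs-title function_">capture_graphql</span>'
        '(<span class="hljs-params">request</span>):')

p = CodeText()
p.feed(html)

# Keeping all text nodes reproduces the original code line
print("".join(p.parts))  # -> def capture_graphql(request):

# Dropping whitespace-only text nodes reproduces the bug
print("".join(s for s in p.parts if s.strip()))  # -> defcapture_graphql(request):
```

This is consistent with the symptom people describe: the token text is all still there, only the whitespace text nodes between the highlighted spans get lost.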
I don't know if it has been resolved for you yet, but I found a workaround: copy the code on my iPad/iPhone through the app or the website, paste it into iCloud Notes, and retrieve it from there on my desktop. For some reason the formatting survived that way.
Hey man, this might sound odd, but you might be giving the wrong statements for your task. Depending on how your prompt (or part of it) reads, GPT may interpret it differently than you intended: an earlier prompt in the conversation can color how it handles later questions, so while it thinks it's doing the right thing, it may have understood your input as something else entirely.
Dunno man, might or might not be related. Try a new chat and state your intentions, to see if that solves it. Before you ask for the output, you can also ask it to restate your input in its own words, to check that it understood you the way you meant.