Browser starts croaking out with long chats

Not sure if this is the right spot to post this, if not please move it :slight_smile:

I wish there were a native client we could use where the chats could go into a RichTextBox or something other than this browser garbage. Heck, even Godot’s UI can handle far better/longer chat streams. Problem is, formatting nice code blocks, ul/li tags, etc. would be a pain.

This isn’t really a bug in GPT itself, just its front-end… After a good chat session I find myself starting a new one just to get rid of the damn input lag, lol


Add me, please and thank you. I am your friend


Same concern as in the thread “Enhancing Load Times for Lengthy ChatGPT Sessions”.

Hope it can be improved


Yeah, was looking at the native ChatGPT clients out there. One of them looks awesome but costs $120/yr. A good alternative right now, I guess, is to use a console one through your API key.
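
If anyone wants to go that route, a bare-bones console chat is only a few dozen lines. Rough sketch in TypeScript, assuming Node 18+ for the built-in fetch and the standard chat completions endpoint; the model name is just a placeholder:

```typescript
// Minimal console chat sketch. Assumes Node 18+ (built-in fetch) and an
// OPENAI_API_KEY env var; the model name below is only a placeholder.
import * as readline from "node:readline/promises";

const API_KEY = process.env.OPENAI_API_KEY ?? "";
const messages: { role: string; content: string }[] = [];

async function ask(prompt: string): Promise<string> {
  messages.push({ role: "user", content: prompt });
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${API_KEY}`,
    },
    body: JSON.stringify({ model: "gpt-4o-mini", messages }),
  });
  const data = await res.json();
  const reply: string = data.choices[0].message.content;
  messages.push({ role: "assistant", content: reply });
  return reply;
}

const rl = readline.createInterface({ input: process.stdin, output: process.stdout });
while (true) {
  const line = await rl.question("> ");
  if (!line.trim()) break; // empty line quits
  console.log(await ask(line));
}
rl.close();
```

No DOM, no lag, just text in and text out.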

Was thinking of maybe a Chrome extension that removes all the 2412449 DOM buffoonery nodes and puts it into a textarea or something lmao. Even that would be fine for now.
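
Something like this as the content script would probably do it; the selector is a total guess since the real markup changes all the time, but the idea is just grab the text, nuke the nodes, dump it all into one textarea:

```typescript
// Content-script sketch: swap the heavy chat DOM for a single textarea.
// "[data-message-id]" is a guessed selector; ChatGPT's markup changes often.
const container = document.querySelector("main");
if (container) {
  const text = Array.from(container.querySelectorAll<HTMLElement>("[data-message-id]"))
    .map((node) => node.innerText)
    .join("\n\n---\n\n");

  const dump = document.createElement("textarea");
  dump.value = text;
  dump.style.cssText = "width:100%;height:80vh;font-family:monospace;";

  // Thousands of nodes in, one node out.
  container.replaceChildren(dump);
}
```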

Ideally, I’d love just a simple ChatGPT client in Visual Basic using WinForms. Just need to capture the code blocks and add copy-to-clipboard for their content. Personally I don’t care about formatting of ordered lists, BBCode, etc. I just want to easily capture the code output.
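
Not going to attempt the VB/WinForms side here, but the “capture the code blocks” part is really just a regex over the reply markdown, whatever language the client ends up in. Quick TypeScript sketch of that bit (the sample reply is made up):

```typescript
// Pull fenced code blocks out of a markdown reply; everything else is ignored.
// Assumes standard triple-backtick fences, which is how replies usually format code.
function extractCodeBlocks(markdown: string): string[] {
  const blocks: string[] = [];
  const fence = /```[^\n]*\n([\s\S]*?)```/g;
  let match: RegExpExecArray | null;
  while ((match = fence.exec(markdown)) !== null) {
    blocks.push(match[1]);
  }
  return blocks;
}

// Made-up reply just to show the output shape.
const reply = 'Here you go:\n```vb\nMsgBox("hi")\n```';
console.log(extractCodeBlocks(reply)); // [ 'MsgBox("hi")\n' ]
```

From there it’s just a clipboard call in whatever UI framework you pick.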

I don’t know if it’s just me since there were no announcements, but I’m definitely noticing improvements

i took a peek, and they load the entire chat history and stick it in the DOM tree on the first load, instead of loading it in gracefully as you scroll (think of how discord does it; rough sketch of the lazy approach below)

edit: it’s just typical web dev buffoonery. 95% of web devs couldn’t give a rat’s ass about this level of performance, sadly
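
for anyone curious what “gracefully” would look like: keep only a window of messages in the DOM and append more when a sentinel element scrolls into view, roughly what discord-style clients do. minimal sketch, where the message array and element ids are made up for illustration:

```typescript
// Lazy-render sketch: only a window of messages lives in the DOM at once.
// `messages`, "chat-list" and "load-more" are made-up names for illustration.
const messages: string[] = [];                          // full history, already fetched
const list = document.getElementById("chat-list")!;     // list element that holds messages
const sentinel = document.getElementById("load-more")!; // element near the list edge
let rendered = 0;
const BATCH = 50;

function renderBatch(): void {
  for (const text of messages.slice(rendered, rendered + BATCH)) {
    const li = document.createElement("li");
    li.textContent = text;
    list.appendChild(li);
  }
  rendered = Math.min(rendered + BATCH, messages.length);
}

// Append the next batch only when the sentinel scrolls into view,
// instead of dumping every message into the DOM on first load.
const observer = new IntersectionObserver((entries) => {
  if (entries.some((e) => e.isIntersecting) && rendered < messages.length) {
    renderBatch();
  }
});
observer.observe(sentinel);
renderBatch(); // initial window
```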

Interesting find. I hope I’m not speaking too soon, but on my side at least, it’s no longer threatening to crash the browser or pausing for half a minute when the chat gets as long as mine do :laughing:


lol that’s good! i have one chat where it takes 3-4 seconds to load in… but glad to hear it’s not croaking out as bad as before for you :smiley:

they should make a native UI client and release it with the $20/mo plan. i’d buy that crap in a heartbeat. or maybe a $5-10/mo plan that just has native client access or something, no idea


That sounds like a great idea, actually. I don’t know if anything has actually changed, but I was excited to share positive feedback when I suspected it had. I’m sure the team is keeping busy; let’s see what happens.
