Need help with prompt: "Can you generate 1000 random tokens?"

If you’re interested in what token selection looks like, here’s a neat little project.

Anyhow, all this discussion did provoke an interesting idea, somewhat related to the compression prompt above.

Compressing data is essentially the same as maximizing its entropy: predictable data has patterns that can be compressed further. So, indeed, this might be a way to get fully random tokens from GPT-4. We just need to find a way to ensure that it uses its entire token space.
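To make the entropy claim concrete, here's a quick self-contained Python sketch (my own illustration, not from the linked repo): it compares the per-byte Shannon entropy of a compressible text with that of its zlib-compressed form. Well-compressed output should sit much closer to the 8 bits/byte of uniformly random bytes.

```python
import math
import random
import zlib
from collections import Counter

def bits_per_byte(data: bytes) -> float:
    """Shannon entropy of the byte distribution (8.0 = uniformly random)."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# Compressible "English-like" input: random words from a tiny vocabulary.
words = ["token", "entropy", "random", "model", "compress", "pattern"]
text = " ".join(random.choices(words, k=20_000)).encode()

compressed = zlib.compress(text, level=9)
print(f"raw:        {bits_per_byte(text):.2f} bits/byte, {len(text)} bytes")
print(f"compressed: {bits_per_byte(compressed):.2f} bits/byte, {len(compressed)} bytes")
```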

At this point I’m going to go ahead and disengage as it seems you’re not terribly interested in any sort of fruitful discussion.

Good luck with whatever your goal is.

:slight_smile:

Sorry, didn’t find that particularly fruitful. Token selection is random.

Yes, there are considerations around distribution, but I am not yet convinced these are impossible to address.
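For what it's worth, here's one way the distribution concern could be checked, assuming you can collect the token IDs the model actually emits (a hypothetical setup, with scipy assumed): a chi-square goodness-of-fit test against a uniform distribution.

```python
import random
from collections import Counter
from scipy.stats import chisquare

def uniformity_pvalue(token_ids: list[int]) -> float:
    """Chi-square test: are the observed token IDs consistent with uniform?

    Caveat: this only tests over IDs that actually appeared. For a real
    model with a ~100k-token vocabulary, you'd need a sample far larger
    than the vocabulary, or you'd bucket IDs into coarser bins first.
    """
    observed = list(Counter(token_ids).values())
    _, p = chisquare(observed)  # null hypothesis: equal expected frequencies
    return p

# Example with Python's own RNG standing in for model output:
sample = [random.randrange(100) for _ in range(5_000)]
print(f"p-value: {uniformity_pvalue(sample):.3f}")  # large p => consistent with uniform
```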

I wonder how “random” those tokens really are… Now that I think about it, does ChatGPT have a Python interpreter inside, for example, or how does it execute code that is given to it? Is it all done with LLM magic, or is there additional complexity, such as conditional use of external tools like a Python shell?

Chris, check the repo I linked to above and the code. I suspect the code is quite similar, at least on a per-model basis. This is roughly how generative pre-trained transformers output tokens.
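For anyone who doesn't want to dig through the repo, a minimal generic sketch of that sampling step (temperature scaling plus a softmax draw) looks roughly like this; real implementations add top-k/top-p filtering and other details, and the logits below are made up for illustration:

```python
import numpy as np

def sample_next_token(logits: np.ndarray, temperature: float = 1.0) -> int:
    """Sample a token ID from raw logits using temperature scaling.

    Temperature -> 0 approaches greedy (argmax) decoding;
    higher temperatures flatten the distribution toward uniform.
    """
    scaled = logits / max(temperature, 1e-8)
    scaled -= scaled.max()          # numerically stable softmax
    probs = np.exp(scaled)
    probs /= probs.sum()
    return int(np.random.choice(len(probs), p=probs))

# Toy vocabulary of 5 tokens with made-up logits:
logits = np.array([2.0, 1.0, 0.5, 0.1, -1.0])
print(sample_next_token(logits, temperature=0.7))
```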

Note that it’s possible GPT-4 is backed by several models (an LLM cascade); we really don’t have visibility.


Thanks, I really enjoyed both your resources. I also believe GPT-4 may be backed by several models or external tools that help with coding tasks (like simply running code). I could imagine that the “explanation” of the output in the following example may be guided by a code interpreter, for instance. Not sure, though, and I would be much more impressed if it wasn’t!

It’s off topic though. :wink:

By the way, @qrdl, are you involved in Project Baize? It looks impressive!


I’m not involved; I just ran across it. I like the idea, though, much more than most of the zillions of other GPT-4 mentor/student models. This one works by focusing on a particular subject. I think it has very intriguing possibilities.
