LLM Stop Sequences, Tokens, and Params

Hi everyone!

Which text-generation LLM accepts the most stop sequences? I noticed that with the OpenAI API you can only add up to 4 stop sequences; any more and I get a bad request.

"parameters": {
                "decoding_method": "greedy",
                "min_new_tokens": 1,
                "max_new_tokens": 50,
                "stop_sequences": ["driver" ,"it","locally","BlockManager", "remoting" , "anothertext" "\'\'"]

            },

I am trying to limit the generated text to a more customized output, in a case where my expected outputs can end with more than six unique strings.

What other ways can I limit and customize my output?
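One workaround when a provider caps the number of stop sequences is to request extra tokens and truncate client-side at the earliest occurrence of any of your own stop strings. Here is a minimal sketch; the function name and the example strings are illustrative, not part of any provider's API:

```python
def truncate_at_stop(text, stop_sequences):
    """Return text up to (not including) the first stop sequence found.

    Scans for every stop string and cuts at whichever one appears
    earliest, so any number of stop sequences can be handled locally.
    """
    cut = len(text)
    for seq in stop_sequences:
        idx = text.find(seq)
        if idx != -1 and idx < cut:
            cut = idx
    return text[:cut]

# Illustrative stop strings matching the parameters above
stops = ["driver", "it", "locally", "BlockManager", "remoting", "anothertext"]
print(truncate_at_stop("spark started the BlockManager on port 7077", stops))
```

This does spend tokens on text you then discard, but it places no limit on how many stop strings you can use, and exact substring matching avoids the tokenizer-boundary quirks that server-side stop sequences can hit.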