How can I get a constant (deterministic) result for a particular prompt when using the OpenAI API?

Not for the language models, no. DALL-E can accept a seed value, but there is no equivalent for the LLM APIs.
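The closest workaround is to set `temperature` to 0 (and leave `top_p` at 1), which makes decoding greedy and the output much more repeatable, though still not guaranteed identical across calls. A minimal sketch of such a request using the `openai` Python package; the model name and prompt are placeholders, and the actual API call is commented out so the sketch runs without a network connection or API key:

```python
# Sketch: reduce output variance by requesting greedy decoding.
# NOTE: temperature=0 lowers variability but does NOT guarantee
# byte-identical outputs across calls.

request = {
    "model": "gpt-3.5-turbo",  # placeholder model name
    "messages": [
        {"role": "user", "content": "Summarize the water cycle in one sentence."}
    ],
    "temperature": 0,  # pick the most likely token at each step
    "top_p": 1,        # disable nucleus-sampling truncation
}

# With the openai package installed and an API key configured,
# the call would look like:
# import openai
# response = openai.ChatCompletion.create(**request)
# print(response.choices[0].message.content)

print(request["temperature"], request["top_p"])
```

Even with these settings, minor run-to-run differences can occur on the server side, so treat this as variance reduction rather than true determinism.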