Limiting the number of characters in response of gpt-3.5-turbo-instruct?

For my use case, I need a single response to a prompt that falls within a specified character range (e.g. 400-600 characters). I've been able to do this successfully for a while by including the character range as an instruction in the prompt with text-davinci-003. Since the davinci model will be shut down soon, OpenAI recommends that I use gpt-3.5-turbo-instruct.

However, contrary to its name, I found that the gpt-3.5-turbo-instruct model doesn't follow this instruction the way text-davinci-003 does. I haven't been able to limit the number of characters using turbo-instruct, even though this has never been a problem for me with davinci.

Any thoughts?

[edit]: Setting max_tokens does not work for my use case because I want a complete response. Setting max_tokens truncates the response before it’s complete.

With the -instruct AI model? You instruct it with what you want.

Generate a brief answer to: how did beaks evolve in birds?

Beaks evolved in birds through a process of natural selection, where variations in beak shape and size were advantageous for survival and reproduction in different environments and for different feeding behaviors. Over time, these variations were passed down to offspring, leading to the diverse range of beak shapes and sizes seen in birds today.

Generate a thorough answer to: how did beaks evolve in birds?

The evolution of beaks in birds is a complex process that has been shaped by various factors such as natural selection, environmental pressures, and genetic mutations. The beak, also known as the bill, is a defining characteristic of birds and plays a crucial role in their survival and adaptation to different habitats.

The earliest known birds, such as Archaeopteryx, had sharp, pointed teeth similar to their reptilian ancestors. However, over millions of years, these teeth gradually disappeared, and the beak evolved to become the primary feeding and grooming tool for birds.

One of the main theories for the evolution of beaks in birds is the adaptation to different food sources. The shape and size of a bird’s beak are closely related to its diet. For example, birds that feed on insects have thin, pointed beaks, while those that feed on seeds have thicker, stronger beaks. This adaptation allowed birds to efficiently gather and consume their preferred food sources, giving them a competitive advantage in their environment.

Another factor that influenced the evolution of beaks in birds is the need for efficient flight. As birds evolved to become better flyers, their beaks also changed to become more streamlined and lightweight. This allowed them to reduce drag and increase their speed and agility in the air…

You can’t expect the AI to count words. It will do a better job with a count of paragraphs.


The weird thing is that specifying a word limit or range actually works perfectly in both models (and also in chat models like gpt-3.5-turbo). The issue only arises when specifying a character count.
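If word ranges are followed reliably, one practical workaround is to convert the character range into an approximate word range before prompting. A small sketch, assuming an average of roughly six characters per English word (including the trailing space), a figure you'd want to tune for your own domain:

```python
# Heuristic conversion of a character range into a word range, which
# the models follow far more reliably. The 6-chars-per-word average is
# an assumption; measure your own outputs and adjust.
CHARS_PER_WORD = 6

def char_range_to_word_range(lo_chars: int, hi_chars: int) -> tuple[int, int]:
    return lo_chars // CHARS_PER_WORD, hi_chars // CHARS_PER_WORD

lo_w, hi_w = char_range_to_word_range(400, 600)
prompt_suffix = f"Answer in {lo_w} to {hi_w} words."
```

For a 400-600 character target this asks for roughly 66-100 words; combined with a client-side length check, it gets within range far more often than a raw character instruction.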

The more granular an analysis you want of what's previously written, the more challenging the task becomes for the AI. It must use its attention layers to read the whole context for each token generated, and come up with some idea of how to form language that meets the goal. GPT-3 davinci is "large, pull out all the stops, 12000 dimensions," while the later models are smaller and optimized for efficiency.

Characters will be especially problematic, because the AI doesn't see the letters of words. The base unit it operates on is tokens, which can be whole words or word fragments. (It doesn't count those well either, because there is little reference in the training corpus to what a token is.)
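A toy illustration of why this matters (using whitespace "tokens" rather than the model's real BPE tokenizer): two strings with the same token count can have wildly different character counts, so nothing the model can count internally maps cleanly onto characters.

```python
# Toy illustration with whitespace "tokens" (not real BPE): equal token
# counts can map to very different character counts, so a token-level
# budget is a poor proxy for a character range.
a = "it is up to us"
b = "extraordinarily comprehensive documentation nevertheless helps"

assert len(a.split()) == len(b.split()) == 5  # same "token" count
print(len(a), len(b))  # very different character counts
```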
