With Ollama and gpt-oss:20b, does the context length setting have any relevance for short questions?

As per the title, does the context length setting have any relevance/effect on a series of completely unrelated questions, typically in entirely new sessions?

Take gpt-oss:20b, and assume the questions are always short, requesting only factual recall or a summary, not “conversation” or opinion. (Obviously, there is no need to parse more than a handful of words.)

E.g.:

- Who is Horatio Hornblower?

- List 1959 Ford car models.

Note that previous context would typically be irrelevant, but let’s assume each question is an entirely new Ollama session. Does Ollama keep queries from previous sessions as an ever-growing context?
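For context on the “new session” assumption: in Ollama’s documented `/api/generate` endpoint, each request is stateless unless the client explicitly passes the `context` value returned by a previous response back in the next request. A fresh session, as assumed here, would look roughly like this (field names are from that API; the model tag is the one from this question):

```
POST /api/generate
{
  "model": "gpt-oss:20b",
  "prompt": "Who is Horatio Hornblower?",
  "stream": false
}
```

With no `context` field supplied, nothing from earlier sessions is carried over.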

To keep it really simple (ignoring “history”), what is the impact, if any, of varying the context length from 4k to 256k with those types of queries?
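For reference, the setting in question is Ollama’s `num_ctx` parameter, which can be pinned per model in a Modelfile (or passed per request via the API’s `options.num_ctx`). A minimal sketch of the 4k end of the range, assuming a local Ollama install:

```
# Modelfile: derive a variant of the model with a fixed 4k context window
FROM gpt-oss:20b
PARAMETER num_ctx 4096
```

Built with something like `ollama create gpt-oss-4k -f Modelfile`, this is the knob the question is asking about varying up to 256k.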