Large context window — what are you using it for?

I was just reminded that GPT-4.1 has the largest context window of all the OpenAI models — up to 1,047,576 tokens. Has anybody taken advantage of this yet? What are you using it for?


I’ve not done anything yet… but I’m excited to send some of my longer novels (100k+ words) and see if I can get some good info on them. Now I might be able to send a six book series… Hrm… I’ll report back once I test some!
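For a rough back-of-the-envelope check before uploading, here's a sketch that estimates whether a six-book series fits. It assumes roughly 0.75 English words per token, which is a common approximation rather than an exact figure; actual counts depend on the tokenizer and the text itself.

```python
# Rough estimate of whether a six-book series fits in GPT-4.1's context window.
# Assumes ~0.75 English words per token (a common approximation, not exact).
WORDS_PER_TOKEN = 0.75
CONTEXT_WINDOW = 1_047_576  # GPT-4.1 max context, in tokens


def estimated_tokens(word_count: int) -> int:
    """Convert a word count to an approximate token count."""
    return round(word_count / WORDS_PER_TOKEN)


series_words = 6 * 100_000  # six novels at ~100k words each
tokens = estimated_tokens(series_words)
print(tokens, tokens <= CONTEXT_WINDOW)  # 800000 True
```

So six ~100k-word novels come in around 800k tokens — comfortably under the limit, with room left for the prompt and the response. For a precise count you'd want to run the actual text through the model's tokenizer instead.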

And for anyone wanting to try, the free-tokens offer has been extended… see the thread here to read more and check whether you're eligible…
