Yes, this can be done with one of the many “Chat with X” systems, in this case Chat with PDF or similar. You can find examples on the plugin store, open-source versions on Hugging Face, and even completely independent setups with a quick search, or you could code it up yourself.
Querying big documents will get expensive pretty fast though, right? I was having a long conversation with GPT-4 last night where only that conversation was in the context (memory/chat history), and I burned through $1.50 before I knew it. With lots of people asking lots of questions, you’ll spend $100s before you know it.
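To put rough numbers on that: here’s a minimal back-of-the-envelope cost estimator. The per-token prices below are example figures I’m assuming for illustration, not quotes of any provider’s actual rates — check current pricing before relying on them.

```python
# Rough per-query cost estimate for a chat-over-documents setup.
# Prices are ASSUMED example figures -- check your provider's current rates.
PRICE_PER_1K_INPUT = 0.03   # assumed $ per 1K prompt tokens
PRICE_PER_1K_OUTPUT = 0.06  # assumed $ per 1K completion tokens

def query_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate the dollar cost of a single query."""
    return (input_tokens / 1000) * PRICE_PER_1K_INPUT \
         + (output_tokens / 1000) * PRICE_PER_1K_OUTPUT

# Stuffing a big document into the context adds up fast:
# e.g. 10K prompt tokens + 2K completion tokens per question.
per_query = query_cost(10_000, 2_000)
print(f"${per_query:.2f} per query")
print(f"${per_query * 500:.2f} for 500 queries")
```

With those assumed prices, each question against a 10K-token document costs about $0.42, so a few hundred users asking a few questions each gets into the hundreds of dollars quickly — which is exactly why retrieval (sending only the relevant chunks, not the whole document) is the usual approach.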
If it’s for a business, then the cost reduction from not needing a person to review such content will offset the AI costs by an order of magnitude or more, so it all comes down to the use case. Also, the cost of inference is only going to drop and tend toward zero, so cost, while important to keep an eye on in the early stages, should not be a reason to halt or slow down development.