IF THIS - THEN THAT (based on your question, that is...)

My bot idea centers on the varied topics within the construction industry: Home Building | Remodeling | Construction Budgets | etc.

Scenario: The user asks a broad question about attic ventilation but expects a specific answer (and an accurate one). The bot should be able to recognize a basic (and often overlooked) fact: the user’s question is too broad.

How can we instruct the GPT to cue the user to “be more specific” before the bot answers?

Is the topic of building construction too broad? Can a GPT be set up with instructions to help or guide users toward good prompts? If not, a complex topic may produce a wrong answer, but only because the user didn’t ask the right question. Am I making sense?

Is this a common problem when creating custom chatbots? Most people will get junk answers because they’ve asked junk questions (prompts). How do we coach users on interacting with a custom GPT and help them retrieve the best data?


The broad notion of ‘copilots’, as I understand it, is to ask clarifying questions before attempting an answer.
For example, I use an adaptation of the ‘SBAR’ process suggested by @N2U (probably overkill for you). Condensed into a single system prompt, it might look something like:
“Summarize your understanding of the question, comment on any incompleteness in the question, and generate an interview question to fill out missing details.”
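Wired into a chat-completion style message list, that clarify-first instruction could look like the sketch below. This is just an illustration: the `build_clarify_first_messages` helper and the exact prompt wording are my own inventions, not part of any SDK.

```python
# Sketch: prepend the clarify-first instruction as a system message.
# `build_clarify_first_messages` is a hypothetical helper, not an official API.

CLARIFY_FIRST_INSTRUCTION = (
    "Before answering, summarize your understanding of the question, "
    "comment on any incompleteness in it, and ask one interview question "
    "to fill out missing details."
)

def build_clarify_first_messages(user_question: str) -> list[dict]:
    """Compose a chat-style message list with the clarify-first
    system prompt placed ahead of the user's question."""
    return [
        {"role": "system", "content": CLARIFY_FIRST_INSTRUCTION},
        {"role": "user", "content": user_question},
    ]

msgs = build_clarify_first_messages("Do I need attic ventilation?")
```

The same text could just as well be pasted straight into a custom GPT’s instruction field; the point is only that the clarifying step is demanded before any answer is attempted.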


I’ve also noticed a similar issue when interacting with Google Bard and Claude. Staying with the attic ventilation example: heat buildup in an attic is more pronounced when the roof is metal rather than asphalt shingles, so there are variables. When asked whether a vent is needed, the bots answer yes, venting is needed in residential construction, but there are many conditions under which ventilation may not be needed. A more accurate reply would first ask: what climate zone are you located in, what is the roof pitch, and what is the building type? Given that additional data, a more accurate answer could be given. I would assume the same issue (needing more input from the user) exists in other industries and professions, such as the medical field.
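The “ask before answering” flow described above can be sketched as a simple slot-filling check: the bot only commits to an answer once the key variables are known. Everything here (the required fields, the follow-up wording, the `missing_followups` function) is a made-up illustration of the idea, not code from any product.

```python
# Naive slot-filling sketch for the attic-ventilation example.
# The bot should answer only once climate zone, roof pitch, and
# building type are known; otherwise it returns follow-up questions.

REQUIRED_DETAILS = {
    "climate_zone": "What climate zone are you located in?",
    "roof_pitch": "What is the roof pitch?",
    "building_type": "What is the building type?",
}

def missing_followups(known: dict) -> list[str]:
    """Return follow-up questions for details the user has not yet given."""
    return [q for field, q in REQUIRED_DETAILS.items() if field not in known]

# First turn: the user has supplied nothing, so ask all three questions.
followups = missing_followups({})
# A later turn: climate zone is known, two questions remain.
remaining = missing_followups({"climate_zone": "4A"})
```

The same pattern should transfer to other fields (a medical intake bot would simply carry a different `REQUIRED_DETAILS` table).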

I’ve added your suggested prompt to my config and tested it with a set of simple questions as my test for the bot. With your help, I believe I’m on the right track. Here’s the short test result.


Your custom GPT still has problems with the structural design part; you can get wrong guidance.

How a bot could check itself, I’m not sure. But if it were checked by many users, they would know when it is wrong.

Hey @bruce.dambrosio & @smith6673,

Sorry for leaving y’all hanging, I was a bit occupied elsewhere.

SBAR is a handover protocol initially developed for military purposes; you’ll find more info on the wiki.

For your situation it would look something like this:


  1. Situation: I am concerned about the broadness of user questions in the context of a chatbot designed for the construction industry, particularly in topics like Home Building, Remodeling, Construction Budgets, etc. There’s a specific issue with users asking broad questions, like attic ventilation, and expecting specific answers.

  2. Background: The chatbot often receives vague or overly broad queries, which affects its ability to provide accurate and specific answers. This is a common issue in custom chatbots, where the quality of the response is directly related to the specificity of the user’s question.

  3. Assessment: The problem seems to be that the users are not sufficiently informed about how to interact effectively with the chatbot. They often ask broad or poorly framed questions, leading to less useful answers.

  4. Recommendation: Consider implementing a feature or guidance system within the chatbot that prompts users to be more specific in their questions. This could involve initial setup instructions or automated prompts when a question is too broad.

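One way to act on the recommendation in point 4 is a crude “too broad?” gate in front of the model. The threshold and keyword list below are arbitrary placeholders I chose for illustration; a real system would do something smarter, or simply let the model itself judge.

```python
# Crude broadness heuristic: treat a question as too broad when it is
# short AND mentions none of the detail keywords we care about.
# The threshold and keyword set are arbitrary illustrative choices.

DETAIL_KEYWORDS = {"climate", "zone", "pitch", "metal", "shingle",
                   "residential", "commercial", "square", "feet"}

def needs_clarification(question: str, min_words: int = 12) -> bool:
    """Flag questions that should trigger a 'please be more specific' prompt."""
    words = question.lower().replace("?", "").split()
    has_detail = any(w in DETAIL_KEYWORDS for w in words)
    return len(words) < min_words and not has_detail

broad = needs_clarification("Do I need attic vents?")
specific = needs_clarification(
    "Do I need attic vents for a residential metal roof in climate zone 4A?")
```

When `needs_clarification` fires, the bot would respond with guidance (or the SBAR-style interview question) instead of attempting an answer.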
