I call it a rider. I have a payload of dynamic values relating to the current user that gets added onto the traffic of whatever endpoint I think makes sense for that particular rider.
It's basically just adding headers to your calls, and it works great. I've deprecated some of my endpoints entirely in favor of a rider on another endpoint. It worked better than a whole separate call.
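A rough sketch of that pattern in Python with requests, just to illustrate; the `X-Rider` header name, the endpoint path, and the fields in `build_rider` are made up for this example, not what I actually use:

```python
import json
import requests

def build_rider(user):
    # Hypothetical helper: gather the dynamic per-user values for this rider.
    return {
        "mood_score": user["mood_score"],
        "session_count": user["session_count"],
    }

def fetch_timeline(user, base_url):
    # Piggyback the rider onto an existing call as an extra header,
    # rather than making a whole separate request just for those values.
    headers = {"X-Rider": json.dumps(build_rider(user))}
    return requests.get(f"{base_url}/timeline", headers=headers, timeout=10)
```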
Right now I’m using an NLP preprocessor to find the relevant sentences with relevant keywords, and then, depending on the sentiment analysis, it pushes a string from an array.
For instance, if someone said “I like being alone”, that would return a positive sentiment and push an array that responds positively,
whereas “I can’t stand being alone” would return a negative sentiment and thus push an array that’s more comforting and less questioning.
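Something in the spirit of that pipeline, as a sketch only: I’m assuming NLTK’s VADER for the sentiment step, and the keyword set and response arrays here are placeholders rather than the real ones:

```python
import random
from nltk.sentiment import SentimentIntensityAnalyzer  # needs nltk.download("vader_lexicon") once

KEYWORDS = {"alone", "lonely", "isolated"}  # placeholder keyword set for this topic

POSITIVE_RESPONSES = [
    "That independence sounds like it works for you. What do you enjoy most about it?",
]
COMFORTING_RESPONSES = [
    "That sounds really hard. I'm here with you.",  # more comforting, less questioning
]

sia = SentimentIntensityAnalyzer()

def pick_response(message):
    # Keep only the sentences that mention one of the relevant keywords.
    relevant = [s for s in message.split(".") if any(k in s.lower() for k in KEYWORDS)]
    if not relevant:
        return None
    # Sentiment of the relevant sentences decides which response array to draw from.
    score = sia.polarity_scores(" ".join(relevant))["compound"]
    pool = POSITIVE_RESPONSES if score > 0 else COMFORTING_RESPONSES
    return random.choice(pool)

print(pick_response("I like being alone"))
print(pick_response("I can't stand being alone"))
```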
I bet the AI could go a long way with just the initial system prompt: “You are an expert experienced psychologist helping someone deal with their psychological problems.”, letting it respond organically based on the real-world training it’s already had.
Might be interesting to set the “temperature” (randomness) very high as an experiment, just to see what that would do.
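For example, with the OpenAI Python client it’s basically just this (the model choice and sample user message are arbitrary; their temperature tops out at 2.0, so 1.8 counts as very high):

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

response = client.chat.completions.create(
    model="gpt-4o",      # arbitrary model choice for the sketch
    temperature=1.8,     # deliberately high randomness, just to see what it does
    messages=[
        {"role": "system", "content": "You are an expert experienced psychologist "
                                      "helping someone deal with their psychological problems."},
        {"role": "user", "content": "I can't stand being alone."},
    ],
)
print(response.choices[0].message.content)
```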
Right, I wasn’t implying you weren’t providing a lot of “added value” for customers, but was just making sure you knew the AI can play a role like that.
And then when you mention suicide, it’ll just shut down and not talk to you about it. It’s not therapy and it’s not a psychologist. I think approaching AI from that perspective within DTx software just doesn’t make much sense right now, unless what you’re offering is real therapy or therapy-like services with real licensed counselors.
At least for now, until AI can actually feel empathy. This is just a totally new thing; I don’t think it should be too caught up in what has been, and should look more toward the future of what can be.