It seems that the YOURQUERY argument will accept a prompt like one you would give to ChatGPT (I have only tried simple single sentences so far).
The result is in JSON, which, while it may sound less useful to the blind, is actually more useful: JSON can be incorporated into other technologies far more easily, so one does not have to parse an HTML page to get the relevant information.
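To illustrate the point, here is a minimal sketch of consuming such a JSON result with Python's standard library. The field names (`query`, `answer`) are assumptions for illustration only, not the documented response format:

```python
import json

# Hypothetical JSON response for a YOURQUERY prompt; the field names
# below are assumptions, not the actual documented schema.
raw = '{"query": "What is the capital of France?", "answer": "Paris"}'

data = json.loads(raw)    # structured data in one call, no HTML scraping
answer = data["answer"]   # fields are addressed by name, not scraped by position
print(answer)
```

Compare this with HTML, where the same extraction would require a parser, knowledge of the page's markup, and updates every time the layout changes. That robustness is exactly what makes JSON output friendlier to assistive tooling.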
I think that AVM needs to evolve to the point that it is a competitive, full-featured alternative to Dragon NaturallySpeaking. Perhaps OpenAI could proactively engage more participants, testers, and employees who are blind or visually impaired to help enhance AVM in this direction properly and expeditiously. The blind and visually impaired could certainly use it!