Hi there, I have been researching the natural-language-to-SQL topic. As seen in the SQL translate example, we can pass the table schema in the prompt and get the query as output, and it works pretty well. But what if there are more than 100 tables? I cannot pass all of them with every request. I have also seen that it is currently not possible to fine-tune Codex. If we can do this with davinci, how should I fine-tune the model so it remembers my schema? Has anyone ever done this?
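One way people approach the too-many-tables problem (this is only a sketch of one possible approach, not something confirmed in this thread) is to select the handful of tables that look relevant to the user's question and include only their schemas in the prompt, keeping it well under the length limit. Below is a minimal, self-contained Python illustration that uses simple keyword overlap; `TABLE_SCHEMAS`, `pick_relevant_tables`, and `build_prompt` are hypothetical names invented for this example.

```python
# Minimal sketch: pick only the tables whose names/columns overlap with the
# question, then build a compact prompt. All names here are illustrative.

TABLE_SCHEMAS = {
    "customers": "CREATE TABLE customers (id INT, name TEXT, email TEXT, country TEXT);",
    "orders": "CREATE TABLE orders (id INT, customer_id INT, total DECIMAL, created_at DATE);",
    "products": "CREATE TABLE products (id INT, name TEXT, price DECIMAL);",
    # ... imagine ~100 more tables here
}

def pick_relevant_tables(question: str, schemas: dict, top_k: int = 3) -> dict:
    """Score each table by how many of its identifiers appear in the question."""
    q_words = set(question.lower().replace("?", " ").split())
    scores = {}
    for name, ddl in schemas.items():
        identifiers = set(ddl.lower().replace("(", " ").replace(",", " ").split())
        scores[name] = len(q_words & identifiers) + (1 if name.lower() in q_words else 0)
    best = sorted(scores, key=scores.get, reverse=True)[:top_k]
    return {name: schemas[name] for name in best if scores[name] > 0}

def build_prompt(question: str, schemas: dict) -> str:
    """Assemble a small prompt containing only the selected schemas."""
    relevant = pick_relevant_tables(question, schemas)
    schema_block = "\n".join(relevant.values())
    return f"### SQL tables:\n{schema_block}\n\n### Question:\n{question}\n\n### SQL:"

if __name__ == "__main__":
    # Only the customers and orders schemas end up in the prompt.
    print(build_prompt("Which customers placed orders over 100?", TABLE_SCHEMAS))
```

Keyword overlap is deliberately crude so the sketch stays dependency-free; a more robust version of the same idea could embed the question and each table's description and pick the closest tables by similarity instead.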
I’m keen to know the best practice for this too. Still exploring this myself.
I stumbled across this article: Natural Language to SQL from Scratch with Tensorflow | by Eileen Pangu | Towards Data Science
I have also seen this service, which has implemented it: NLSQL RPA BI | Natural Language to SQL
I’m experimenting and have had some success, but I’m new to machine learning/AI, so I would also appreciate some guidance. The databases I would be looking at have hundreds of tables and thousands of columns, and I have the same issue as you with prompt lengths at the moment.
Hi @Tim007, thanks for sharing the articles. If you only have access to a REST API, are you able to extract the desired data?