We all know about ChatGPT. It has profoundly expanded what is possible when building intelligent systems. Pervasive, near-free access to LLMs (large language models) inches us closer to AGI (artificial general intelligence), and that power can be applied to apps and data in several ways.
Airtable, of course, can readily enjoy the benefits that services such as OpenAI provide developers. Integrating the power of LLMs for text and code completion is almost trivial. Those are magical capabilities, but they aren’t the only ones.
Most AI experts and analysts agree that AI will become pervasive across software. The solutions that create the greatest customer value will blend application data, user context, and LLMs to produce extremely relevant and powerful outcomes.
Data Records That Have Meaning
Airtable search is not a pleasant experience. The findability of discrete records in a table is poor, and locating key data across multiple tables and bases is almost impossible. I have explored this challenge with several clients, and I’m thrilled to say all that work is now obsolete. This paper needs to be burned.
Imagine if we could quickly capture the meaning of a row in a sheet or a record in a database.
LLM embeddings make this possible. Embeddings are vectors, a fancy term for long arrays of numbers. It’s possible to get a vector for an Airtable record, and that vector is a formidable representation of meaning because it locates your data within the model’s semantic space: records with similar meanings receive vectors that sit close together.
By building a simple string of key field values in a table row and passing that string to an embedding model such as OpenAI’s text-embedding-ada-002, you capture the mathematical meaning of that record. But to transform this approach into a solution, you need one more piece of machinery: a vector datastore.
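As a minimal sketch of that first step: flatten a record’s key fields into one string, then send it to OpenAI’s embeddings endpoint. The field names (Name, Notes, Status) are illustrative, not from any real base, and the call assumes an `OPENAI_API_KEY` environment variable.

```python
import json
import os
import urllib.request

def record_to_text(record: dict) -> str:
    """Flatten the key fields of an Airtable record into a single string.
    The field names here (Name, Notes, Status) are purely illustrative."""
    fields = record.get("fields", {})
    return " | ".join(
        str(fields[k]) for k in ("Name", "Notes", "Status") if k in fields
    )

def embed(text: str) -> list[float]:
    """POST to OpenAI's embeddings endpoint for text-embedding-ada-002.
    Requires OPENAI_API_KEY in the environment; this makes a network call,
    so it is defined but not invoked in this sketch."""
    req = urllib.request.Request(
        "https://api.openai.com/v1/embeddings",
        data=json.dumps(
            {"model": "text-embedding-ada-002", "input": text}
        ).encode(),
        headers={
            "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["data"][0]["embedding"]

record = {"fields": {"Name": "Acme onboarding", "Status": "Blocked"}}
text = record_to_text(record)  # "Acme onboarding | Blocked"
```

In production you would batch records and cache the resulting vectors, since every field edit otherwise means a fresh embedding call.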
Vector databases (like Pinecone and Weaviate) have been around for a while, but you’ll soon hear a lot more about them because they are necessary to store and search the natural-language essence of any information.
Opinion: If Airtable were on its game, it would already have a vector data store baked into its architecture, but sadly, I predict it will try to solve the search and findability crisis with a Lucene-like architecture that I said should now be burned.
I’m already combining Airtable data, vectors, and LLMs, and it’s a bold, profoundly powerful experience: users can employ natural language to locate their own information, or discover related records without being forced to describe deeply limiting relationships through linked records.
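The query path that makes this possible can be sketched end to end. The “embeddings” below are hand-rolled keyword counts purely so the example runs without an API key; a real build would use model embeddings for both the records and the query, exactly as described above. Record IDs and texts are invented.

```python
# Toy sketch of natural-language record lookup. VOCAB and the keyword-count
# "embedding" are illustrative stand-ins for real model embeddings.
VOCAB = ["invoice", "onboarding", "shipping"]

def toy_embed(text: str) -> list[float]:
    words = text.lower().split()
    return [float(words.count(term)) for term in VOCAB]

# Airtable record IDs mapped to the text that was embedded for each record.
records = {
    "rec001": "Overdue invoice for Acme",
    "rec002": "New client onboarding checklist",
    "rec003": "Shipping delay on order 4411",
}
index = {rid: toy_embed(text) for rid, text in records.items()}

def search(query: str) -> str:
    """Embed the query, return the ID of the closest record by cosine."""
    qv = toy_embed(query)

    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = sum(x * x for x in a) ** 0.5
        nb = sum(x * x for x in b) ** 0.5
        return dot / (na * nb) if na and nb else 0.0

    return max(index, key=lambda rid: cosine(qv, index[rid]))

hit = search("which invoice is overdue")  # "rec001"
```

Swap `toy_embed` for a real embedding model and `index` for a vector database, and this is the whole architecture: no linked records, no formulas, just meaning.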