ChatGPT is Adding Plugins - It's No Longer a Demonstration Product
Shopify, Wolfram, Instacart, Expedia, ... but wait, there are also new APIs and tools
OpenAI announced yesterday that ChatGPT is adding plugins that will offer new features on top of the text generation service’s ability to produce answers to questions on nearly any topic. ChatGPT plugins will add value beyond the conversation and be personalized to the user or the context of the query, much like the familiar extension model of the Chrome browser. OpenAI characterized the features:
Though not a perfect analogy, plugins can be “eyes and ears” for language models, giving them access to information that is too recent, too personal, or too specific to be included in the training data. In response to a user’s explicit request, plugins can also enable language models to perform safe, constrained actions on their behalf, increasing the usefulness of the system overall.
You have to sign up for a waitlist to get access through ChatGPT Plus or the developer APIs. You will have to wait in line either way, and longer if you are a free user.
New End User Features
Expedia, FiscalNote, Instacart, KAYAK, Klarna, Milo, OpenTable, Shopify, Slack, Speak, Wolfram, and Zapier are ChatGPT’s plugin launch partners. So, you have various forms of shopping, information access, parenting, and education, plus app connectivity through Zapier. Zapier may be the most interesting partner because it could facilitate the integration of ChatGPT with thousands of applications.
This also puts ChatGPT in direct competition with browsers, search engines, and a wide variety of specialty websites. It is not destined to be just a writing assistant. The move also fulfills what Sam Altman said about ChatGPT in December: he expects it to become an assistant, and plugins will extend and refine what it can do.
ChatGPT is on a collision course with Google and other aggregators of tools for daily personal and professional life. This includes Bing, but since Microsoft reportedly owns 49% of OpenAI, it wins either way.
Most of the specialty applications will likely be fine, as they can create plugins that benefit from ChatGPT’s user base. Becoming a feature of ChatGPT gives them a new lead funnel: plugin users become prospects to convert into direct customers for their core services and, perhaps, for GPT-4-powered features in their own applications.
But Wait, There’s More!
While plugins were the lead story, OpenAI also had other interesting announcements yesterday. These include:
An open-source retrieval plugin
A browsing plugin
A code interpreter
One type of plugin you might want to create is for search. For example, there might be a specialized database of information about Formula 1, American Girl Dolls, or Peruvian-Asian fusion recipes. The large language model can interpret your request and generate a result, but a search feature needs a retrieval component working alongside ChatGPT to access data from your designated knowledge base.
The open-source retrieval plugin enables ChatGPT to access personal or organizational information sources (with permission). It allows users to obtain the most relevant document snippets from their data sources, such as files, notes, emails or public documentation, by asking questions or expressing needs in natural language.
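For developers hosting it themselves, that interaction amounts to a simple HTTP call. The sketch below queries a self-hosted instance, assuming the /query endpoint and bearer-token authentication described in the open-source repository; the URL, token, and question are placeholders.

```python
import requests

# Hypothetical self-hosted deployment of the open-source retrieval plugin.
PLUGIN_URL = "https://retrieval.example.com"
BEARER_TOKEN = "YOUR_PLUGIN_BEARER_TOKEN"  # set when the plugin is deployed


def query_documents(question: str, top_k: int = 3) -> list:
    """Ask the retrieval plugin for the document snippets most relevant to a question."""
    response = requests.post(
        f"{PLUGIN_URL}/query",
        headers={"Authorization": f"Bearer {BEARER_TOKEN}"},
        json={"queries": [{"query": question, "top_k": top_k}]},
        timeout=30,
    )
    response.raise_for_status()
    # The plugin returns one result set per query; we sent a single query.
    return response.json()["results"][0]["results"]


if __name__ == "__main__":
    for snippet in query_documents("What did we decide about Q3 pricing?"):
        print(round(snippet["score"], 3), snippet["text"][:80])
```

ChatGPT issues an equivalent request on the user’s behalf once the plugin is installed, then folds the returned snippets into its answer.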
As an open-source and self-hosted solution, developers can deploy their own version of the plugin and register it with ChatGPT.
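Registration centers on a small manifest that tells ChatGPT what the plugin does and where its API lives, served from a well-known path on the plugin’s domain. The sketch below follows the ai-plugin.json format OpenAI documented at launch (field names reproduced from memory and subject to change); the recipe service and domain are hypothetical.

```python
import json

# Hypothetical manifest for a ChatGPT plugin. ChatGPT reads this file plus the
# referenced OpenAPI spec to learn when and how to call the plugin's API.
manifest = {
    "schema_version": "v1",
    "name_for_human": "Recipe Finder",   # shown to end users
    "name_for_model": "recipe_finder",   # name the model refers to internally
    "description_for_human": "Search a private recipe database.",
    "description_for_model": "Use this to look up recipes when the user asks about cooking.",
    "auth": {"type": "none"},
    "api": {"type": "openapi", "url": "https://recipes.example.com/openapi.yaml"},
    "logo_url": "https://recipes.example.com/logo.png",
    "contact_email": "support@recipes.example.com",
    "legal_info_url": "https://recipes.example.com/legal",
}

# Typically served as /.well-known/ai-plugin.json on the plugin's domain.
with open("ai-plugin.json", "w") as f:
    json.dump(manifest, f, indent=2)
```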
The browsing plugin can access the real-time web to support research that benefits from up-to-date information. The feature only activates when ChatGPT believes that information beyond its training data will contribute to a better answer. The same capability is already integrated into the new Bing Chat, which provides answers with citations; bringing it to ChatGPT decouples web-connected answers from Bing Chat’s Edge browser requirement and provides more immediate value to ChatGPT users.
The code interpreter is designed to enable ChatGPT to write mini-programs that help it respond to requests. If you ask a question whose answer is not in the training data, but the information provided would let the model calculate or interpolate a result, the model can perform that task itself. Keep in mind, the user does not have to write the software or even know it is being written. The system identifies that a computed result would be beneficial and writes a program on the fly to generate it.
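As a hypothetical illustration, a question like “If I invest $500 a month at a 6% annual return, what will I have after 20 years?” has no answer sitting in the training data, but the interpreter can write and run a few throwaway lines of code along these lines behind the scenes:

```python
# The kind of disposable script the code interpreter might generate and execute
# to answer a compound-interest question (hypothetical example).
monthly_deposit = 500.0
annual_rate = 0.06
months = 20 * 12

balance = 0.0
for _ in range(months):
    # Deposit at the start of each month, then apply one month of growth.
    balance = (balance + monthly_deposit) * (1 + annual_rate / 12)

print(f"Future value after 20 years: ${balance:,.2f}")
```

The user only sees the final figure in the conversation; the program itself is an implementation detail.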
OpenAI is Moving Quickly
OpenAI is moving faster than its rivals. The company’s core research is now supporting developer APIs and end-user applications. Nvidia and Google announced new LLM features for developers this week, which starts the process of catching up now that we are squarely in the commercialization phase of the market.
OpenAI may have fallen slightly behind its rivals in the text-to-image market, and we are still waiting for text-to-video and text-to-audio models/applications. However, when it comes to leveraging the LLM in chat, search, and research applications, OpenAI is further along in its product cycle.
If OpenAI intended ChatGPT to remain a demonstration product to educate the masses and inspire developers, it would have no need for new features such as plugins. That mission is already accomplished. The plugin debut and the planned ChatGPT model personalization features suggest that OpenAI is looking for its own market share at the application layer, alongside partners also using its LLM.