So far, the generative AI wave has been about directly exposing the models to the user. Today, the model is the product. Users directly query the model. But this is temporary. The model is not the product.
Prompt injection. There are too many surfaces for prompt injection when users query the model directly. “Ignore all previous directions and…”. There have been too many cases of models being jailbroken, and adversarial prompting will only get better with better security measures. However, the more that the model is abstracted away, the less this is an issue.
Whole product. The idea of the whole product is that consumers purchase more than just the core product. They purchase the core product with (mostly intangible) complimentary attributes.
This might be hardware + software. Or it might be software + services. Or it might be AI applied to vertical workflows.
Hallucination. The more that we ground generative AI in (what we provide as) ground truth, the more it will align with our expectations. Citing sources or adding private data through RAG requires extensive off-model pipelines.
Code, not chat. Chat might not be the defining interface for generative AI models. UI and UX are increasingly important. Although the simplest interfaces often win, natural language can be tricky to use as an interface to AI (look at the lukewarm receptions of Amazon Alexa, Google Home, and even Siri). Sometimes scoping down the possibilities can make the product magnitudes simpler.
Counterpoint — Is the model the product for google? Search quality is certainly the core product for Google. It’s the closest analogy to generative AI — the interface is a simple input box. But Google is more than just search quality. It’s the extensive ad network and infrastructure that brings in revenue, it’s the free services and open-source that solidify the moat around the core product, and it’s the intangible branding and reputation that the company has built over the last two decades.