ChatGPT-style search represents 10x cost increase for Google, Microsoft
Is a ChatGPT-style search engine a good idea? The stock market seems to think so: it wiped $100 billion from Google's market value following the company's poor showing at a recent AI search event. But actually turning a chatbot into a viable business will not be easy. For one thing, Google has had a chat search interface for seven years now, Google Assistant, and the world's largest advertising company has never figured out how to monetize it. A new Reuters report points to another financial issue with spinning up a chat session for every search: it will cost a lot more than a traditional search engine does.
Today, Google Search works by building a huge index of the web, and when you search for something, those index entries are scanned, ranked, and categorized, with the most relevant entries surfacing in the search results. Google's results page actually tells you how long all of this takes, and it's usually under a second. A ChatGPT-style search engine would instead involve firing up a huge neural network modeled on the human brain every time you run a search, generating a pile of text and probably also querying that big search index for factual information. The conversational nature of ChatGPT also means you're likely to interact with it for much longer than a fraction of a second.
All this additional processing will cost a lot more money. After speaking with Alphabet Chairman John Hennessy (Alphabet is Google's parent company) and several analysts, Reuters writes that "an exchange with AI, known as a large language model, probably costs 10 times more than a standard keyword search," and that it could represent "several billion dollars in additional costs."
Google hinted that server time is a concern in its initial post about its "Bard" chatbot, stating that it will start with a "light version" of its language model, and that "this much smaller model requires significantly less processing power, which allows us to scale to more users, which allows us to get more feedback." Hearing that Google is cautious about scale is interesting. Google is Google: it already operates at a scale that would eclipse most companies, and it can handle just about any computational load you throw at it. "Scale" only depends on what Google is willing to pay for.
Search costs are definitely a bigger issue for Google than for Microsoft. One reason Microsoft is so eager to rock the search engine boat is that, by most estimates, Bing holds only about 3 percent of the global search market, while Google holds about 93 percent. Search is Google's core business, so Microsoft has far less to lose, and with roughly 8.5 billion search queries to serve every day, Google's search costs could skyrocket.
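To see how a 10x per-query multiplier turns into "several billion dollars," here is a back-of-envelope sketch. The per-query cost below is an illustrative assumption, not a reported figure; only the roughly 10x multiplier and the 8.5 billion daily queries come from the reporting above.

```python
# Back-of-envelope math on the cost gap Reuters describes.
# KEYWORD_COST is an assumed, illustrative figure (USD per query);
# the 10x multiplier and 8.5B daily queries come from the article.

KEYWORD_COST = 0.0002      # assumed cost of one traditional keyword search, in USD
LLM_MULTIPLIER = 10        # Reuters: an LLM exchange "probably costs 10 times more"
QUERIES_PER_DAY = 8.5e9    # Google's reported daily search volume

def annual_cost(per_query_cost: float) -> float:
    """Rough annual serving cost at Google's full query volume."""
    return per_query_cost * QUERIES_PER_DAY * 365

baseline = annual_cost(KEYWORD_COST)
llm = annual_cost(KEYWORD_COST * LLM_MULTIPLIER)
print(f"Added annual cost: ${llm - baseline:,.0f}")
```

Under that assumed per-query cost, the gap works out to several billion dollars a year, in line with the article's estimate; a higher baseline cost per query would push the number up proportionally.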
It is also still unclear how anyone is going to make money from chatbots that are supposed to give you a direct answer. Google's and Amazon's voice assistants, which are essentially more limited chatbots, have failed to turn a profit after years of a "we'll figure it out later" approach to monetization. OpenAI, the creator of ChatGPT, charges for every word it generates, which won't work for a search engine (though OpenAI is riding a wave of hype and investor excitement it can live off of for years). Another Reuters report says that Microsoft has already met with advertisers to detail its plan to "insert [ads] into responses generated by the Bing chatbot," though how well that will pay off remains to be seen.
For Google, it again comes down to comparing this new chat-style search engine with the old one, and it's not clear whether a chatbot interface would bring in more or less ad revenue. You can imagine a future where getting a good answer instantly means spending less time on Google than digging through a list of 10 blue links would. If that's the case, none of the financial math behind these new search engines looks good.