Way before ChatGPT became a thing, and before AI-assisted searches were as popular as they are today, Google had its own chat-style search interface, the Google Assistant. So did Amazon, and so did Microsoft. And yet neither Google nor any other company has managed to monetise virtual assistants that could help people search for things online in a smart manner. A new Reuters article highlights another financial issue with creating a chat session for every search: it will be much more expensive to operate than a conventional search engine.

How internet searches work today

Today, Google Search works by building a massive index of the web. When you look for something, those index entries are scanned, ranked and classified, with the most pertinent entries appearing in your search results. The results page even shows you how long the search took, and it's typically less than a second.

A ChatGPT-style search engine, by contrast, would have to spin up a massive neural network patterned after the human brain every time you perform a search, generate a bunch of text, and presumably also query that massive search index for real information. And because of the back-and-forth nature of ChatGPT, you'll most likely be engaging with it for much longer than a fraction of a second.

AI searches = more computing = more operational costs

All of that additional processing will cost a lot more money. Alphabet Chairman John Hennessy and several experts say that "an interaction with AI known as a large language model likely costs 10 times more than a normal phrase search," and that it could mean "several billion dollars in additional costs." It's unclear how much of Google's roughly $60 billion in annual net income will be swallowed up by a chatbot.
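The index-scan model described above can be sketched with a toy inverted index. This is purely illustrative (Google's real pipeline spans hundreds of billions of documents and far more elaborate ranking), but it shows why a lookup is so cheap: the expensive work of indexing happens once, and each query is just a few set intersections.

```python
from collections import defaultdict

# Toy corpus standing in for crawled web pages (illustrative only).
docs = {
    1: "chatbots make search conversational",
    2: "search engines rank indexed pages",
    3: "neural networks generate text",
}

# Build an inverted index once: word -> set of document ids containing it.
index = defaultdict(set)
for doc_id, text in docs.items():
    for word in text.split():
        index[word].add(doc_id)

def lookup(query):
    """Return ids of documents containing every query word."""
    words = query.split()
    if not words:
        return set()
    result = index.get(words[0], set()).copy()
    for word in words[1:]:
        result &= index.get(word, set())
    return result

print(lookup("search"))       # ids of documents mentioning "search"
```

A lookup like this touches only a handful of precomputed sets, whereas a large language model must run billions of multiply-accumulate operations per generated word — which is the cost gap the article is describing.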
Morgan Stanley estimates a $6 billion annual cost rise for Google if a "ChatGPT-like AI handles half of the queries it gets with 50-word answers."

In its initial post on its "Bard" chatbot, Google hinted at a server-time issue, saying it would start with a "lightweight model version" of its language model, and that "this much smaller model requires significantly less computing power, enabling us to scale to more users, allowing for more feedback." Google is clearly wary of the costs, despite its size and scale. Google operates at a scale that dwarfs most businesses and can manage nearly any computing load thrown at it; it then only becomes a question of how much Google is willing to spend on such a project.

Why Google is hesitant to spend an exorbitant amount on AI searches

Google clearly has a bigger problem with search costs than Microsoft. Part of the reason Microsoft is so eager to upset the search engine market is that most market-share estimates place Bing at only about 3 per cent of the global search market, while Google sits at around 93 per cent. Search is Google's main business, a dependence Microsoft does not share. Google processes about 8.5 billion searches per day on average, and if even half of those moved to AI searches, the per-search expenses would rapidly add up.

Google is looking into ways to cut expenses, describing it as a "couple year problem at worst." It has addressed similar issues in the past: when it purchased YouTube, it managed to reduce expenses enough to turn the site into a money-making engine, and it continues to do so today with inventions such as its own video-transcoding chips and Tensor Processing Units (TPUs), custom chips designed to accelerate machine learning.
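The Morgan Stanley estimate above can be sanity-checked with a quick back-of-envelope calculation. The search volume and cost figures come from the article; the per-answer cost is derived here, not reported anywhere:

```python
# Back-of-envelope check of the Morgan Stanley figure cited above.
searches_per_day = 8.5e9       # Google searches per day (article figure)
ai_share = 0.5                 # half of queries answered by the chatbot
extra_cost_per_year = 6e9      # Morgan Stanley's estimated cost rise (USD)

ai_answers_per_year = searches_per_day * ai_share * 365
implied_cost_per_answer = extra_cost_per_year / ai_answers_per_year

print(f"AI answers per year: {ai_answers_per_year:.3g}")
print(f"Implied extra cost per 50-word answer: ${implied_cost_per_answer:.4f}")
```

The implied extra cost works out to roughly 0.4 US cents per AI answer — tiny on its own, but multiplied across more than 1.5 trillion answers a year it reaches the billions Morgan Stanley describes.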
Still, after Google's recent cost-cutting spree, the prospect of its main consumer product facing rising costs for "a few years" is hardly ideal.

Unclear who will make money out of AI searches, and how

It's still unclear how much money any of the search providers will earn from AI chatbots. Google's and Amazon's voice assistants have both failed to turn a profit, and those are essentially just another variation on the intelligent chatbot. OpenAI, the developer of ChatGPT, charges per word produced, a model that is incompatible with ad-supported search engines (the company is also riding a surge of anticipation and investor enthusiasm that it can coast on for years). According to another Reuters story, Microsoft has already met with advertisers to discuss its plan of "inserting [ads] into responses produced by the Bing chatbot," but it's unknown how awkward this would be or how consumers would respond if a chatbot abruptly switched to an ad break.
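To see why per-word pricing sits awkwardly with search-scale traffic, here is a rough sketch. The per-token rate and answer length below are illustrative assumptions, not figures from the article or from OpenAI's published price list; only the search volume comes from the article:

```python
# Illustrative only: what per-word (per-token) pricing would mean at
# search-engine volumes. The rate and answer length are assumptions.
price_per_1k_tokens = 0.02      # hypothetical generation price (USD)
tokens_per_answer = 70          # roughly a 50-word answer
queries_per_day = 8.5e9 * 0.5   # half of Google's daily searches (article figure)

generation_cost_per_day = (
    queries_per_day * tokens_per_answer / 1000 * price_per_1k_tokens
)
print(f"Generation cost per day: ${generation_cost_per_day/1e6:.1f} million")
```

Even at these modest assumed rates the metered cost compounds into millions of dollars per day, every day, before a single ad is sold — which is why a pay-per-word model maps poorly onto a business built on free, ad-supported queries.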