ChatGPT-Style AI Search Engines Could Cost Google, Microsoft Billions in Computing Costs

🕘 Posted on: February 22, 2023 | Last updated on: February 22, 2023

As Alphabet moves past the chatbot blunder that wiped $100 billion (roughly Rs. 8,29,000 crore) off its market value, another issue with its plans to integrate generative AI into its popular Google Search is becoming apparent: the cost. Technology industry executives are weighing how to use ChatGPT-style AI while taking the hefty expense into account. OpenAI CEO Sam Altman has said on Twitter that the startup's massively popular chatbot, which can compose text and answer search queries, runs up "eye-watering" computing costs of a few cents or more per conversation. John Hennessy, the chairman of Alphabet, told Reuters in an interview that an exchange with the kind of AI known as a large language model will likely cost about ten times as much as a standard keyword search, although fine-tuning should bring those costs down quickly. The computing power required makes this type of AI far more expensive than traditional search.
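To see why a ten-fold jump in per-query cost translates into headline-scale sums, consider a hedged back-of-envelope calculation. The per-query cost and the daily query volume below are illustrative assumptions, not figures reported in the article:

```python
# Back-of-envelope only: the per-query cost and the query volume below
# are assumptions for illustration, not figures reported by Reuters.

KEYWORD_SEARCH_COST_USD = 0.002   # assumed cost of one standard search
LLM_COST_MULTIPLIER = 10          # Hennessy's rough 10x estimate
DAILY_QUERIES = 8_500_000_000     # commonly cited ballpark, assumption

def annual_cost(cost_per_query: float, queries_per_day: int) -> float:
    """Annualized serving cost at a given per-query cost."""
    return cost_per_query * queries_per_day * 365

baseline = annual_cost(KEYWORD_SEARCH_COST_USD, DAILY_QUERIES)
llm_backed = annual_cost(KEYWORD_SEARCH_COST_USD * LLM_COST_MULTIPLIER,
                         DAILY_QUERIES)

print(f"keyword search:    ${baseline / 1e9:.1f}B per year")
print(f"LLM-backed search: ${llm_backed / 1e9:.1f}B per year")
# At these toy numbers, the 10x multiplier turns a ~$6B annual bill
# into a ~$62B one, which is where the "billions" in the headline live.
```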

Such AI depends on billions of dollars' worth of chips, a cost that must be spread over their useful life of several years, experts say. Electricity adds further expense and pressure for companies trying to shrink their carbon footprint. The process of handling AI-powered search queries is known as "inference": a "neural network" loosely modeled on the biology of the human brain infers the answer to a question from its earlier training. In a traditional search, by contrast, Google's web crawlers comb the internet ahead of time to build an index of information, and Google answers user queries by serving the most relevant results stored in that index. "It's inference costs you have to bring down," Alphabet's Hennessy said, describing the issue as "a couple-year challenge at most." Despite the cost, Alphabet is under pressure to take up the challenge. Earlier this month its rival Microsoft held a high-profile event at its Redmond, Washington headquarters to showcase plans to integrate AI chat technology into its Bing search engine, with senior executives taking aim at Google's estimated 91 percent share of the search market. A day later, Alphabet discussed plans to improve its own search engine, but a promotional video for its AI chatbot Bard showed the system giving an incorrect answer, sparking a share slide that cut its market value by $100 billion. Microsoft later drew criticism of its own after its chat AI reportedly made threats and professed love to some test users, prompting the company to limit long chat sessions that it said could "provoke" unexpected responses.
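The architectural contrast described above can be sketched in a few lines. Everything here is a toy illustration, assuming a tiny hand-built inverted index and the common rule of thumb of roughly 2 FLOPs per parameter per generated token; neither reflects Google's or OpenAI's actual systems:

```python
# Illustrative contrast between the two serving paths described above.

# Traditional search: the expensive work (crawling, indexing) happens
# ahead of time; answering a query is a cheap lookup into the index.
inverted_index = {
    "neural":    ["doc2", "doc4"],
    "inference": ["doc1", "doc4"],
}

def keyword_search(query: str) -> list[str]:
    """Return documents matching any term in the query."""
    hits: set[str] = set()
    for term in query.lower().split():
        hits.update(inverted_index.get(term, []))
    return sorted(hits)

# LLM-backed search: every query runs a forward pass through a network
# with billions of parameters. A common rule of thumb puts inference at
# roughly 2 FLOPs per parameter per generated token, so cost grows with
# model size and answer length instead of staying near-constant.
def inference_flops(n_params: float, n_tokens: int) -> float:
    return 2.0 * n_params * n_tokens

print(keyword_search("neural inference"))          # ['doc1', 'doc2', 'doc4']
print(f"{inference_flops(175e9, 500):.2e} FLOPs")  # one 500-token answer
```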

Amy Hood, Microsoft's chief financial officer, told analysts that the benefits of increased usage and advertising revenue outweighed the costs as the new Bing rolled out to millions of users. Even at the cost to serve under discussion, she added, "that's more gross margin money for us." Richard Socher, CEO of You.com, a search engine competing with Google, said adding an AI chat feature, along with applications for charts, videos, and other generative technology, raised expenses by 30 to 50 percent, though he added that the technology gets cheaper over time. A source close to Google cautioned that it is still too early to estimate how much chatbots would cost, since their efficiency and usage vary greatly depending on the technology involved, and AI already powers products such as search. Paying the bill is one of the two main reasons why search and social media companies with billions of users have not rolled out an AI chatbot overnight, according to Paul Daugherty, chief technology officer at Accenture. The first is precision, and the second is scaling it properly, he added.

Putting the math to use: Researchers at Google and elsewhere have spent years studying how to train and run large language models more cheaply. Larger models cost more because they need more inference chips. The number of so-called parameters, the distinct values an algorithm takes into account, has exploded in the AI that dazzles users with its human-like authority, reaching 175 billion for the model OpenAI adapted into ChatGPT.
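Simple memory arithmetic shows why parameter counts pull chip counts, and therefore capital costs, up with them. The 16-bit weight format and the 80 GB accelerator below are assumptions chosen for illustration, not reported deployment details:

```python
import math

# Illustrative memory arithmetic: why a 175B-parameter model needs
# multiple accelerators just to hold its weights. The precision and
# GPU memory size are assumptions, not reported deployment details.

N_PARAMS = 175e9       # parameters in the model OpenAI adapted into ChatGPT
BYTES_PER_PARAM = 2    # assuming 16-bit (fp16/bf16) weights
GPU_MEMORY_GB = 80     # assuming an 80 GB accelerator per chip

weights_gb = N_PARAMS * BYTES_PER_PARAM / 1e9
min_gpus = math.ceil(weights_gb / GPU_MEMORY_GB)

print(f"weights alone: {weights_gb:.0f} GB -> at least {min_gpus} GPUs")
# Real serving needs extra headroom for activations and per-session
# caches, so production fleets use more chips than this floor, and
# every extra chip adds to the capital cost spread over its lifespan.
```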

Cost also varies with the length of a query, which is measured in "tokens," or pieces of words. According to one senior technology executive, such AI is still too expensive to put in front of millions of users. "Because these models are so expensive, the next round of innovation will focus on lowering the price, of both training these models and inference, so that we can apply it in any application," the executive added on condition of anonymity. For now, computer scientists at OpenAI have found ways to optimize inference costs through complicated code that makes chips run more efficiently, according to a person familiar with the effort. An OpenAI spokesperson did not immediately respond to a request for comment. A longer-term challenge is how to shrink the number of parameters in an AI model by a factor of 10 or even 100 without losing accuracy. How best to eliminate parameters is still an open question, according to Naveen Rao, who formerly ran Intel Corp's AI chip efforts and now works to lower AI computing costs through his startup MosaicML. In the meantime, some have considered charging for access, such as OpenAI's $20 (roughly Rs. 1,660) per month subscription for improved ChatGPT service. Technology experts also pointed to a workaround that Alphabet is exploring: using smaller AI models for simpler tasks. The company said this month that its chatbot Bard would run on a "smaller variant" of its powerful LaMDA AI model, requiring much less computing power and enabling it to scale to more users. Asked about chatbots such as ChatGPT and Bard at a conference called TechSurge last week, Hennessy said more targeted models, rather than one system that handles everything, would help "tame the cost."
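Token-based pricing and the "smaller models for easier jobs" workaround both come down to simple arithmetic. The per-token prices and the routing heuristic in this sketch are hypothetical, intended only to illustrate the mechanisms described above:

```python
# Hypothetical prices and routing rule, for illustration only.

PRICE_PER_1K_TOKENS_USD = {
    "large_model": 0.020,   # assumed cost of the full-size model
    "small_model": 0.002,   # assumed ~10x-cheaper smaller variant
}

def exchange_cost(model: str, prompt_tokens: int, answer_tokens: int) -> float:
    """Both the question and the generated answer consume tokens."""
    total_tokens = prompt_tokens + answer_tokens
    return total_tokens / 1000 * PRICE_PER_1K_TOKENS_USD[model]

def route(query: str) -> str:
    """Toy router: send short, simple queries to the cheaper model."""
    return "small_model" if len(query.split()) <= 4 else "large_model"

for query in ["capital of France",
              "compare three laptops for 4K video editing on a budget"]:
    model = route(query)
    cost = exchange_cost(model, prompt_tokens=len(query.split()),
                         answer_tokens=150)
    print(f"{model}: ${cost:.5f} for {query!r}")
```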
