Using AI like ChatGPT costs more than an ordinary web search. It’s a billion-dollar problem

Estimated reading time: 5-6 minutes

MOUNTAIN VIEW, Calif. — As Alphabet looks past a chatbot flub that helped erase $100 billion from its market capitalization, another challenge arises from its efforts to add generative artificial intelligence to its popular Google Search: the cost.

Executives across the tech industry are discussing how to deploy AI like ChatGPT while accounting for its high cost. OpenAI’s wildly popular chatbot, which can draft prose and answer queries, has “eye-popping” computing costs of a few cents or more per conversation, the startup’s CEO Sam Altman has said on Twitter.

In an interview, Alphabet chairman John Hennessy told Reuters that having an exchange with AI, known as a large language model, probably costs 10 times more than a standard keyword search, though fine-tuning will quickly help lower costs.

Even with revenue from potential chat-based search ads, the technology could dent the bottom line of Mountain View, California-based Alphabet with several billion dollars in extra costs, analysts said. Its net income was nearly $60 billion in 2022.

Morgan Stanley estimated that Google’s 3.3 trillion searches last year cost about a fifth of a cent each, a number that would rise depending on how much text the AI must generate. If ChatGPT-like AI were to handle half of the queries Google receives with 50-word answers, analysts projected, the company could face a $6 billion jump in costs by 2024. Google is unlikely to need a chatbot to handle navigational searches for sites like Wikipedia.
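As a back-of-envelope check, the arithmetic behind that projection can be reproduced in a few lines of Python. The marginal cost per AI answer below is an assumption reverse-engineered from the reported totals, not a figure from Morgan Stanley:

```python
# Rough reproduction of the estimate above. The per-answer cost is an
# assumption chosen to match the totals reported in the article.

searches_per_year = 3.3e12         # Google searches in 2022 (article figure)
cost_per_search = 0.002            # about a fifth of a cent (article figure)

ai_share = 0.5                     # scenario: AI answers half of all queries
extra_cost_per_ai_answer = 0.0036  # assumed marginal cost of a 50-word answer

baseline = searches_per_year * cost_per_search
extra = searches_per_year * ai_share * extra_cost_per_ai_answer

print(f"baseline search cost: ${baseline / 1e9:.1f}B per year")  # ~$6.6B
print(f"added AI cost:        ${extra / 1e9:.1f}B per year")     # ~$5.9B
```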

Others arrived at a similar bill by different routes. SemiAnalysis, a research and consulting firm focused on chip technology, said adding ChatGPT-like AI to Alphabet’s search could cost $3 billion, an amount kept in check by Google’s in-house chips, called Tensor Processing Units (TPUs), along with other optimizations.

A ‘neural network’

What makes this form of AI pricier than conventional search is the computing power involved. Such AI depends on billions of dollars’ worth of chips, a cost that has to be spread over their useful life of several years, analysts said. Electricity likewise adds costs and puts pressure on companies with carbon-footprint goals.

The process of handling AI-driven queries is known as “inference,” where a “neural network” loosely modeled on the biology of the human brain infers the answer to a question from previous training.

In a traditional search, by contrast, Google’s web crawlers scan the internet ahead of time to compile an index of information. When a user types in a query, Google serves up the most relevant answers already stored in that index.
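In code terms, the difference looks roughly like the sketch below: traditional search is a cheap lookup into a precomputed index, while a chat answer runs a neural network for every query. The index and functions here are hypothetical stand-ins, not Google’s or OpenAI’s actual systems:

```python
# Hypothetical contrast between an index lookup and model inference.

keyword_index = {  # built ahead of time by web crawlers
    "python docs": ["https://docs.python.org", "https://peps.python.org"],
}

def keyword_search(query: str) -> list[str]:
    """Traditional search: one cheap lookup into the precomputed index."""
    return keyword_index.get(query, [])

def generate_answer(query: str) -> str:
    """Stand-in for a large language model forward pass ('inference');
    a real system would run billions of parameters per token here."""
    return f"(generated answer for: {query!r})"

def chat_search(query: str) -> str:
    """AI-driven search: every query pays the full price of inference."""
    return generate_answer(query)

print(keyword_search("python docs"))
print(chat_search("why is the sky blue?"))
```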

Alphabet’s Hennessy told Reuters, “It’s the inference costs that you have to cut,” calling that “a problem for a few years at worst.”

Alphabet is under pressure to take on the challenge despite the expense. Earlier this month, its rival Microsoft held a high-profile event at its Redmond, Washington, headquarters to show off plans to embed AI chat technology into its Bing search engine, with top executives taking aim at Google’s estimated 91% share of the search market, according to Similarweb.

Unintended answers

A day later, Alphabet talked up plans to improve its own search engine, but a promotional video for its AI chatbot Bard showed the system answering a question inaccurately, setting off a stock slide that shaved $100 billion off its market value.

Microsoft later drew scrutiny of its own when its AI reportedly made threats or professed love to test users, prompting the company to limit long chat sessions that it said could “provoke” unintended replies.

Microsoft Chief Financial Officer Amy Hood told analysts that the benefit of gaining users and advertising revenue outweighed the expense as the new Bing rolls out to millions of consumers. “That’s incremental gross margin dollars for us, even at the cost to serve that we’re discussing,” she said.

Richard Socher, CEO of search engine You.com, another Google competitor, said adding an AI chat experience, along with applications for charts, videos and other generative technology, raised expenses by 30% to 50%. “Technology gets cheaper at scale and over time,” he said.

A source close to Google cautioned that it is too early to pin down exactly how much chatbots might cost, because efficiency and usage vary widely depending on the technology involved, and AI already powers products like search.

Still, footing the bill is one of the top two reasons why search and social media giants with billions of users haven’t rolled out an AI chatbot overnight, says Paul Daugherty, Accenture’s chief technology officer.

“One is accuracy, and the second is you have to scale this appropriately,” he said.

Making the math work

For years, researchers at Alphabet and elsewhere have studied ways to train and run large language models more cheaply.

Larger models require more chips for inference and therefore cost more to run. The AI dazzling consumers with its human-like authority has ballooned in size, reaching 175 billion so-called parameters, or different values that the algorithm takes into account, for the model OpenAI updated into ChatGPT. Cost also varies with the length of a user’s query, measured in “tokens,” or chunks of a word.
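That scaling can be made concrete with a standard approximation: a dense model performs roughly two floating-point operations per parameter per token. The rule of thumb and the blended cost-per-FLOP rate below are illustrative assumptions, not figures from the article:

```python
# Rough cost model: compute grows with both parameter count and token count.
# The 2-FLOPs-per-parameter-per-token rule and the $/FLOP rate (covering
# hardware amortization, power and utilization) are assumptions.

def inference_cost_usd(params: float, tokens: int,
                       usd_per_flop: float = 5e-17) -> float:
    flops = 2 * params * tokens  # ~2 FLOPs per parameter per token
    return flops * usd_per_flop

# A 175B-parameter model over a ~2,000-token conversation:
print(f"${inference_cost_usd(175e9, 2000):.4f}")   # ~$0.035, a few cents
# A model a tenth the size on the same conversation costs about a tenth as much:
print(f"${inference_cost_usd(17.5e9, 2000):.4f}")  # ~$0.0035
```

Under these assumptions, a long conversation lands in the “few cents” range Altman described, and cost falls roughly linearly with model size, which is why parameter counts matter so much.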

A senior technology executive told Reuters that such AI remained prohibitively expensive to put into the hands of millions of consumers.

“These models are very expensive, and so the next level of invention will be to reduce the cost of both training these models and running inference, so that we can use them in every application,” the executive said on condition of anonymity.

For now, computer scientists within OpenAI have figured out how to optimize inference costs through complex code that makes chips work more efficiently, said one person familiar with the effort. An OpenAI spokesperson did not immediately comment.

‘An open question’

A longer-term problem is how to reduce the number of parameters in an AI model by 10 or even 100 times without sacrificing accuracy.

“How to most effectively strip away (parameters) is still an open question,” said Naveen Rao, who formerly led Intel’s AI chip efforts and now works to reduce AI computing costs through his startup MosaicML.
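The article does not name a particular technique, but one family of approaches researchers pursue for stripping away parameters is magnitude pruning: dropping the smallest weights and keeping the rest. A minimal sketch, offered purely as an illustration:

```python
import numpy as np

def prune_by_magnitude(weights: np.ndarray, keep_fraction: float) -> np.ndarray:
    """Zero out all but the largest-magnitude weights."""
    k = max(1, int(weights.size * keep_fraction))
    threshold = np.sort(np.abs(weights), axis=None)[-k]
    return np.where(np.abs(weights) >= threshold, weights, 0.0)

rng = np.random.default_rng(0)
w = rng.standard_normal(1_000)
pruned = prune_by_magnitude(w, keep_fraction=0.1)  # keep the top 10%
print(f"nonzero weights: {np.count_nonzero(pruned)} of {w.size}")  # 100 of 1000
```

Zeroed weights only save compute and memory once the model is stored and executed sparsely, which is part of why doing this at 10x or 100x scale without losing accuracy remains, as Rao says, an open question.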

In the meantime, some have considered charging for access, like OpenAI’s $20-per-month subscription for better ChatGPT service. Technology experts also said a workaround is applying smaller AI models to simpler tasks, an approach Alphabet is exploring.

The company said this month that a “smaller model” version of its massive LaMDA AI tech will power its chatbot Bard, requiring “significantly less computing power, allowing us to scale to more users.”

When asked about chatbots like ChatGPT and Bard, Hennessy said at a conference called TechSurge last week that more focused models, rather than one system that does it all, would help “tame costs.”
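One way to read Hennessy’s point about more focused models is a routing setup: send easy queries to a small, cheap model and reserve the full-size model for hard ones. The models and heuristic below are hypothetical stand-ins, not a description of any production system:

```python
def small_model(query: str) -> str:
    return f"[small, cheap model] {query}"   # fewer parameters, less compute

def large_model(query: str) -> str:
    return f"[large, costly model] {query}"  # full inference cost per query

def is_simple(query: str) -> bool:
    # Naive stand-in heuristic; a real router would be far more sophisticated.
    return len(query.split()) <= 4

def answer(query: str) -> str:
    """Route each query to the cheapest model that can plausibly handle it."""
    return small_model(query) if is_simple(query) else large_model(query)

print(answer("weather tomorrow"))
print(answer("compare the economics of TPUs and GPUs for chatbot inference"))
```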

Contributing: Greg Bensinger

Reporting by Jeffrey Dastin and Stephen Nellis