AI-infused searches on the Web: Are there new costs to users?


By Dr Venkataraman Balaji
Vice President, COL

In the November 2022 TechTrends column, we postulated that at least some online search engines could become paid services in the near future. A major development that relies heavily on AI has accelerated this trend.

The unprecedented rise in popularity of ChatGPT, a conversation generator built on a large language model, has led global technology leaders to add conversational facilities to the delivery of search results. Microsoft has launched a beta version of BingAI, its new search service infused with text-generation technology from OpenAI, the owner of ChatGPT, although BingAI uses a more advanced and refined language model. The chat function in BingAI can receive queries and present search results in a conversational mode. It offers links to its sources of information while also suggesting further queries to ask.

The delivery of search results in a conversational mode has definite appeal. Google, the current global leader in the search services market, has launched Bard as a trial in the USA. Bard is an AI-infused search service built on Google's own conversation generator, LaMDA. Baidu, a giant in search services in China, launched Ernie Bot, its AI-infused conversational search service, in March 2023.

But conversational search services can deliver inaccurate results. A detailed analysis of the recently released BingAI prototype demonstrated inaccuracies in its conversational output, and some results were found to be entirely fictitious.1

Overcoming these limitations requires further training and refinement of the models, which could be expensive. One recent, reasonably argued assessment estimated that it could cost a company like Google an additional USD 36 billion to shift all of its search queries to a conversational mode, on top of an extraordinary capital expense of about USD 100 billion.2 Given its magnitude, such an investment may not take place. It is clear, however, that emerging pressures in the search market will require additional financial outlays by the major global tech firms.

Advertisements are the major source of revenue for search service providers, and the transition to the conversational paradigm will most likely affect this stream. Senior business executives have opined that it is unclear how this revenue stream can continue under the new paradigm.3 BingAI is already limiting the number of daily queries per user, and cost is clearly a major consideration.

Quite likely, users will be required to pay for conversational search services. ChatGPT already offers a monthly subscription for a "turbo" service. Khan Academy, a popular source of OER, has announced Khanmigo, a new layer powered by generative AI (GPT-4), as a paid service.

The prevalence of fee-based searches will reduce the incentive to locate OER. It is therefore necessary to develop applications that use open-source large language models for OER searches or to support students in blended learning. Models such as GPT-J, GPT-Neo or BLOOM may offer suitable solutions for the education and training sector in the future.
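To illustrate the direction such applications could take, here is a minimal sketch in Python. It assumes the Hugging Face transformers library and the openly licensed GPT-Neo 1.3B checkpoint; the prompt wording and the suggest_oer() helper are hypothetical, offered only as a starting point rather than a tested OER search tool.

# A minimal sketch, not a tested implementation: assumes the Hugging Face
# "transformers" library and the openly licensed GPT-Neo 1.3B checkpoint;
# the prompt wording and the suggest_oer() helper are illustrative only.
from transformers import pipeline
# GPT-J ("EleutherAI/gpt-j-6B") or BLOOM ("bigscience/bloom-560m") could be
# substituted here, hardware permitting.
generator = pipeline("text-generation", model="EleutherAI/gpt-neo-1.3B")
def suggest_oer(query: str) -> str:
    """Return a model-generated pointer to open educational resources."""
    prompt = (
        "You are helping a learner find open educational resources (OER).\n"
        f"Question: {query}\n"
        "Suggested OER and search terms:"
    )
    output = generator(prompt, max_new_tokens=120, do_sample=True, temperature=0.7)
    # The pipeline returns the prompt plus the continuation; keep only the new text.
    return output[0]["generated_text"][len(prompt):].strip()
if __name__ == "__main__":
    print(suggest_oer("Where can I find open textbooks on introductory statistics?"))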

 

1 https://dev.to/ruochenzhao3/can-chatgpt-like-generative-models-guarantee-factual-accuracy-on-the-mistakes-of-microsofts-new-bing-111b

2 https://www.semianalysis.com/p/the-inference-cost-of-search-disruption

3 https://fortune.com/2023/02/08/the-new-bing-is-out-a-microsoft-exec-weighs-in-on-how-it-will-make-money/
