The evolving relationship between AI LLMs, search engines and SXO

In this article, I offer an overview of how chatbots are changing search engines, and the benefits this brings to users in terms of SXO: across the purposes, functions, data sources, interactions, outputs, and the accuracy and reliability of current tools.

Design by Freepik 

Let's first define some concepts to make this article easier to read:

  • Search engines are software programs that allow you to find the information you are looking for online using keywords or phrases.
  • AI LLMs, or Large Language Models (also called chatbots), are a type of artificial intelligence program capable of, among other tasks, recognizing and generating text.
  • SXO can be defined as the marriage of SEO (Search Engine Optimization) and UX (User eXperience). This concept therefore combines search engine optimization with strategies to improve the user experience.


What are the objectives and functions of these tools?

Search engines:

  • are designed to crawl, analyze, index and serve information retrieved from the web.
  • provide links to web pages, documents, images, videos, etc. that are relevant to a user's query.

Large AI language models:

  • are designed to generate human-like text based on the data they receive.
  • answer questions.
  • write content.
  • can make suggestions. 
  • help accomplish tasks.
  • were not, in their original design, built to search the web.
  • Popular LLM providers and models include:
    • OpenAI (GPT-3.5/4 Turbo) – can access the internet, but only in the Plus or Enterprise versions of GPT-4.
    • Anthropic (Claude Instant and Claude 2) – cannot access the internet.
    • Meta (Llama 2 70b) – cannot access the internet.
    • Google (PaLM 2) – can access the internet.


Where does the data for these tools come from and what are their sources?

Search engines:

  • continually crawl and index the web.
  • have the latest information available on a topic, provided the content is searchable and indexed.
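The crawl-analyze-index-serve cycle described above can be sketched with a toy inverted index. This is a minimal illustration, not a real crawler: the pages and URLs below are invented stand-ins for crawled content.

```python
from collections import defaultdict

# Toy corpus standing in for crawled web pages (hypothetical URLs).
PAGES = {
    "https://example.com/seo": "search engine optimization improves ranking",
    "https://example.com/ux": "user experience design improves satisfaction",
    "https://example.com/sxo": "sxo combines search engine optimization and user experience",
}

def build_index(pages):
    """Analyze each page and build an inverted index: word -> set of URLs."""
    index = defaultdict(set)
    for url, text in pages.items():
        for word in text.lower().split():
            index[word].add(url)
    return index

def search(index, query):
    """Return the URLs containing every word of the query (simple AND search)."""
    words = query.lower().split()
    if not words:
        return set()
    results = index.get(words[0], set()).copy()
    for word in words[1:]:
        results &= index.get(word, set())
    return results

index = build_index(PAGES)
print(sorted(search(index, "search engine")))
```

Real engines add ranking, freshness signals and spam filtering on top of this basic structure, but the principle of mapping terms to documents is the same.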

Large AI language models:

  • are trained on a large dataset.
  • rely on the data they were last trained on.
  • do not have real-time information when they do not have access to the internet.
  • are not constantly trained on new datasets.


What are the possible interactions with these tools?

Search engines:

  • mainly offer one-way interaction: you enter a query and the search engine returns relevant links to content that may meet your information needs (now supplemented by conversational results in Google and Bing).

Large AI language models:

  • are designed for more conversational interactions.
  • can engage in a back-and-forth dialogue. 
  • can generate text based on context.
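The back-and-forth dialogue described above works because the full message history is sent back to the model on every turn. A minimal sketch of that mechanism, where `fake_llm` is a placeholder standing in for a real model API call:

```python
# Hypothetical stand-in for an LLM call: a real chatbot would send the full
# message history to a model API; here we only report how much context it sees.
def fake_llm(messages):
    return f"(reply based on {len(messages)} prior messages)"

def chat_turn(history, user_message):
    """Append the user message, generate a reply from the whole history,
    and append the reply so the next turn keeps the full context."""
    history.append({"role": "user", "content": user_message})
    reply = fake_llm(history)
    history.append({"role": "assistant", "content": reply})
    return reply

history = []
chat_turn(history, "What is SXO?")
chat_turn(history, "And how does it relate to SEO?")
# The second reply is generated with the first exchange still in context.
print(len(history))  # 4 messages: two user turns and two replies
```

This accumulated history is what lets the model resolve a follow-up like "And how does it relate to SEO?" without the user restating the topic.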


What do these tools produce as output?

Search engines:

  • provide links or references to external sources. Users must click on the sources and read them to find the specific answers they want.
  • present more current, better-contextualized and better-ranked results, along with better spam detection and filtering. In short, the search experience is more complete, more informative and more personalized thanks to:
    • Google's SGE (Search Generative Experience) and its incorporation of a large language model (LLM).
    • Bing's integration of OpenAI's new generation (the latest version of OpenAI's ChatGPT-4).

Large AI language models:

  • provide generated text as a response, and even images or audio (speech and/or music).
  • present a consistent, human-readable format.
  • produce output of variable, sometimes questionable quality, which depends greatly on the instructions (the prompt) given by the user.


How reliable and precise are the tools? 

Search engines:

  • provide direct links to sources, allowing users to verify the accuracy of information.
  • present an ordering and visibility of results that can be influenced by various algorithms and SXO strategies.
  • can provide more complete and informative answers to their users' search queries thanks to LLMs. Advances in LLMs could shift market share toward the search engines powered by the best of them; Bing, however, still remains far behind Google.

Large AI language models:

  • carry risks of inaccuracies and errors in their responses, particularly if they have not been trained on recent data.
  • can be misled by disinformation and/or propaganda.
  • can be computationally costly to deploy.
  • may not understand the context of web pages as well as humans do (hence recent research such as Google's Infinite Attention).




LLMs have not yet had a significant impact on the search engine market. The market is not threatened by LLM technologies, although they could eventually destabilize it. Google continues to dominate the global search engine market (with around 90% share). Nevertheless, large AI language models will continue to revolutionize the search engine industry. Ultimately, the two technologies are clearly complementary: large AI language models will always thirst for fresh information in order to perform better, and providing that information is precisely the purpose of search engines.

  • In terms of updated information:
    • Search engines allow access to information.
    • LLMs use current information if they have access to it.
  • In terms of depth and breadth:
    • Search engines provide a range of sources; some LLMs, such as PaLM 2 and Google Gemini, also cite the sources their information comes from.
    • LLMs provide quick answers that can serve as a starting point for in-depth research, as well as diverse points of view on a specific topic.
  • In terms of use: 
    • Search engines are essential for research, news, and information discovery.
    • LLMs are perfect for conversational AI, coding, tutoring, and content generation.

Search engines use large AI language models to improve their results, and the future of search is certainly hybrid. However, it is very unlikely that LLMs will replace search engines.
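This hybrid pattern, where a search step supplies fresh, sourced snippets and the LLM answers from them, is often called retrieval-augmented generation. A minimal sketch of the prompt-building step, where `retrieve` returns hard-coded stand-ins for real search results:

```python
# Hypothetical sketch: retrieve() stands in for a live search engine query,
# returning (URL, snippet) pairs that a real system would fetch and rank.
def retrieve(query):
    return [
        ("https://example.com/news", "Google holds about 90% of the search market."),
        ("https://example.com/llm", "LLMs rely on the data they were last trained on."),
    ]

def build_prompt(query):
    """Combine retrieved snippets with the user's question so the model can
    answer from current, sourced information instead of stale training data."""
    snippets = "\n".join(f"[{url}] {text}" for url, text in retrieve(query))
    return f"Answer using only these sources:\n{snippets}\n\nQuestion: {query}"

print(build_prompt("Who leads the search engine market?"))
```

Because each snippet carries its URL, the generated answer can cite its sources, which addresses both the freshness and the verifiability gaps discussed above.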


And where does SXO fit into all this? For me, it corresponds to understanding how search engines integrate large AI language models into their core algorithms in order to better contextualize search results. This means that, as SEOs, we must increasingly give pride of place to the user experience, taking into account the user's journey from the search engine onward. It is also important to address search intent at each stage of that journey, whether informational, navigational, commercial or transactional.



Rossitza Mavreau, Lead Traffic Manager SEO SEA Analytics at UX-Republic