Would "sprinkling" in terms closely related to your target keyword improve your rankings? These are the arguments for and against LSI (Latent Semantic Indexing) as a ranking factor.
Latent Semantic Indexing (LSI) is an indexing and information retrieval method for identifying patterns in relationships between terms and concepts.
LSI applies mathematical techniques to a collection of text (an index) to find semantically related terms whose relationships would otherwise remain hidden (or latent).
On the surface, that sounds highly relevant to SEO.
After all, Google is a giant index of information, and we hear all kinds of things about semantic search and the importance of relevance in search ranking algorithms.
If you've heard rumors about latent semantic indexing in SEO, or been advised to use LSI keywords, you're not alone.
But does LSI actually help improve your search rankings? Let's take a look.
Claim: Latent Semantic Indexing as a Ranking Factor
The statement is simple: Optimizing web content with LSI keywords helps Google understand it better, and you'll rank higher.
Backlinko defines LSI keywords this way:
"LSI (Latent Semantic Indexing) keywords are conceptually related terms that search engines use to gain insight into what's on a web page."
By using these contextually related terms, you can supposedly deepen Google's understanding of your content. At least, that's the story.
This resource goes on to make some convincing-sounding arguments for LSI keywords:
- "Google relies on LSI keywords to understand content at such a deep level."
- "LSI keywords are not synonyms. Instead, they're terms that are closely related to your target keyword."
- "Google doesn't only bold terms (in the search results) that exactly match what you just searched for. They also bold words and phrases that are similar. Needless to say, these are the LSI keywords you want to sprinkle into your content."
Does this practice of "sprinkling" terms closely related to your target keywords help improve your rankings with LSI?
Evidence of LSI as a ranking factor
Relevance was identified as one of five key factors that help Google determine which result is the best answer to any given query.
As Google explains in its "How Search Works" resource:
"In order to return relevant results for a query, we first need to determine what information you're looking for—the intent behind the query."
Once the intent is determined:
"...algorithms analyze the content of web pages to assess whether the page contains information that may be relevant to what you are looking for."
Google goes on to explain that the most basic signal of relevance is the presence on the page of the keywords used in the search query. This makes sense: if you're not using the keywords the searcher is looking for, how can Google tell that your page is the best answer?
Now, this is where some believe LSI comes into play.
If using keywords is a signal of relevance, then using just the right keywords must be an even stronger signal.
There are purpose-built tools specifically to help you find these LSI keywords, and believers of this strategy also recommend using a variety of other keyword research strategies to identify them.
Evidence against LSI as a ranking factor
Google's John Mueller made this very clear:
"...we have no concept of LSI keywords. So that's something you can completely ignore."
There's a healthy level of skepticism in SEO that Google may say things to lead us astray in order to protect the integrity of its algorithms. So let's dig a little deeper.
First, it's important to understand what LSI is and where it comes from.
Latent semantic structure emerged in the late 1980s as a method of retrieving textual objects from files stored in computer systems. As such, it's an example of one of the early information retrieval (IR) concepts available to programmers.
As computer storage capacity has increased and the size of electronically available datasets has grown, it has become more difficult to find exactly what to look for in that collection.
The researchers described the problem they were trying to solve in a patent application filed in September 1988:
"Most systems still require the user or information provider to specify explicit relationships and links between data or text objects, making the system difficult to use or apply to large heterogeneous computer information files whose contents may be unfamiliar to users."
Keyword matching was used in IR at the time, but its limitations were evident long before Google came along.
Many times, the words a person uses to search for the information they seek do not exactly match the words used in the indexed information.
There are two reasons for this:
- Synonyms: Various words used to describe a single object or idea cause related results to be missed.
- Polysemy: Different meanings of a single word lead to irrelevant search results.
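To make those two failure modes concrete, here's a minimal sketch of naive exact-term matching. The documents and queries are invented for illustration — this is not any real search engine's code, just a demonstration of why matching on literal keywords alone falls short:

```python
# Hypothetical mini-index illustrating why exact keyword matching fails.
documents = {
    "doc1": "how to buy a used car",
    "doc2": "tips for purchasing a pre-owned automobile",       # synonymy
    "doc3": "the jaguar is a large cat native to the americas", # polysemy
    "doc4": "the jaguar xk is a luxury sports car",             # polysemy
}

def exact_match(query, docs):
    """Return the ids of documents containing every query word verbatim."""
    terms = query.lower().split()
    return [d for d, text in docs.items()
            if all(t in text.split() for t in terms)]

# Synonymy: doc2 covers the same topic but shares no words with the query,
# so a relevant result is missed.
print(exact_match("used car", documents))  # → ['doc1']

# Polysemy: "jaguar" matches both the animal and the automobile,
# so an irrelevant result is returned no matter which the searcher meant.
print(exact_match("jaguar", documents))    # → ['doc3', 'doc4']
```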
These are still issues today, and you can imagine what a challenge they pose for Google.
However, the methods and technology Google uses to solve for relevance moved away from LSI long ago.
What LSI does is automatically create a "semantic space" for information retrieval.
As the patent explains, LSI treats this unreliability of word-to-object association data as a statistical problem.
Without digging into the weeds, these researchers basically believed they could tease out a hidden underlying semantic structure from word usage data.
Doing so uncovers the latent meaning and enables the system to return more relevant results – and only the most relevant results – even when there's no exact keyword match.
Here's what the LSI process actually looks like:
The most important thing to notice about the diagram of this approach in the patent application is that there are two separate processes taking place.
First, collections or indexes undergo latent semantic analysis.
Second, the query is analyzed, and the processed index is then searched for similarities.
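Those two steps can be sketched with a toy term-document matrix and numpy's SVD. Everything here is invented for illustration — the documents, the counts, and the rank-2 truncation — but it shows the core trick: analyzing the whole collection first lets a query match a document that shares none of its words:

```python
import numpy as np

terms = ["car", "automobile", "engine", "cat", "feline"]
# Toy term-document matrix: rows are terms, columns are documents.
# doc0 and doc1 are about vehicles; doc2 is about animals.
A = np.array([
    [2.0, 0.0, 0.0],  # car
    [0.0, 2.0, 0.0],  # automobile — a synonym of "car", never co-occurring with it
    [1.0, 1.0, 0.0],  # engine — the shared context tying doc0 and doc1 together
    [0.0, 0.0, 2.0],  # cat
    [0.0, 0.0, 1.0],  # feline
])

# Step 1: analyze the whole collection. A truncated SVD builds the
# "semantic space" — a low-rank approximation of the index that
# smooths over the synonym split between "car" and "automobile".
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k]  # smoothed index

# Step 2: match the query against the processed index by cosine similarity.
def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

query = np.array([1.0, 0.0, 0.0, 0.0, 0.0])  # the single term "car"
raw_scores = [cosine(query, A[:, j]) for j in range(3)]
lsi_scores = [cosine(query, A_k[:, j]) for j in range(3)]

# Raw matching scores doc1 ("automobile") at 0; in the semantic space it
# scores the same as doc0, despite sharing no term with the query.
```

Note that step one runs over the entire collection: add or change a document and the semantic space has to be recomputed, which matters for the scalability argument below.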
This is the fundamental problem with LSI as a Google search ranking signal.
Google's index contains hundreds of billions of pages and counting.
Every time a user enters a query, Google searches through its index in a fraction of a second to find the best answer.
Using the above method in an algorithm would require Google to:
- Recreate the semantic space by performing LSA across its entire index.
- Analyze the semantic meaning of the query.
- Find all the similarities between the semantic meaning of the query and the documents in the semantic space created by analyzing the entire index.
- Sort and rank those results.
This is a gross oversimplification, but the point is that this is not a scalable process.
That's useful for small collections of information. It was helpful, for example, for surfacing relevant reports inside a company's computerized archive of technical documents.
The patent application illustrates how LSI works using a collection of nine documents. That's the scale it was designed for. LSI was an early innovation in computerized information retrieval, not a tool for web-scale search.
Latent Semantic Indexing as a Ranking Factor: Our Verdict
While the basic principle of removing noise by determining semantic relevance has certainly informed the evolution of search rankings since LSA/LSI was patented, LSI itself has no useful application in SEO today.
While it hasn't been completely ruled out, there's no evidence that Google ever used LSI to rank results. And Google definitely doesn't use LSI or LSI keywords to rank search results today.
Those who recommend using LSI keywords are latching onto a concept they don't quite understand in order to explain why the way words are related (or not) matters in SEO.
Relevance and intent are fundamental considerations in Google's search ranking algorithm.
These are the two big problems they try to solve to provide the best answer to any query.
Synonyms and polysemy remain major challenges.
Semantics—that is, our understanding of the various meanings of words and how they are related—is critical to producing more relevant search results.
But LSI has nothing to do with it.