Google, Bing and Yahoo are the most widely used search engines at the moment, although there are many others such as MSN and AltaVista, and many specialist ones such as BestBets, which specializes in finding medical information and is widely used by professionals when they need to refer to specific sources. Basically, however, they are web pages where a user enters keywords, which the search engine matches against the websites it has indexed to find the requisite ‘hits’. These are then displayed on the results page as we know it, as a list of web pages and URL links. The search engine uses a ‘spider’ which crawls from site to site, ‘reads’ the content it finds and records the links pointing back to each page. In the case of Google, the back-link algorithm is PageRank, although this is now combined with CIRCA technology to rank each website more effectively.
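
For readers who want to see the link-based ranking idea in concrete terms, here is a minimal sketch of PageRank as a power iteration over a tiny, invented link graph. This is illustrative only; Google’s production implementation is far more elaborate.

```python
# Minimal PageRank sketch (illustrative only, not Google's implementation).
# The pages and their outbound links form a tiny, invented web graph.
links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}

damping = 0.85  # damping factor from the original PageRank paper
pages = list(links)
rank = {page: 1.0 / len(pages) for page in pages}  # start uniform

for _ in range(50):  # power iteration until the scores settle
    new_rank = {page: (1 - damping) / len(pages) for page in pages}
    for page, outbound in links.items():
        share = rank[page] / len(outbound)  # split this page's score
        for target in outbound:
            new_rank[target] += damping * share
    rank = new_rank

# Pages with many (or important) inbound links float to the top.
for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))
```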

An inbuilt ranking system orders the results for these keywords into a list, showing the most relevant at the top and working its way down to the least relevant at the end of a huge number of pages. That was fine until the LSI concept was introduced. Latent Semantic Indexing is the new algorithm that has stealthily been introduced to improve how results are ranked. It certainly makes much more sense: it is built on the synonyms the algorithm finds in the actual content of each website, and it very cleverly weighs those keywords in the context that a human being would use them in. Very clever, especially when many marketers are not even aware of this new algorithm and continue to chase link building and to join link exchanges quite unnecessarily.
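
Latent Semantic Indexing itself is a well-documented technique: it applies a truncated singular value decomposition to a term-document matrix, so that pages using different words for the same topic land close together in a reduced ‘latent’ space. A toy sketch in Python with numpy, using documents invented for the example:

```python
# Toy Latent Semantic Indexing sketch (numpy only; documents invented).
import numpy as np

docs = [
    "cheap car insurance quote",
    "affordable auto insurance quote",
    "chocolate cake recipe",
]
vocab = sorted({word for doc in docs for word in doc.split()})

# Term-document matrix: rows are terms, columns are documents.
A = np.array(
    [[doc.split().count(term) for doc in docs] for term in vocab],
    dtype=float,
)

# Truncated SVD: keep k latent "topics" instead of raw keyword counts.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T  # each document in latent space

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

# The first two documents say "car" vs "auto" yet land close together
# in the latent space; the cake recipe does not.
print(cosine(doc_vecs[0], doc_vecs[1]))  # high: same underlying topic
print(cosine(doc_vecs[0], doc_vecs[2]))  # low: unrelated topic
```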

Google actually began developing the LSI algorithm in 2003 when it took over the company Applied Semantics, whose CIRCA technology is designed to replicate the patterns of human thought. The LSI algorithm takes into account the overall structure of the website and brings its theme, tone and syntax into the equation, searching for a pattern amongst these and then combining that pattern, together with keywords and any synonyms it finds, to determine rankings.
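
CIRCA itself is proprietary, but continuing the toy LSI sketch above, a search query can be folded into the same latent space and scored against each document, which gives a flavour of how synonym-aware matching can drive rankings. Again, this is purely illustrative and reuses the names from the previous block.

```python
# Fold a query into the latent space of the previous sketch
# (reuses vocab, U, k, doc_vecs and cosine defined above).
def query_vector(query):
    counts = np.array(
        [query.split().count(term) for term in vocab], dtype=float
    )
    return counts @ U[:, :k]  # project raw term counts onto k topics

q = query_vector("auto insurance")
scores = [cosine(q, d) for d in doc_vecs]
print(scores)  # the two insurance pages outscore the cake recipe
```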

Some sites which were previously at the top of Google’s tree have fallen off the branches completely, while other sites now achieve only minimal exposure. The problem lies not with Google but with failing to incorporate truly organic search engine optimization into the content of each website. Basically, to achieve maximum status with Google now, it is necessary to rewrite many previously successful websites, focusing on a good SEO understanding of the latent semantic indexing algorithm.
