- December 1, 2018
- By admin
- Digital Marketing
Search engines need to process millions of websites, and hundreds of millions of webpages, whenever a web search is made. The key to mastering SEO is knowing how a search engine reads your website’s content and processes it for a relevant search.
One problem in trying to understand how a search engine works is that it continuously changes. At the beginning of 2011, for example, Google took into account the size of a website, the number of pages it had, and how many keywords were stuffed into those pages. Following complaints, within a few months the Panda update recalculated search results to take perceived quality into account. This meant a number of sites that once ranked high now ranked a lot lower on Google.
The first task of a search engine is to find the websites and webpages it needs to index. This is done using a type of software robot called a spider. For example, if you run a WordPress-based website, you can track how many spiders have visited your site by using a plugin such as StatPress Reloaded.
When a user makes an enquiry, each search engine refers to the word lists compiled by its spiders. Some search engines, like AltaVista, list every word on every page, while others, like Google, list only tags and meta tags. Others base their decisions on keywords.
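At its simplest, a word list of this kind is an "inverted index": a table mapping each word to the pages that contain it. The sketch below is a minimal, hypothetical illustration of the idea, not how any real search engine stores its index; the page names and text are invented for the example.

```python
from collections import defaultdict

def build_index(pages):
    """Map each word to the set of page URLs containing it.

    `pages` maps a URL to that page's text. This toy version just
    lowercases and splits on whitespace; real indexers do far more.
    """
    index = defaultdict(set)
    for url, text in pages.items():
        for word in text.lower().split():
            index[word].add(url)
    return index

# Two invented pages for illustration
pages = {
    "page-1": "organic search traffic",
    "page-2": "paid search ads",
}
index = build_index(pages)
print(sorted(index["search"]))  # ['page-1', 'page-2']
```

A query for "search" can then be answered by a single lookup, which is why spiders compile these lists ahead of time rather than scanning pages at query time.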
An important way a spider moves from one website to another is by following links. When one website links to another, the spider follows the link and records all the information on the new site. This is why this book extensively covers the use of backlinks when building a website (see Part 2).
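The link-following behaviour described above amounts to a graph traversal: start from a known page, collect its links, and queue each unvisited destination. Here is a minimal sketch of that idea, assuming an in-memory "web" (a dictionary of invented URLs to HTML) rather than real network requests:

```python
from html.parser import HTMLParser
from collections import deque

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(pages, start):
    """Breadth-first crawl of an in-memory web: pages maps URL -> HTML.

    Returns the URLs in the order a spider would discover them,
    visiting each page only once.
    """
    seen = set()
    queue = deque([start])
    order = []
    while queue:
        url = queue.popleft()
        if url in seen or url not in pages:
            continue
        seen.add(url)
        order.append(url)
        parser = LinkExtractor()
        parser.feed(pages[url])
        queue.extend(parser.links)
    return order

# A tiny three-page web for illustration (all names invented)
web = {
    "site-a": '<p>Home</p><a href="site-b">partner</a>',
    "site-b": '<a href="site-c">blog</a><a href="site-a">back</a>',
    "site-c": "<p>No outbound links here.</p>",
}
print(crawl(web, "site-a"))  # ['site-a', 'site-b', 'site-c']
```

Note that `site-c` is only discovered because `site-b` links to it; a page nothing links to would never be found, which is the point the chapter makes about backlinks.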
Once a search engine has built an index, it ranks the sites and pages. This is done by looking at keyword density, or it might be based on a system of weighting. The latter varies from search engine to search engine, but is used as a quality-control filter. This means some search engines might give better ranking positions to academic sites than, say, to blogs on the same subjects. Weighting can also be applied negatively to sites the search engine deems to be spam sites or copy sites, or to sites that aim to distort their position by posting masses of cheap, badly written content.
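Keyword density itself is easy to state precisely: the number of times a keyword appears on a page divided by the total word count. The sketch below shows that calculation on an invented snippet of page text; it is an illustration of the metric, not a reproduction of any engine's actual ranking formula.

```python
import re

def keyword_density(text, keyword):
    """Share of all words on the page that match the keyword (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# Invented page text: 10 words, 2 of which are "SEO"
page = "SEO tips: good SEO starts with readable content, not stuffing."
print(keyword_density(page, "SEO"))  # 0.2
```

A density that is unnaturally high is exactly the "keyword stuffing" signal that weighting systems penalise, which is why stuffing pages stopped working after updates like Panda.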