1. Google sends out web crawlers, known as Googlebots, that visit web pages across the internet.
  2. These bots run on millions of computers and surf the web much as people do, only far faster. To make sure they always have the most current version of every website, the Googlebots work constantly, crawling and re-crawling.
  3. How often a site is crawled depends on how quickly its content changes; for example, a newspaper website will be crawled more often than a static one. Webmasters can also indicate to Google how often their sites should be crawled.
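One general mechanism webmasters use to influence crawlers is a robots.txt file; the Crawl-delay directive shown here is a convention whose support varies by crawler (Google offers its own crawl-rate settings), and the file contents and URLs below are made up for illustration. A minimal sketch using Python's standard robots.txt parser:

```python
import urllib.robotparser

# Hypothetical robots.txt content; a real crawler fetches this file
# from the root of the site it is about to crawl.
robots_txt = """\
User-agent: *
Crawl-delay: 10
Disallow: /private/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("*", "https://example.com/news/today"))  # allowed path
print(rp.can_fetch("*", "https://example.com/private/x"))   # disallowed path
print(rp.crawl_delay("*"))                                  # requested delay in seconds
```

A polite crawler checks `can_fetch` before requesting a page and waits the requested delay between requests to the same host.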
  4. The bots also detect all the links on a page, which are then added to a queue for crawling later. These links also play an important role in deciding a site's search rankings.
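The queueing of discovered links can be sketched as a breadth-first crawl frontier. The link graph below is a stand-in; a real crawler would fetch each URL and parse its HTML for outgoing links:

```python
from collections import deque

# Made-up link graph: which pages each page links to.
links_on = {
    "a.com": ["b.com", "c.com"],
    "b.com": ["c.com"],
    "c.com": ["a.com"],
}

def crawl(seed):
    queue = deque([seed])   # the crawl frontier
    visited = []
    while queue:
        url = queue.popleft()
        if url in visited:
            continue
        visited.append(url)
        # Newly detected links go to the back of the queue for later crawling.
        queue.extend(links_on.get(url, []))
    return visited

print(crawl("a.com"))  # each reachable page is crawled exactly once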
  5. Copies of all these web pages are stored in Googles’s index database. The index works like a contents page of a book and contains all the possible search terms.
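The book-index analogy can be made concrete with a tiny inverted index, which maps each term to the set of pages it appears on. Page names and contents here are invented for illustration:

```python
# Made-up pages standing in for crawled web documents.
pages = {
    "page1": "rain expected over the weekend",
    "page2": "weekend football results",
    "page3": "football training in the rain",
}

# Build the inverted index: term -> set of pages containing it.
index = {}
for name, text in pages.items():
    for term in text.split():
        index.setdefault(term, set()).add(name)

print(sorted(index["rain"]))      # pages containing "rain"
print(sorted(index["football"]))  # pages containing "football"
```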
  6. When a user types a query into the search engine, the index matches the query terms against all the web pages in which they appear, and those pages are retrieved for the search results.
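Matching a multi-word query against the index amounts to intersecting the page sets of its terms, so only pages containing every term are retrieved. The index contents below are illustrative:

```python
# Illustrative inverted index: term -> set of pages containing it.
index = {
    "rain":     {"page1", "page3"},
    "weekend":  {"page1", "page2"},
    "football": {"page2", "page3"},
}

def search(query):
    # Look up each query term and intersect the resulting page sets.
    term_sets = [index.get(term, set()) for term in query.split()]
    return set.intersection(*term_sets) if term_sets else set()

print(search("weekend football"))  # only pages containing both terms
```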
  7. At the same time, a short description (a snippet) is generated for each page in the search results.
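A crude way to generate such a description is to cut a short window of page text around the first occurrence of a query term; real snippet generation is far more sophisticated, and the page text here is made up:

```python
def snippet(text, term, width=30):
    """Return a short excerpt of text centred on the first hit of term."""
    pos = text.lower().find(term.lower())
    if pos == -1:
        # Term not found: fall back to the start of the page.
        return text[:width] + "..."
    start = max(0, pos - width // 2)
    end = pos + len(term) + width // 2
    return "..." + text[start:end] + "..."

page = "The city council voted on Tuesday to expand the local football stadium."
print(snippet(page, "football"))
```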
  8. The pages are then ranked by relevance before they are displayed on the results page. Google considers some 200 factors when ranking sites. One of them is PageRank, which takes into account how many sites link to a web page and the quality of those linking sites.
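The core idea of PageRank, that a page is important if important pages link to it, can be sketched with a simple power iteration using the standard 0.85 damping factor; the three-page link graph is made up:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively compute PageRank for a dict of page -> outgoing links."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal rank everywhere
    for _ in range(iterations):
        new = {}
        for p in pages:
            # Each page shares its rank equally among its outgoing links;
            # p collects the shares from every page that links to it.
            incoming = sum(rank[q] / len(links[q])
                           for q in pages if p in links[q])
            new[p] = (1 - damping) / n + damping * incoming
        rank = new
    return rank

links = {
    "a.com": ["b.com", "c.com"],
    "b.com": ["c.com"],
    "c.com": ["a.com"],
}
ranks = pagerank(links)
# c.com is linked by both other pages, so it earns the highest rank.
print(max(ranks, key=ranks.get))
```

Quality of the linking site falls out naturally: a link from a high-rank page contributes more than one from a low-rank page.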