History and Evolution of SEO
Search Engine Optimization.
This blog post covers the history and evolution of SEO. The aim is to give aspiring SEO practitioners a general picture and practical first-hand information.
Introduction to SEO
A search engine is a program that locates and collects information from the web with the aid of particular keywords and phrases. It helps retrieve the required pages and data, either for immediate display on the screen or for saving for later use.
SEO is the process of improving both the presentation and the content depth of a website so that it reaches a higher ranking in the search engine results pages (SERPs).
SEO, in general, adopts two methods: on-page optimization and off-page optimization.
On-page optimization, or on-site optimization, means optimizing elements inside your own website to improve search engine rankings and provide a better user experience.
Off-page optimization adopts strategies and tactics carried out outside your website to reach higher search engine rankings, focusing mainly on building backlinks, improving brand visibility and encouraging online engagement.
The history of Google as a search engine goes back to the late nineties. In 1996, two Stanford University students, Larry Page and Sergey Brin, began working on a search engine called BackRub as their research project. They later renamed the project Google, and Google Search, the core product, was launched in 1998. Yahoo was the predominant search engine at that time, and less than 1% of internet users depended on Google for finding information on the web.
The terrorist attacks of September 2001, which destroyed the World Trade Center using hijacked flights, generated an enormous volume of web queries from people seeking detailed information about the incident. Google, as a search engine, largely failed to provide prompt and adequate results to these searchers, which had a negative impact on its perceived efficiency and reliability. Google sensed the problem and called for discussions on improvements. When the programmers were asked to explain the failure to answer web searches at this critical juncture, they reasoned that most websites were simply not crawlable by Google at that time.
Google, as a search engine, provides information from the web with the help of three processes: crawling, caching and indexing. Web crawling is the process of exploring websites to gather content information and build an appropriate database, so that after proper indexing the required information can be retrieved as quickly as possible at any instant.
Crawler programs are also known as spiders, bots or robots. Their aim is to discover new pages and sites on the web and to scan the text and image content in them. Crawling starts with verification of the site, collecting hints from its URL, then moves through the head section, the title and meta tags, and finally through the body of the content for further analysis.
The crawler takes a snapshot of the page and stores it locally so that it can be reused without a full round trip to the site. This saving process is called caching. Indexing is the activity of compiling the data into a single structure, organized into categories, to meet users' demands.
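As a rough sketch of these three steps, the Python snippet below fetches a page (crawling), stores a local snapshot of its HTML (caching), and records which words appear at which URL (indexing). The URL, cache directory and word-level index here are simplified assumptions for illustration only; they are not how Google actually implements these processes.

import urllib.request
from collections import defaultdict
from pathlib import Path

def crawl(url, cache_dir="cache", index=None):
    # Crawling: fetch the raw HTML of the page.
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")

    # Caching: keep a local snapshot so the page can be reused
    # without another round trip to the site.
    Path(cache_dir).mkdir(exist_ok=True)
    cache_file = Path(cache_dir) / (url.replace("/", "_") + ".html")
    cache_file.write_text(html)

    # Indexing: map each word to the URLs it appears on,
    # so that keyword lookups are fast later.
    if index is None:
        index = defaultdict(set)
    for word in html.lower().split():
        index[word].add(url)
    return index

index = crawl("https://www.example.com")
print(sorted(index.get("example", [])))

A real crawler would also extract the links found on each page and queue them for further crawling; that part is omitted here for brevity.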
The question of how to make other websites more crawlable was answered as follows: websites can be made crawlable only if their custodians, known as webmasters, permit it. To enable webmasters to improve the crawling of their pages, Google decided to release a document on optimization practice, open to all. Accordingly, a 32-page document named the SEO Starter Guide was published by Google in November 2004. This documentation was intended to provide best practices that make it easier for search engines to crawl, cache and index a web page and understand its content. The guide describes all the major aspects of SEO, including the terms and phrases (keywords) that support enhanced, qualified traffic and higher rankings.
In course of time, webmasters became aware of the advantages they could gain by applying these SEO techniques. Some enthusiastic practitioners, eager to gain more from the practice, tried to enhance their rankings without caring to upgrade the quality and reliability of their sites. Knowing this, Google modified its guidelines and algorithms to obtain better results.
Content Specific (Niche) SEO
In the earlier stages, Google ranked sites by a content-specific method. In this practice, Google searched for the focus keyword in a page or website to ascertain its ranking. The result was that a page containing more occurrences of the focus keyword received a better ranking than pages with fewer occurrences.
A competition therefore arose among sites to add more and more focus keywords for better ranking, without any care for improving content quality, attractiveness, usability or device compatibility. This is termed keyword stuffing and is regarded as a black hat SEO technique. Black hat SEO is the unethical way of working to raise the ranking of a site. To prevent this undesirable practice, Google changed its algorithms to be link specific.
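To see why keyword stuffing was rewarded under this content-specific scheme, here is a minimal Python sketch of keyword-frequency scoring. The scoring function and the two sample pages are invented for illustration; this is not Google's actual ranking algorithm.

import re

def keyword_score(page_text, keyword):
    # Naive content-specific ranking: count occurrences of the focus keyword.
    pattern = r"\b" + re.escape(keyword.lower()) + r"\b"
    return len(re.findall(pattern, page_text.lower()))

honest_page = "We sell running shoes for trail and road running."
stuffed_page = "Shoes shoes shoes. Buy shoes. Cheap shoes. Best shoes shoes."

# The stuffed page wins purely on keyword count, despite its lower quality.
print(keyword_score(honest_page, "shoes"))   # 1
print(keyword_score(stuffed_page, "shoes"))  # 7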
Link Specific SEO
Ranking websites according to the hyperlinks they receive from other web pages is termed link-specific evaluation. Links obtained from other sites are treated as recommendation votes, so sites that receive more links are likely to be promoted to higher ranks by Google. Knowing this, some websites tried to collect links from other sites even when those sites had nothing in common. Some high-ranking sites even started selling links to others for financial gain. This grey practice was not welcomed by Google, since all its modifications were aimed at quality upgrades and better search results. As such, Google revised its approach to be quality-link specific.
Quality Link Specific SEO
Google introduced a scaling system called PageRank, running from 0 to 10 with predefined parameters, to measure the quality and reliability of each website under scrutiny. A rating of 0 marks the least valued site and a rating of 10 the most valued one. This practice alone failed to control the selling of links between sites, so a further modification, called passing the juice, was introduced to curtail such practices.
Passing the Juice
In this scheme, each link is assigned a certain value, termed equity. When a high-ranked website gives a link to a lower-ranked site, that rank rating, the equity, is also transferred to the receiving site, reducing the giver's own rank. The value that one page or website can pass to another page or website through a hyperlink is defined as passing the juice. In other words, link juice refers to the authority or ranking power that a link can transfer from one page to another. Situations arose where higher-ranked sites wanted to link to lower-ranked sites without transferring their equity, and for this purpose the nofollow attribute was incorporated.
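The idea of link equity can be illustrated with a simplified PageRank-style calculation, in which each page distributes its current score evenly across its outgoing links. The tiny link graph, damping factor and iteration count below are illustrative assumptions, a textbook-style simplification rather than Google's actual formula.

# Hypothetical link graph: each page maps to the pages it links to.
links = {
    "a.com": ["c.com"],
    "b.com": ["c.com"],
    "c.com": ["a.com"],
}

def pagerank(links, damping=0.85, iterations=20):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            # "Passing the juice": a page shares its current rank
            # equally among the pages it links to.
            share = rank[page] / len(outgoing)
            for target in outgoing:
                new_rank[target] += damping * share
        rank = new_rank
    return rank

# c.com receives two incoming links, so it ends up with the highest score.
print(pagerank(links))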
Nofollow attribute
rel = " noflow" attribute is the HTML tag attribute used on hyperlinks to instruct search engines not to use the link for page ranking calculations. The complete for of writing the link in the blog is
<a href= "https:''//www.google.com">google</a>
When this code is placed in a page, users see only the link text ("google"); the rel attribute described here is not visible to them and is recognized only by search engines such as Google.
The anchor tag is the HTML code used in a page to hyperlink one page to another. If you want to prevent equity transfer while making a link, the tag is modified as follows.
<a href= "https:''//www.google.com" rel= "noflow">google</a>
In conclusion, SEO is a dynamic and never-ending process. It has undergone significant transformation over the course of time, and adopting fresh thinking and approaches is essential for staying at the forefront of the field.