The History of SEO

SEO is an evolving discipline that combines programming, content and keyword selection. Every second, Google processes around 40,000 search queries, and Bing around 10,000. The major search engines are in competition with each other, and all work hard to provide effective search results.

All major search engines have developed algorithms that are designed to produce highly relevant results, and these algorithms are continually being updated. But where did it all begin?

Cat and Mouse

When the world wide web experienced its first massive expansion in the early 1990s, the internet was like a randomly-organised library. There was plenty of material to read, but finding precisely what you wanted was extremely difficult.

In 1990, the first tool for indexing internet material was created: Archie, which catalogued files held on FTP servers. In 1991 came Gopher, a system for organising and retrieving documents, and search tools were soon built on top of it. The internet was first identified as having commercial potential in 1994, and this inspired a new drive to index the web.

At first, so few companies had websites that there was very little competition. If a product or service in your local area could be found using the internet, it was very easy to find. Over the next four years, more and more commercial concerns discovered that they could attract customers online. Very soon, competition among companies vying for the same business increased, and businesses started to use tricks to ensure their sites appeared high in results.

Most of these “tricks” were not really helpful to users. They offered no benefit, and they were what we’d now call ‘black hat’ – unethical or artificial. Web pages would pad their content with hidden popular keywords, often written in the same colour font as the background. Because of the rather simplistic way in which websites were indexed, any site using this technique would appear high in search engine rankings.

Moving the Goalposts

Search engines began to change the way they indexed sites so that black hat techniques did not work. Black hat optimisers simply found a new way to trick them. In 1998, two Stanford University students named Larry Page and Sergey Brin decided to take a different approach to search engine rankings.

At the time, most search engines indexed and ranked sites according to the number of times a search term appeared within a page. Page and Brin decided to rank sites by the number of other, related sites that linked to them. Their idea was simple: if a page on painting watercolours was linked to by a hundred or more related sites, it was more likely to be authoritative than one with no backlinks at all.

Page and Brin designed a clear, ad-free landing page for their new search engine and called it BackRub, later renaming it Google. Now, useless sites that were stuffed with keywords found themselves at the bottom of the pile, and sites with more links earned a healthy PageRank score.
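The link-counting idea behind PageRank can be sketched as a simple power iteration: each page repeatedly passes a share of its score to the pages it links to. This is only an illustrative simplification, not Google’s actual implementation, and the page names and link graph below are hypothetical.

```python
# Minimal PageRank-style power iteration (illustrative sketch only).
# The link graph is hypothetical: keys link out to the pages in their lists.
links = {
    "watercolour-guide": ["paint-shop", "art-blog"],
    "paint-shop": ["watercolour-guide"],
    "art-blog": ["watercolour-guide"],
    "keyword-stuffed-page": [],  # no one links here; stuffing keywords earns nothing
}

damping = 0.85  # probability of following a link rather than jumping at random
pages = list(links)
rank = {p: 1 / len(pages) for p in pages}  # start with equal scores

for _ in range(50):  # iterate until the scores settle
    new_rank = {}
    for page in pages:
        # Sum the score flowing in from every page that links to this one,
        # split evenly across each linking page's outbound links
        incoming = sum(
            rank[src] / len(dst) for src, dst in links.items() if page in dst
        )
        new_rank[page] = (1 - damping) / len(pages) + damping * incoming
    rank = new_rank

# The well-linked page ends up with the highest score
print(max(rank, key=rank.get))  # → "watercolour-guide"
```

The key design point is that a page’s score depends on who links to it, not on what its own text claims – which is why hidden-keyword tricks stopped working under this model.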

At first, no one seemed to care. Google was a new player among other popular search engines at the time, such as Infoseek, Lycos, Webcrawler, HotBot and AltaVista. But Google’s effectiveness in producing desirable search results saw the site become the leader of the pack.

Let’s Google That

With Google’s rise in popularity, SEO experts had to find new ways to work against its algorithm and artificially boost websites. People began to buy links. Google responded, coming up with ways of rewarding sites with valid links and penalising sites with invalid ones. For example, if you ran a bakery in New York and were linked to by a site that posted videos of cats, that would be an invalid link and your site would be penalised. If you were linked to by a New York grocery store that sold your products, the link would be valid, and your authority would increase.

Since 1998, Google has worked hard at making its algorithm ever more sophisticated, penalising black hat SEO techniques and rewarding white hat ones. Now, webmasters must fill a site with relevant, well-written content rather than stuffed keywords.

An alternative to SEO?

For a long time, Google resisted using its popularity to make money from advertising, but it eventually succumbed. It began offering Pay Per Click (PPC) ads, in which sponsored results appeared alongside organic search results.

PPC is still a valid marketing route, but it works best in conjunction with SEO rather than as a replacement – after all, a well-optimised site often costs less to promote using PPC. Increasingly, businesses are looking to implement a holistic content marketing strategy with SEO at its core.

Modern SEO

Black hat SEO is becoming increasingly difficult. Google’s algorithm is now so sophisticated that penalties can take sites down within minutes, and each new Google update strikes fear into unethical marketers: a single one can wipe out a business in days.

Modern SEO techniques focus on quality content: original, authoritative blogs. That content can then be shared via social bookmarks, turned into attractive infographics (information presented in an easy-to-understand visual form), or posted as an online review. Manipulation always draws a blank in the end. Google changes its search algorithm over 500 times a year and announces only the big updates. Effective SEO means keeping abreast of every search engine and its algorithms, and optimising websites ethically, without black hat techniques.

When, in 1989, Tim Berners-Lee came up with the idea of linking computers together globally to create the world wide web, it’s unlikely he thought his proposal would develop into a huge, global commercial network. Google has itself become a platform on which it seems everyone has a voice, and it makes its money from the PPC ads businesses pay for. But whoever appears first in search gets the most attention. Make sure your business is getting the attention it deserves. Contact us about our monthly SEO packages.

July 14th, 2017 | SEO
