The Complete History of Search Engine Optimization (SEO)

Trying to reconstruct the history of SEO is like attempting to reconstruct the history of the handshake. We’re all aware that it exists, and that it’s a vital aspect of the business. We don’t spend much time thinking about its beginnings, though; we’re more concerned with how we utilize it daily.

But, unlike the handshake, SEO is still in its infancy and undergoes rapid modifications. It appears to be a millennial, with a birth year estimated to be around 1991.

And it’s matured and changed swiftly in its brief existence — just look at how many modifications Google’s algorithm has gone through.

So, how did SEO get started, and how did it become so important? Join us as we take a step back in time to find out. It turns out to be quite a tale.

Search Engines

In 1945, the initial notion of a shared archive for all of the world’s information was born. In July of that year, Dr. Vannevar Bush, then director of the now-defunct Office of Scientific Research and Development, published an article in The Atlantic proposing a “collection of data and observations, parallel material extraction from the existing record, and the final insertion of new material into the general body of the common record.” In other words: something very much like today’s Google.

Several decades later, in 1990, McGill University student Alan Emtage invented Archie, which some claim was the first search engine, though according to Bill Slawski, president and creator of SEO by the Sea, this is still up for debate. Either way, Archie was the “greatest way to find information from other servers around the internet at the time,” according to Slawski, and it is still in use (albeit in a basic version).

Several significant improvements occurred during the next decade, with the more commercial versions of search engines that we know today taking shape.

February 1993: Six Stanford students found Architext, which later becomes the search engine Excite. According to Search Engine Land (SEL), Excite “revolutionized how information was cataloged,” making it easier to discover information “by sorting results based on keywords identified inside content and backend optimization.”

June 1993: Matthew Gray releases the World Wide Web Wanderer, whose index later becomes known as Wandex.

October 1993: Martijn Koster launches ALIWEB, which lets site owners submit their own pages (a feature that, sadly, goes unnoticed by many site owners).

December 1993: Three “bot-fed” search engines launch: JumpStation, RBSE spider, and the World Wide Web Worm, all most likely driven by web robots that crawl both servers and site content to generate results.

1994: The search engines AltaVista, Infoseek, Lycos, and Yahoo all launch.

1996: Larry Page and Sergey Brin, co-founders of Google, start working on a search engine they initially call BackRub.

April 1997: Ask Jeeves launches; it later becomes Ask.com.

September 1997: Google.com is registered as a domain name.

It’s worth mentioning that Microsoft debuted Bing about twelve years later, in June 2009 — prior incarnations were branded as Live Search, Windows Live Search, and MSN Search.

But this is where SEO comes into play. Site owners began to take notice as search engines became more mainstream and widely used. “It was discovered that by executing some very easy acts, search engine results could be modified and money could be gained through the internet,” says SEO community Moz.

Those results, however, were not particularly good. And that, dear readers, is where the story of SEO begins.

A Brief History of Search Engine Optimization (SEO)

The 1990s

Finding information became easier as search engines became household names and more households got connected to the internet. The problem, as previously stated, was the quality of that information.

While search engine results matched words from user queries, they were usually limited to just that, because many website owners used keyword stuffing (repeating keywords over and over in the text) to improve rankings (for which there were no criteria yet), drive traffic to their pages, and produce appealing numbers for potential advertisers.
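To make the mechanics concrete, here is a minimal Python sketch of the kind of keyword-density check that makes stuffing easy to spot. The sample pages, the tokenization, and the 5 percent threshold are all illustrative assumptions for this example, not any search engine’s documented rule.

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return the share of words in `text` that equal `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# Invented example pages: one stuffed, one written naturally.
stuffed = "cheap flights cheap flights book cheap flights best cheap flights deals " * 20
natural = "Compare fares from several airlines and book the flight that fits your schedule."

for label, page in (("stuffed", stuffed), ("natural", natural)):
    density = keyword_density(page, "flights")
    # The 5% cutoff below is an arbitrary illustration, not a documented threshold.
    verdict = "looks stuffed" if density > 0.05 else "looks fine"
    print(f"{label}: density of 'flights' = {density:.1%} -> {verdict}")
```

Against early engines that mostly counted how often query terms appeared, a page like the “stuffed” example above could rank well while saying almost nothing useful.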

There was also some gaming of the system involved. According to SEL, people were using excessive and “spammy backlinks” in addition to keyword stuffing to inflate their authority. Not only were there no ranking criteria in place at the time, but by the time search engines updated their algorithms to account for this, new black hat SEO tactics had emerged that the updates failed to address.

But then two Stanford students had an idea.

That was one of the challenges Page and Brin set out to solve when they founded Google. In 1998, the team released “The Anatomy of a Large-Scale Hypertextual Web Search Engine” at Stanford, in which they wrote:

…the predominant business model for commercial search engines is advertising. The goals of the advertising business model do not always correspond to providing quality search to users.

That same paper was also the first mention of PageRank, the technology Google uses to rank search results based on quality rather than keywords alone. Some would argue that this idea paved the way for SEO as we know it today.

Early 2000s

The Google conquest began in the early 2000s. As part of its effort to make search less ad-driven, Google began publishing white hat SEO guidelines, the kind the “good guys” follow, to help websites rank without engaging in any of the shady practices of the 1990s.

2000-2002

However, according to Moz, the guidelines had little effect on rankings at the time, so people didn’t bother to follow them. This was partly because PageRank was based on the number of incoming links to a given page: the more inbound links, the higher the ranking. But there was no way to verify the legitimacy of those links, and in the early 2000s it was still feasible to use backlinking schemes to rank pages that weren’t even related to the search query, according to Marketing Technology Blog.
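For readers curious what “ranking by inbound links” looks like in practice, below is a toy Python sketch of the PageRank idea: each page’s score is built up from the scores of the pages linking to it, damped by a constant. The four-page graph, the 0.85 damping factor, and the fixed iteration count are illustrative assumptions; this is a sketch of the concept, not Google’s production algorithm.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively estimate PageRank for a dict mapping page -> list of outbound links."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outbound in links.items():
            if not outbound:
                # Dangling page: spread its score evenly across all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / len(pages)
            else:
                for target in outbound:
                    new_rank[target] += damping * rank[page] / len(outbound)
        rank = new_rank
    return rank

# A tiny invented web: every other page links to "home", so it ends up with the top score.
toy_web = {
    "home": ["about"],
    "about": ["home"],
    "blog": ["home", "about"],
    "contact": ["home"],
}
for page, score in sorted(pagerank(toy_web).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```

Because every inbound link feeds score into its target, flooding a page with links from anywhere, legitimate or not, could inflate its rank, which is exactly the loophole early link spammers exploited.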

Then, in 2001, Brin and Page appeared on “Charlie Rose,” where the host asked them, “Why does it work so well?” Brin stressed in his response that Google was only a search engine at the time, and that it was looking at “the web as a whole, not just which phrases occur on each page.” That answer set the tone for some of the first major algorithm updates, which began to examine those phrases much more closely.

2003-2004

This view of the web as more than just words on a page began to take shape with the “Florida” update to Google’s algorithm in November 2003. The response to Florida was large enough for Search Engine Watch to call it an “outcry,” although they were quick to point out that numerous sites benefited from the change as well. It was the first notable instance of sites being penalized for things like keyword stuffing, signaling Google’s focus on solving problems for users first, mostly through high-quality content.

In 2004, one of the earliest versions of Google’s voice search launched as a half-finished experiment, according to the New York Times. While the technology was primitive at the time, it foreshadowed the eventual importance of mobile in SEO. (Stay tuned for more on that.)

2005: A Pivotal Year for SEO

2005 was one of the most significant years in the history of search engines. In January of that year, Google teamed up with Yahoo and MSN to launch the nofollow attribute, which was designed to reduce the number of spammy links and comments on websites, particularly blogs. Then, in June, Google introduced personalized search, which factored a user’s search and browsing history into results.
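As a rough illustration of what the attribute does, the sketch below uses Python with the third-party BeautifulSoup library to add rel="nofollow" to links inside user-submitted comments, signaling to crawlers that those links should pass no ranking credit. The HTML snippet and the “comment” class name are invented for this example.

```python
# Requires the third-party package: pip install beautifulsoup4
from bs4 import BeautifulSoup

# Invented example markup: an editorial link plus a user-submitted comment link.
html = """
<div class="post"><a href="https://example.org/related-article">Related reading</a></div>
<div class="comment"><a href="https://spammy.example.com">Buy cheap watches</a></div>
"""

soup = BeautifulSoup(html, "html.parser")

# Mark every link inside a comment as nofollow so it passes no ranking credit.
for comment in soup.find_all("div", class_="comment"):
    for link in comment.find_all("a"):
        link["rel"] = "nofollow"

print(soup.prettify())
```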

SEO Upheavals in 2009

In 2009, there was a bit of a shakeup in the world of search engines. Bing launched in June of that year, and Microsoft promoted it as the search engine that would produce considerably better results than Google. But, as SEL predicted, it wasn’t a “Google killer,” and its content-optimization advice didn’t differ much from Google’s. In fact, according to Search Engine Journal, the only visible difference was Bing’s preference for keywords in URLs, as well as for capitalized terms and “big site pages.”

In August of that year, Google released a sneak peek of the Caffeine algorithm upgrade, asking for public feedback on the “next-generation infrastructure” that was “intended to expedite crawling, broaden the index, and merge indexation and ranking in near real-time,” according to Moz.

Caffeine wasn’t fully rolled out until nearly a year later, when it boosted the search engine’s speed, but real-time search launched in December 2009, with Google results integrating tweets and breaking news. It was a step that showed SEO was no longer just for webmasters; from then on, journalists, web copywriters, and even social network managers would need to optimize their material for search engines.

2010-Present

It’s kind of entertaining to see what Google suggests when you type in a search query. This is thanks to Google Instant, which was launched in September 2010. It caused SEOs to “combust” at first, according to Moz, until they learned it did not affect rankings.

But, like the growth of SEO from 2010 on, Google Instant was merely another chapter in the search engine’s quest to solve problems for users, despite some controversy along the way about pages whose rankings were boosted by negative online reviews. According to Google, the algorithm was later tweaked to penalize sites that used such tactics.

That same year, social media content began to play a bigger role in SEO. In December 2010, both Google and Bing incorporated “social signals,” surfacing Facebook posts from your network that matched your query. Google also began assigning PageRank to Twitter profiles that were frequently linked to. The value of Twitter in SEO doesn’t stop there; stay tuned.

2011: The Year of the Panda

The practice of penalizing websites that cheated Google’s algorithm would continue. Some of these incidents were more prominent than others, such as one in 2011 involving Overstock.com. According to the Wall Street Journal, domains ending in .edu carried more authority in Google’s eyes at the time. Overstock took advantage of this by asking educational institutions to link to its site using keywords like “vacuum cleaners” and “bunk beds,” in exchange for discounts for students and faculty. Those inbound links helped Overstock rank higher for searches containing those keywords until Overstock discontinued the practice in 2011, and Google penalized the site soon after.

Panda, the algorithm update that cracked down on content farms, was first released in February of that year. Content farms were websites with massive amounts of frequently updated, low-quality content produced purely to boost search engine rankings. They also tended to have high ad-to-content ratios, which Panda was taught to detect.

Panda has undergone so many revisions that Moz declined to include any post-2011 updates that weren’t substantial in its history of Google algorithm changes. Even with that exclusion, the timeline still shows twenty-eight Panda updates from July 2015 to July 2016, the impact of most of which was difficult to measure.

2012: Enter the Penguin

With the first of several Penguin updates in April 2012, Google took “another step to reward high-quality sites,” and in announcing it, acknowledged Bing’s blog post from a month earlier on the changing face of SEO. Penguin went after sites that used black hat SEO tactics in subtler ways, such as sites with mostly helpful content that also carried spammy links with nothing to do with the page’s H1.

Also in 2012, in a throwback to Google’s original anti-ad-heavy philosophy, the “Above The Fold” update began lowering the rankings of sites with heavy ad space above the “fold,” the portion of the page visible without scrolling.

Google would eventually go beyond targeting just spammy content. The Payday Loan algorithm update, first hinted at in June 2013 and rolled out in May of the following year, was designed to target queries most likely to surface spammy results, typically searches for payday loans and other things that would make your mother blush. Google adjusted its ranking system to help keep spam out of those results, and while this didn’t directly affect legitimate sites’ SEO efforts, it did show that Google was working to keep search results trustworthy.

Google Goes Local

In keeping with the trend of animal-named algorithm updates, Google launched “Pigeon” (so named by SEL) in 2014, which had a significant impact on local search results. It appears to have been created to improve Maps queries, which began to be handled by some of the same technology behind the company’s other search capabilities, such as “Knowledge Graph, spelling correction, synonyms.” Local search was becoming a big deal, and it would only grow in importance, as you’ll see in a minute.

After that, in 2015…

Google’s mobile update in April 2015, when non-mobile-friendly websites began to see their rankings drop, may have been the most significant post-2010 SEO announcement. It meant SEO was no longer just about keywords and content, but also about responsive design.

In February 2015, Google revealed the change ahead of time, with a mobile-friendly test that allowed webmasters to see potential difficulties and make modifications before the release. Google’s mobile upgrades didn’t end there; in August 2016, it announced a crackdown on mobile pop-ups.

What Will Happen Next?

It may be difficult to believe, but it appears that even more change is on the way.

From desktop to mobile and beyond

Because mobile usage is on the upswing (51 percent of digital media is consumed on a mobile device versus 42 percent on a desktop), SEO will continue to tilt in that direction.

This is already evident in Google’s preference for a mobile-friendly user experience. We believe voice search will be a big part of the next wave of SEO. Voice search has a colorful history of its own, and it’s on the rise: voice queries already account for 20% of searches on Google and 25% of searches on Bing. The spread of voice-activated digital assistants like Amazon’s Alexa is only accelerating that trend.

While there may not be a clear-cut way to optimize for voice search right now, owing to a lack of analytics in that area, we expect those tools to become available soon, adding yet another important pillar to SEO.

Keeping it local

However, this raises the issue of SEO localization, or optimizing content so that results are locally relevant. That’s especially true in the world of voice search, where sites like Yelp and other business aggregators are used to answer questions about what’s nearby. Local businesses should take advantage of this SEO opportunity by ensuring their listings on third-party sites are “complete, accurate, and optimized to be referred.”

A Social Presence

While the launch of Google’s real-time search in 2009 had some social implications, social media is now becoming an even more important part of SEO strategy. When Google began indexing tweets in 2011, for example, it foreshadowed a future in which users search for content on social media the same way they do on search engines. In fact, this indexing may be Google’s way of future-proofing for a time when people no longer use search engines the way we do now.

Enter the name of almost any celebrity, such as Charlie Rose, whose interview with Brin and Page we mentioned earlier. His Facebook and Twitter profiles appear on the first page of search results for his name. And look to the right of the results: the biographical sidebar includes social icons linking to his various networks. That’s one of the first things users want to see when they look someone up.

In any event, it’s easy to see how SEO has evolved into a full-time profession, and it will only continue to change. Executing it well requires a high level of skill, ethics, and ongoing technical upkeep.

However, we recognize that having a single person dedicated to SEO is not always viable, which is why we continue to develop the best SEO learning resources we can.
