Today, search engine optimization (SEO) is heavily influenced by Google’s algorithms.
However, the practice known today as “search engine optimization” (SEO) predates Google, the world’s most popular search engine, created by Larry Page and Sergey Brin.
You could argue that search engine optimization and all things search engine marketing began with the publication of the first website in 1991, or perhaps when the first web search engines launched a few years later, but the history of SEO “officially” begins a little later, around 1997.
As Bob Heyman, author of Digital Engagement, tells it, we can thank the manager of the rock band Jefferson Starship for helping create the field that would come to be known as “search engine optimization.”
He was frustrated that the official Jefferson Starship website was ranking on Page 4 of some search engine at the time, rather than in Position 1 on Page 1.
Whether this is revisionist history or historical reality, all indications point to the term “SEO” originating sometime around 1997.
According to the Internet Archive, John Audette of Multimedia Marketing Group was using the term as early as February 15, 1997.
In 1997, the notion of ranking highly in search engines was still a relatively new one.
Additionally, it was heavily directory-driven.
LookSmart was powered by Zeal, Go.com had its own directory, and the Yahoo Directory was a mainstay of Yahoo Search. DMOZ would go on to fuel Google’s initial directory.
When DMOZ (the Open Directory Project, which took its name from its original home at directory.mozilla.org) first launched (remember, the Mozilla name was around long before SEOmoz), it functioned essentially as the Yellow Pages for websites.
This is the premise on which Yahoo! was formed in the first place: the capacity to locate the finest websites available, as determined by editors.
Our customers had beautiful websites, but they weren’t getting much traffic. I started doing SEO in 1998 to help them.
I had no idea that would turn into a way of life for me.
However, the World Wide Web was still a new concept to most people at the time.
Today? Everybody wants to be the master of the search engine results pages (SERPs).
Search Engine Optimization vs. Search Engine Marketing: What’s the Difference?
Before it had its official name, Search Engine Optimization was called a lot of different things, like:
- Placement in search engines
- Positioning in search engines
- High rankings in search engines
- Registration with search engines
- Submission to search engines
- Website promotion
Nonetheless, no debate would be complete without bringing up another phrase…
Search Engine Marketing (SEM)
The term “search engine marketing” was proposed in 2001 by a well-known industry writer as a successor to the term “search engine optimization.”
Obviously, this did not take place.
Be prepared to see plenty of bogus claims (e.g., “SEO is dead,” “the new SEO”) and attempts to rebrand SEO (e.g., “Search Experience Optimization”) over the years.
While the term “search engine optimization” isn’t ideal—after all, we aren’t optimizing search engines, we are optimizing our online presence—it has been the industry’s preferred term for more than two decades and will likely remain so for the foreseeable future.
As for search engine marketing: the term is still in use, but it is now associated more with paid search marketing and advertising than with organic search.
The two terms now coexist peacefully.
A Chronology of the History of Search Engines
Over the past few decades, search engines have revolutionized the way we find information, do research, shop for products and services, entertain ourselves, and connect with others.
Search functionality sits at the heart of practically every online destination, whether it’s a website, blog, social network, or mobile application.
Search engines have become the connecting and guiding force in people’s daily activities.
But how did it all begin in the first place?
This timeline has been put together to help you learn more about the history of search engines and search engine optimization, a technology that has become indispensable to us.
The “Wild West” Era: The Beginning of SEO
The search engine field was very competitive in the last decade of the twentieth century.
When people were looking for things, they could use a variety of search engines, including human-powered directories and crawler-based listings, with names like AltaVista and Ask Jeeves.
In the beginning, the only way to do any kind of SEO was through on-page activities. Today, there are many more options.
This includes optimizing for a variety of criteria, including but not limited to:
- Making sure the content was high-quality and relevant.
- Ensuring your HTML tags were correct.
- Having internal and external links on your website.
At the time, if you wanted to rank highly in search results, all you had to do was repeat your keywords enough times across your pages and meta tags.
Want to outrank a page that uses a keyword 100 times? Then you’d use the keyword 200 times in yours!
Spamming is the term used nowadays to describe this behavior.
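As a toy illustration of why that tactic worked, here is a hypothetical Python sketch of a naive ranker that scores a page purely by counting keyword occurrences, roughly how early engines could be gamed (this is an illustration, not any real engine’s code):

```python
# A naive "ranker" that scores a page purely by keyword occurrences,
# roughly how some late-1990s engines could be gamed. Toy example only.

def naive_score(page_text, keyword):
    return page_text.lower().count(keyword.lower())

honest = "We sell shoes. Browse our shoe catalog."
stuffed = " ".join(["shoes"] * 200)  # the keyword repeated 200 times

print(naive_score(honest, "shoes"))   # 1
print(naive_score(stuffed, "shoes"))  # 200
```

With a scorer this crude, doubling the repetitions doubles the score, which is exactly why keyword stuffing paid off until engines got smarter.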
Here are a few of the highlights:
Jerry Yang and David Filo, two Stanford University students, founded Yahoo in a trailer on the university’s campus. Yahoo! began as a collection of bookmarks and a directory of noteworthy websites on the Internet.
Webmasters had to manually submit their pages to the Yahoo directory for indexing so that Yahoo could surface them when someone ran a search.
AltaVista, Excite, and Lycos were among the first search engines to go live.
Page and Brin, both Stanford University students, built and tested Backrub, a new search engine that ranked websites based on the relevance and popularity of inbound links.
Backrub would ultimately become Google. HotBot, powered by Inktomi, also launched.
Danny Sullivan, following the popularity of his A Webmaster’s Guide To Search Engines, launched Search Engine Watch, a website dedicated to reporting on the search industry, offering tips on searching the web, and advising on how to rank websites better.
(A decade later, Sullivan founded another prominent search publication, Search Engine Land; he went on to join Google in 2017.)
In addition, Ask Jeeves made its debut, and the domain name Google.com was registered.
Goto.com launched with sponsored links and paid search as its primary features. Advertisers bid on Goto.com to appear above the organic search results, which were powered by Inktomi. Goto.com was eventually acquired by Yahoo!
DMOZ (the Open Directory Project) quickly rose to the top of the list of places where SEO professionals wanted to get their sites included.
Microsoft entered the area with MSN Search, which was powered by Inktomi at the time of its launch.
Search Engine Strategies (SES), the first conference dedicated exclusively to search marketing, also took place. Sullivan’s retrospective on the event is available online.
(The SES conference series kept going under different names and with different parent companies until it was finally stopped in 2016.)
The Google Revolution is underway
In 2000, Yahoo made one of the worst strategic decisions in the history of search: it partnered with Google and let Google power its organic results instead of Inktomi.
Before that deal, Google was a relatively unknown search engine. A little-known fact!
The result: every Yahoo search result said “Powered by Google,” and Yahoo ended up introducing the world to its largest future competitor, with Google becoming a household name.
Until then, search engines judged websites mainly on their on-page content, domain names, listings in the directories mentioned above, and basic site structure (breadcrumb navigation).
On the other hand, Google’s web crawler and PageRank algorithm were innovative in the field of information retrieval.
When ranking websites, Google looked at both on-page and off-page factors, such as the quantity and quality of external links pointing to a website, along with the anchor text used.
It’s interesting to consider that Google’s algorithm was based on the premise that “if people are talking about you, you must be significant.”
SEO practitioners, however, treated links as the most important part of Google’s ranking algorithm, even though they were only one component. An entire sub-industry dedicated to link building emerged.
Over the following decade, the race to acquire as many links as possible in hopes of ranking higher turned into a sport.
Links became a heavily abused tactic that Google would have to address in the years ahead.
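To make the idea behind PageRank concrete, here is a minimal Python sketch of the iterative link-based scoring it popularized. This is an illustrative toy under simplified assumptions, not Google’s production algorithm (which handled dangling pages, personalization, and web-scale data):

```python
# A toy PageRank iteration: pages are nodes, links are directed edges,
# and each page passes a share of its score to the pages it links to.
# Illustrative sketch only, not Google's actual implementation.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    ranks = {page: 1.0 / n for page in pages}
    for _ in range(iterations):
        # Every page keeps a small baseline score...
        new_ranks = {page: (1.0 - damping) / n for page in pages}
        # ...and receives a damped share from each page linking to it.
        for page, outlinks in links.items():
            if not outlinks:
                continue  # a dangling page distributes nothing here
            share = damping * ranks[page] / len(outlinks)
            for target in outlinks:
                new_ranks[target] += share
        ranks = new_ranks
    return ranks

# A tiny three-page web: "a" earns links from both "b" and "c",
# so it ends up with the highest score.
scores = pagerank({"a": ["b"], "b": ["a"], "c": ["a"]})
print(sorted(scores, key=scores.get, reverse=True))  # ['a', 'b', 'c']
```

The key intuition matches the text above: a page’s importance comes from the importance of the pages linking to it, which is exactly why links became such a coveted (and abusable) commodity.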
Another notable event that year was the launch of the Google Toolbar for Internet Explorer, which let SEO professionals see a page’s PageRank score (a number between 0 and 10).
It ushered in an era of unsolicited link exchange request emails.
With the visible PageRank score, Google effectively introduced a measure of currency to links, much as domain authority is misused today.
Also in 2000, Google began showing AdWords ads alongside its organic search results.
These paid search advertisements started showing above, below, and to the right of Google’s natural (i.e., unpaid) search results in the summer of 2011.
Also in 2000, a group of webmasters informally gathered at a bar to begin sharing information about all things SEO.
This casual meeting eventually grew into Pubcon, a major search conference series that still runs today.
Over the following months and years, the SEO community grew accustomed to a monthly “Google Dance,” a period during which Google updated its index, sometimes causing major ranking fluctuations.
Although Google’s Brin once said the company didn’t believe in web spam, his opinion had likely changed by the time 2003 rolled around.
SEO got much harder following updates like Florida: it was no longer enough to simply repeat keywords a certain number of times.
The Use of Google AdSense to Make Money from Terrible SEO Content
After purchasing Blogger.com in 2003, Google launched AdSense, which lets publishers display contextually relevant Google ads to their visitors.
The combination of Google AdSense with Blogger.com resulted in an explosion of easy, monetized online writing and the beginning of the blogging revolution.
Google probably didn’t foresee at the time the problems it was creating for itself.
AdSense spawned spammy practices and made-for-AdSense websites, which were packed with thin, poor, or stolen content and existed purely for the purpose of ranking high, attracting clicks, and earning money.
Local Search Engine Optimization and Personalization
Around 2004, Google and other leading search engines began improving results for queries with a geographic intent (e.g., for a restaurant, plumber, or some other type of business or service provider in your city or town).
The Google Maps Plus Box was introduced in 2006, and I remember being extremely pleased with it at the time.
Also in 2004, Google and other search engines started to make more use of end-user data, such as search history and interests, to customize search results, a trend that has continued to this day.
This means the results you see may differ from what someone sitting next to you in a coffee shop sees when searching for the very same query.
The nofollow attribute arrived in 2005, introduced as a means of combating spammy links and comments.
SEO professionals soon began using nofollow as a way to sculpt PageRank.
Google has also released a few interesting changes, including the following:
- Jagger helped reduce the unsolicited link exchanges floating around, and heralded the decline of anchor text as a ranking factor because it was so easily corrupted.
- Big Daddy (a term coined by Jeff Manson of RealGeeks) updated the search engine’s architecture to allow a better understanding of the worth and relationships of links between websites.
YouTube, Google Analytics, and Webmaster Tools
In October 2006, Google acquired the video-sharing site YouTube for $1.65 billion; it would eventually become the second most-used search property in the world.
Today, YouTube has more than 2 billion users!
Because of its increasing popularity, video SEO has become more important for companies, enterprises, and individuals who want to be discovered online.
In 2006, Google also introduced two critically important tools: Google Analytics and Google Webmaster Tools.
- Google Analytics: this free, web-based analytics tool was so popular at launch that webmasters experienced downtime and maintenance warnings.
- Google Webmaster Tools: now known as Search Console, it lets webmasters view crawl errors, see which searches their site appears for, and request reinclusion.
In addition, 2006 was also the year in which XML sitemaps received widespread acceptance from search engines.
XML sitemaps let webmasters show search engines every URL on their website that is available for crawling.
An XML sitemap isn’t just a list of URLs; it can also carry additional information that helps search engines crawl the site more intelligently.
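To make this concrete, here is a short Python sketch that builds a minimal sitemap using only the standard library (the URL and date are hypothetical examples):

```python
# A minimal XML sitemap builder using only the Python standard library.
# The URL and date below are hypothetical examples.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(entries):
    """entries: iterable of (url, lastmod) pairs."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc, lastmod in entries:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        # Optional hints such as <lastmod> help crawlers prioritize.
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap([("https://example.com/", "2021-01-01")])
print(sitemap_xml)
```

The `<lastmod>` element is one example of the extra metadata mentioned above; the sitemaps protocol also allows hints like `<changefreq>` and `<priority>`.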
Searching for Anything (Universal Search)
We started to see a big change in how search was done in 2007, with new and interesting features being added.
All of these changes were made with the goal of making the search experience better for users.
Let’s start with Google’s Universal Search.
Until that point, search results had consisted of 10 blue links.
Then Google began blending traditional organic search results with other types of vertical results, such as news, video, and images, to provide a more comprehensive experience.
This was without a doubt the biggest change to Google search (and SEO) since the Florida algorithm update.
Getting the Cesspool in Order
In 2008, then-Google CEO Eric Schmidt said the Internet was becoming a cesspool and that brands were the solution. “Brands are how you sort out the cesspool,” he said.
The result was Vince, a Google update released less than six months after that statement.
Big companies suddenly seemed to be ranking much higher in the search engine results pages (SERPs).
According to Google, though, the update wasn’t really meant to reward brands.
Google wanted to place a higher emphasis on the algorithm’s ability to be trusted (and big brands tend to have more trust than smaller and less-established brands).
Following this update, Google released another one called Caffeine, which was meant to speed up the process of indexing websites.
“Caffeine was first revealed as a new search architecture for Google that was said to be faster and more accurate, as well as better and more relevant, while indexing more of the web,” said Search Engine Journal at the time.
In 2010, Google announced that site speed was a factor in how websites ranked in its search results.
Bing and the Search Alliance
Microsoft Live Search was rebranded as Bing in 2009.
Then, in an effort to challenge Google’s more than 70 percent grip on the U.S. search market, Yahoo and Microsoft joined forces on a 10-year search deal (though it ended up being reworked five years later).
Under the Search Alliance, Microsoft’s Bing came to power Yahoo’s organic and paid search results.
While the deal made Bing the clear number-two search engine, the pair ultimately failed to break Google’s massive hold on search in the United States and worldwide. In October 2020, Bing officially rebranded as Microsoft Bing.
The Rise of Social Media
Late in the decade of the 2000s, another phenomenon began to emerge: social networks.
Google placed its biggest social bet on YouTube, although it would later try again with the now-defunct Google+.
However, other social media platforms such as Facebook, Twitter, and LinkedIn have also emerged as prominent participants (with many more to come and go in the coming years).
With social media’s surge in popularity came speculation that social signals might influence search rankings.
Yes, social media may assist with SEO, but only in an indirect manner, much as other types of marketing can assist in driving more visitors to your website and increasing brand recognition and affinity (which generates search demand).
No matter how often Google has said over the years that social shares are not a ranking factor, many ranking-factor studies have found that social shares correlate strongly with higher organic rankings.
If you’re interested in learning more about this subject, I strongly recommend reading How Social Media Helps SEO.
When schema markup, a form of microdata, arrived in 2011, it was intended to help search engines interpret the context of a query. You can see examples of every schema markup type on Schema.org.
Schema is not a ranking factor, and there is no evidence that it directly improves your rankings.
Schema, on the other hand, might help you stand out in the SERPs by providing rich and highlighted snippets.
That said, a study by InLinks found that websites ranked better after implementing schema.
If you’re not sure whether or not you’ve implemented structured data correctly, you may run it using Google’s Structured Data Testing Tool.
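For illustration, here is a small Python sketch that assembles a minimal JSON-LD snippet of the kind typically embedded in a page’s HTML. The headline and date are hypothetical; the `@context`/`@type` field names come from the public Schema.org vocabulary:

```python
# A minimal JSON-LD Schema.org snippet of the kind embedded in a page's
# <head>. The headline and date are hypothetical examples; the field
# names follow the public Schema.org vocabulary.
import json

article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "The History of SEO",
    "datePublished": "2021-01-01",
}
snippet = '<script type="application/ld+json">{}</script>'.format(
    json.dumps(article)
)
print(snippet)
```

Embedding structured data like this is what makes a page eligible for the rich results and highlighted snippets mentioned above; it describes the content, rather than ranking it.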
A Panda and a Penguin from the Google Zoo
Google’s Panda and Penguin algorithm updates, released in 2011 and 2012 respectively, had a major impact on SEO that is still felt today, as the search giant once again sought to clean up its results and reward quality sites.
Google’s search results came under heavy scrutiny in 2011 because so-called “content farms” (websites producing high volumes of low-quality content) were dominating the search results.
Scraper sites were also clogging up the SERPs, in some cases outranking the original content creators.
These websites were making a lot of money from ads (remember when I mentioned Google’s own AdSense problem?).
These websites were likewise dependent on organic traffic from Google to survive and thrive.
However, when Google’s Panda update went live in 2011, many websites saw a significant amount of their traffic disappear overnight.
Google subsequently shared guidance on what it considers a high-quality website.
Panda was designed to weed out low-quality (or thin) content. It was refreshed periodically over the years until it was incorporated into Google’s core algorithm in 2016.
While websites were still recovering from Panda, Google unleashed a much-anticipated over-optimization algorithm, designed to eliminate “aggressive spam tactics” from its results and reward quality content.
Eventually named Penguin, this algorithm targeted link schemes (websites with unusual linking patterns, such as a high proportion of exact-match anchor text matching the keywords you wanted to rank for) and keyword stuffing (cramming in keywords you intend to rank for).
Penguin wasn’t updated nearly as regularly as Panda, with some upgrades occurring more than a year after the last one. Also in 2016, Penguin joined Panda as a component of Google’s real-time search algorithm.
Things rather than strings
Google introduced the Knowledge Graph in May 2012.
This was a major step away from interpreting keywords as mere strings and toward understanding the meaning and intent behind them.
At launch, Amit Singhal, Google’s former SVP of engineering, described it this way:
“The Knowledge Graph enables you to search for things, people, or places that Google knows about—landmarks, celebrities, cities, sports teams, and other physical aspects of the world; movies; celestial objects, works of art, and more—and instantly get information that is relevant to your query. This is a critical first step towards building the next generation of search, which taps into the collective intelligence of the web and understands the world a bit more as people do.”
Google enhanced its search results with this information.
Knowledge panels, boxes, and carousels can appear whenever people search for one of the billions of entities and facts in the Knowledge Graph.
It was in September 2013 that Google released Hummingbird, a new algorithm that was designed to better answer natural language queries and conversational searches. This was the next step in Google’s next generation of search.
With the emergence of mobile (and voice search), Google needed to entirely reinvent how its algorithms functioned to match the requirements of current searchers.
Hummingbird was regarded as the largest modification to Google’s core algorithm since 2001. Clearly, Google intended to give quicker and more relevant results, particularly to mobile users.
Starting around 2005, one question kept being asked in our industry:
Is this the “Year of Mobile”?
Well, it turns out that it wasn’t in 2005.
Nor was it in 2007.
Or in 2008.
Not even in 2010, when Google converted itself into a mobile-first firm.
Then 2011, 2012, 2013, and 2014 came and went.
All this while, mobile was much discussed and heavily hyped because it was growing like crazy.
As more consumers adopted smartphones, they increasingly searched for businesses and products while on the go.
Finally, in 2015, we had the Year of Mobile: the point at which mobile searches surpassed desktop searches for the first time on Google. While this is true in terms of raw search numbers, it’s also true that search intent differs significantly on mobile and conversion rates remain far lower there.
This was also the year when comScore revealed mobile-only internet users exceeded desktop-only users.
During the year of 2015, Google announced that it was going to make its search engine more “mobile-friendly.” The goal was to give people “the most relevant and timely results, whether the information is on mobile-friendly web pages or in a mobile app.”
In an effort to speed up websites, Google also created Accelerated Mobile Pages (AMP) in 2016.
The aim of AMP was to rapidly load material. Many news companies and publishers soon embraced AMP and continue to use it today.
And this may not surprise you, but in January 2017, Google revealed that page speed would now be a ranking consideration for mobile searches.
In the same month, Google indicated it would begin to devalue sites with obtrusive pop-ups.
In July 2019, mobile-first indexing was enabled for all new websites. And all websites were slated to move to mobile-first indexing by March 2021.
Machine Learning and Intelligent Search
Earlier, I said that Google, initially focused on information retrieval, became a mobile-first firm.
Well, things changed in 2017 when Google CEO Sundar Pichai proclaimed Google a machine learning-first firm.
Today, Google Search is designed to inform and assist, rather than just hand users a list of links.
That’s why Google has embedded machine learning into all of its products—including search, Gmail, Ads, Google Assistant, and more.
In terms of search, we’ve already begun to see the influence of machine learning with Google RankBrain.
RankBrain, first announced in October 2015, was initially used to try to interpret the 15% of searches Google had never seen before, based on the words and phrases entered.
Since then, Google has expanded RankBrain to run on every search.
While RankBrain affects rankings, it isn’t a ranking factor in the traditional sense, where doing x, y, and z earns you higher positions.
And there’s much more coming soon in the field of intelligent search.
- Voice searches are rising.
- Visual search has become incredibly good.
- Users (and brands) are increasingly embracing chatbots and utilizing personal assistants (e.g., Apple’s Siri, Amazon’s Alexa, and Microsoft’s Cortana).
New technology promises even more exciting times ahead for SEO professionals.
Google’s Core Updates
Google makes adjustments to its algorithm every day.
But throughout the year, Google also rolls out major changes known as core updates.
There are broad core algorithm updates, too.
The goal of these core updates is to create a better search experience for users, with more relevant and trustworthy results.
Google’s core updates don’t target a specific page or site; they aim to improve how the system assesses content overall.
Here’s how Google describes these core updates:
“One way to conceive of how a core update functions is to assume you compiled a list of the best 100 movies in 2015. A few years later, in 2019, you revisit the list. It’s going to organically alter. Some fresh and excellent movies that never existed before will now be contenders for admission. You could also review certain films and find they deserved a higher spot on the list than they had before.”
In March 2018, Google confirmed it had rolled out a broad core algorithm update to benefit “under-rewarded” pages.
A little over a month later, Google rolled out another broad core algorithm update, this one geared toward content relevance.
Then another broad core update arrived in August (often inaccurately referred to as the “Medic” update), targeting sites with low-quality content.
In March 2019, as an extension of the August 2018 core update, Google revealed that a core update (a.k.a., Florida 2) was arriving, and it was meant to be a huge one.
However, the SEO community believed it was more of a reversal of earlier algorithms.
In addition, another significant core update in June 2019 exposed E-A-T weaknesses on websites, with a focus on the authority and trustworthiness of inbound links.
Every so often, Google releases a broad core update that affects all search results worldwide.
For example, a September 2019 broad core update aimed to boost sites with strong overall performance, and a January 2020 broad core update targeted YMYL (your money, your life) categories.
That’s the key distinction of broad core updates: you need to evaluate your site as a whole, not one particular page.
Most recently, Google’s May 2020 core update targeted thin-content landing pages while giving a boost to local search results.
BERT was the biggest Google algorithm update since RankBrain.
BERT stands for Bidirectional Encoder Representations from Transformers, a neural network-based technique for natural language processing.
Essentially, it helps Google grasp the context of search queries better.
For example, the word “bat” might refer to the nocturnal winged mammal popularly associated with Batman, or to a baseball player stepping up to bat.
With BERT, Google is able to analyze the context to give you better search results.
What makes BERT even more useful is that Google may now use the language surrounding your keywords to help its crawlers understand your content.
For example: “I went to the bat cave,” versus “After my at-bat, I went to the dugout.” Google can now build a context model from the other words in the sentence, a critical part of how natural language processing understands human conversation.
As Google’s Danny Sullivan said:
“When it comes to BERT, there’s nothing to optimize for, and nothing anybody should be rethinking. The fundamentals of our efforts to recognize and reward great content remain intact.”
Still interested in learning more about BERT? Dawn Anderson explains all you need to know about BERT in this comprehensive article.
If you’re like most people, you’ve seen featured snippets before without realizing what they were.
Featured snippets are short chunks of text, bullet points, numbers, or tables that appear at the top of Google’s search results.
They aim to answer the searcher’s question directly, without the need to click through to a website or browse another page of results.
Featured snippets can be quite volatile, however, so proceed with caution.
Featured snippets are nothing new; they were first spotted back in 2014.
The arrival of featured snippets reawakened the allure of the coveted “position zero”: your result is displayed above all other distractions in the SERPs while also appearing in the organic search results.
Google changed this in January 2020: a page that earns the featured snippet no longer also appears in the organic listings on the same results page. You show up in one spot, not both.
Google made another change in June 2020: featured snippets now take users directly to the passage of the page relevant to their search query.
When users arrive, they see the relevant content highlighted in yellow.
As voice search continues to evolve, featured snippets will become an ever more important opportunity for organic search visibility.
Since the 1990s, search engines and search engine optimization (SEO) have come a long way.
In this piece, we’ve touched on just a handful of the many milestones.
The history of SEO has been filled with exciting twists and turns – the birth of new search engines, the death of old search engines, the introduction of new SERP features, the introduction of new algorithms, and constant testing and updating – as well as the appearance of great SEO publications, conferences, tools, and professionals.
While search engines and SEO have developed significantly over the years, one thing has remained constant: SEO will continue to be important as long as there are search engines.
And we haven’t even begun to scratch the surface!