“Eighty-nine percent of sites that ranked seven years ago are not ranking now.”
After stating that earlier this year at the Chicago SEJ Summit, Tober elaborated in an interview with Jim Boykin, saying that after he aggregated all domains
“that used to have at least one ranking in the past on keywords with at least a search volume of 10,” he checked to see how many of the domains still had at least one ranking.
The conclusion? In the top 30 results, Tober found that “Only 11% [of these domains] are left!”
From this historical data, we can see that 89% of websites that ranked in Google’s top 30 positions seven years ago no longer rank for anything in the top 30 today.
The big question is why?
Recently, while conducting a number of technical website audits, it struck me that many of the sites I review are seeing declining search engine traffic because of previous SEO tactics.
What is unusual about these websites is that the SEO recommendations that were implemented were not black hat or spammy; they were part of the standard SEO playbook at the time.
But a series of Google algorithm updates has not only negated these recommendations but is now actively hurting these websites.
As Hobo Marketing stated concerning the 2015 Google updates:
“Google algorithm changes in 2015 seem to focus on reducing the effectiveness of old-school [SEO] techniques. […] If your pages were designed to get the most out of Google, with commonly known and now outdated [SEO] techniques chances are Google has identified this and is throttling your rankings in some way. Google will continue to throttle rankings until you clean your pages up. […] If Google thinks your links are manipulative, they want them cleaned up, too.”
What are the outdated tactics that could be getting you in trouble today? Here are the three most common SEO tactics that I see hurting websites now:
1. Thin Content
An often-used SEO tactic was to focus on one keyword per page. So, if you sold widgets, for example, you might have the following pages:
- Widgets Main Page
- Blue Widgets
- Red Widgets
- Yellow Widgets
- Black Widgets
Often the only difference between the pages for Blue Widgets and Red Widgets was that one page contained the phrase “Blue Widgets” and the other contained the phrase “Red Widgets.”
The Google updates in 2015 emphasized themes instead of keywords. Instead of having all those sub-widget pages, what Google is looking for is one main widgets page. Google’s algorithms are smart enough to understand that if you have a widgets page talking about the availability of various colors, then “blue widgets” and “red widgets” are key phrases that are related to that page.
I see pages today that rank and drive traffic for hundreds of keywords, many of which do not actually appear on the page but are related to its content.
For example, according to SEMrush, this page on the L’Oreal website ranks for 4,400 different keywords.
As the L’Oreal page shows, it’s not necessary to stuff a page with a multitude of keywords or to build a multitude of pages for variations of the same product. One thematically appropriate page can draw the right traffic from Google.
Thin or duplicate pages are one of the biggest reasons I see for declining search engine traffic. If you are retaining any of these types of pages on your site, now is the time to update and show Google that your site is worth a high ranking.
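If you want a rough, programmatic way to flag thin or near-duplicate pages before consolidating them, you can compare page text pairwise. This is a minimal sketch using only Python’s standard library; the page URLs and text are hypothetical examples, and a real audit would compare the actual body copy of your crawled pages:

```python
from difflib import SequenceMatcher

def similarity(text_a: str, text_b: str) -> float:
    """Return a 0..1 similarity ratio between two page texts."""
    return SequenceMatcher(None, text_a, text_b).ratio()

def find_near_duplicates(pages: dict[str, str], threshold: float = 0.9):
    """Yield pairs of page URLs whose body text is suspiciously similar."""
    urls = sorted(pages)
    for i, a in enumerate(urls):
        for b in urls[i + 1:]:
            if similarity(pages[a], pages[b]) >= threshold:
                yield a, b

# Hypothetical example: two color pages that differ only by the color name.
pages = {
    "/widgets/blue": "Buy Blue Widgets. Our widgets ship worldwide in 2 days.",
    "/widgets/red": "Buy Red Widgets. Our widgets ship worldwide in 2 days.",
    "/about": "We are a family-run widget company founded in 1990.",
}
duplicates = list(find_near_duplicates(pages))
```

Pairs the script flags are candidates for consolidation into one themed page (with 301 redirects from the retired URLs), exactly the kind of cleanup described above.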
2. Page Speed
For years, website owners never paid much attention to website speed. Flash, slideshows, video, and parallax scrolling were all added to websites to make them more closely resemble magazines and TV shows.
But in 2014/15, Google made it clear that bloated, slow-loading websites create a bad user experience, and page speed became a larger component in Google’s ranking algorithm, especially on mobile.
A slow-loading page increases your bounce rate and lowers your organic rankings, resulting in less search traffic. How fast does your page load?
Check the speed of your website here.
Does your website load in under 2 seconds?
3. Broken Links and Error Pages
A 404, or “not found,” error occurs when a user (or search engine) clicks a link and lands on a page that no longer exists.
From a search engine perspective, 404 errors are wasted resources. But more importantly, I have seen websites, like the one below, experience a significant decrease in search engine traffic after a jump in 404 errors.
And that makes sense. If you are Google and you keep crawling a website and finding pages that no longer exist, why would you put that site high up on the search results page?
The longer a website has been up, the more likely it is to have broken links. That is because, over time, employees come and go, products change, and website redesigns alter the URL structure. Have you experienced any of these over the last couple of years? If the answer is yes, what happened to those webpages?
Even on the best optimized website, users end up on pages that no longer exist. The key to a good user experience then is a useful 404 page.
Four ways to create a useful 404 page are:
- Tell visitors clearly that the page they’re looking for can’t be found
- Use language that is friendly and inviting
- Make sure your 404 page uses the same look and feel (including navigation) as the rest of your site
- Consider adding links to your most popular articles or posts, as well as a link to your site’s homepage, as Moz does on its 404 page
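You can also hunt down broken internal links before Google finds them. This is a minimal, assumption-laden sketch using only the Python standard library; the HTML fragment is hypothetical, the parsing is simplified, and a real crawl would also need to resolve relative URLs, follow redirects, and rate-limit itself:

```python
from html.parser import HTMLParser
import urllib.error
import urllib.request

class LinkExtractor(HTMLParser):
    """Collect href values from anchor tags in an HTML document."""

    def __init__(self):
        super().__init__()
        self.links: list[str] = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html: str) -> list[str]:
    """Return every href found in the given HTML."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

def is_broken(url: str) -> bool:
    """Return True if fetching the (absolute) URL yields a 404."""
    try:
        with urllib.request.urlopen(url, timeout=10):
            return False
    except urllib.error.HTTPError as err:
        return err.code == 404

# Hypothetical page fragment with one live link and one stale link:
html = '<p><a href="/widgets">Widgets</a> <a href="/old-page">Old</a></p>'
links = extract_links(html)
```

Each link reported broken is a candidate for a fix or a 301 redirect, and anything left over is exactly the traffic your useful 404 page should catch.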
Whether or not your site has been impacted in a noticeable way by these Google updates, every site has things to clean up and to optimize in a modern way.
The sooner you understand why Google is sending you less traffic than it did last year, the sooner you can clean it up and focus on proactive SEO that starts to impact your rankings in a positive way.
Don’t let your site get bogged down by Google updates. Contact us today and see if your site meets Google’s best practices!