Are Your Outdated SEO Tactics Costing You Traffic?
Historical data shows that 89% of websites that ranked in the top 30 positions on Google seven years ago no longer rank in the top 30 today. The big question is: why?
While doing some technical website audits recently, it struck me that many of the websites I review are seeing declining search engine traffic because of previous search engine optimization (SEO) tactics.
What is unusual about these websites is that the SEO recommendations that were implemented were not black hat or spammy. They were part of the standard SEO playbook at the time.
But a series of Google algorithm updates has not only negated these recommendations but is now actively hurting the sites that followed them.
Panda: Assigned lower rankings to “low-quality” websites. In other words, sites that provide a bad user experience because of poorly written content or bad navigation moved to the bottom of search results.
Penguin: Demoted “over-optimized” websites that didn’t deserve high rankings. In an effort to boost SEO, companies would buy multiple website domains and then create websites with a blog or two that linked back to the main website. Google released the Penguin update to stop this practice and ensure good user experiences.
Hummingbird: Adapted Google’s algorithm to handle smartphone searches and voice recognition applications. Together, these major updates now comprise the new playbook for optimizing websites. If you want to rank well on Google queries, make sure you follow the new rules and drop outdated practices.
What are the outdated tactics that could be getting you in trouble today? Here are the three most common SEO tactics that I see hurting websites now:
1. Thin Content
An often-used SEO tactic was to focus on one keyword per page. So, if you sold widgets, for example, you might have the following pages:
● Widgets Main Page
● Blue Widgets
● Red Widgets
● Yellow Widgets
● Black Widgets
Often the only difference between the pages for Blue Widgets and Red Widgets was that one page contained the phrase “Blue Widgets” and the other contained the phrase “Red Widgets.”
Starting with the Panda update, Google now emphasizes themes instead of keywords. Instead of having all those sub-widget pages, what Google is looking for is one main widgets page. Google’s algorithms are smart enough to understand that if you have a widgets page talking about the availability of various colors, then “blue widgets” and “red widgets” are key phrases related to that page.
I see pages today that rank for, and drive traffic from, hundreds of keywords, many of which don’t actually appear on the page but are related to its content.
For example, according to SEMRush, this page on the L’Oreal website ranks for 4,400 different keywords.
As the L’Oreal page shows, it’s not necessary to stuff a page with a multitude of keywords or maintain a multitude of pages for variations of the same product. One thematically appropriate page can draw the right traffic from Google.
Thin or duplicate pages are one of the biggest reasons I see for declining search engine traffic. If you are retaining any of these types of pages on your site, now is the time to update and show Google that your site is worth a high ranking.
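If you want a quick, first-pass way to spot candidates for consolidation, a rough sketch is to flag pages with very little body copy and pairs of pages whose wording barely differs. The word-count threshold and the Jaccard word-overlap measure below are illustrative assumptions, not anything Google has published:

```python
# Sketch: flag thin or near-duplicate pages using a word-count floor
# and Jaccard similarity of word sets. Thresholds are illustrative only.
import re

def words(text):
    """Lowercase word tokens pulled from a page's body text."""
    return re.findall(r"[a-z']+", text.lower())

def is_thin(text, min_words=250):
    """Flag pages with very little body copy (threshold is arbitrary)."""
    return len(words(text)) < min_words

def similarity(text_a, text_b):
    """Jaccard similarity of two pages' word sets (0.0 to 1.0)."""
    a, b = set(words(text_a)), set(words(text_b))
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

# Two "color variant" pages that differ only by the color word
blue = "Our blue widgets are durable widgets available in blue."
red = "Our red widgets are durable widgets available in red."
print(is_thin(blue))                      # tiny page, so True
print(round(similarity(blue, red), 2))    # high overlap suggests a merge
```

Pages scoring above some high similarity cutoff (say 0.8) are candidates to merge into one themed page; where you set that cutoff is a judgment call.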
2. Page Speed
For years, website owners paid little attention to website speed. Flash, slideshows, video, and parallax scrolling were all added to websites to make them more closely resemble magazines and TV shows.
But in 2014/15 Google made it clear that bloated, slow-loading websites create a bad user experience, and page speed became a larger component of Google’s ranking algorithm, especially on mobile.
A slow-loading page increases your bounce rate and lowers your organic rankings, resulting in less search traffic. How fast do your pages load?
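Tools like Google's PageSpeed Insights give the fullest picture, but a minimal sketch of a load-time check is to simply time a full page download. The grading thresholds below are my own illustrative assumptions, not Google's published cutoffs:

```python
# Sketch: time how long a page takes to download, then grade it.
# Grading thresholds are illustrative assumptions, not Google's.
import time
import urllib.request

def fetch_seconds(url, timeout=10):
    """Return the seconds taken to download a URL's full response body."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()
    return time.monotonic() - start

def grade_speed(seconds):
    """Rough grade for a measured load time (thresholds are arbitrary)."""
    if seconds < 1.0:
        return "fast"
    if seconds < 3.0:
        return "needs work"
    return "slow"

if __name__ == "__main__":
    try:
        elapsed = fetch_seconds("https://example.com")
        print(f"{elapsed:.2f}s -> {grade_speed(elapsed)}")
    except OSError as exc:
        print(f"fetch failed: {exc}")
```

Note this only measures raw download time from one location; real user experience also depends on rendering, scripts, and network conditions, which full tools account for.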
3. Broken Links and Error Pages
A 404, or “not found,” error occurs when a user (or search engine) clicks a link and lands on a page that no longer exists.
From a search engine’s perspective, 404 errors are wasted crawl resources. More importantly, I have seen websites experience a significant decrease in search engine traffic after a jump in 404 errors.
And that makes sense. If you are Google and you keep crawling a website and finding pages that no longer exist, why would you put that site high up on the search results page?
The longer a website has been up, the more likely it is to have broken links. Over time, employees come and go, products change, and website redesigns alter the URL structure. Have you experienced any of these changes over the last couple of years? If so, what happened to those webpages?
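Finding broken links doesn't require a paid crawler; as a starting-point sketch using only Python's standard library, you can pull the links out of a page's HTML and request each one to see which return 404. This is a bare-bones illustration, not a full audit tool:

```python
# Sketch: extract the links from a page's HTML, then request each one
# and report which return HTTP 404. Stdlib only; minimal error handling.
import urllib.error
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkParser(HTMLParser):
    """Collect href values from <a> tags, resolved against a base URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

def extract_links(html, base_url):
    """Return all resolved link URLs found in the given HTML."""
    parser = LinkParser(base_url)
    parser.feed(html)
    return parser.links

def find_404s(links, timeout=10):
    """Return the subset of links that respond with HTTP 404."""
    broken = []
    for url in links:
        try:
            urllib.request.urlopen(url, timeout=timeout)
        except urllib.error.HTTPError as err:
            if err.code == 404:
                broken.append(url)
        except OSError:
            pass  # DNS failures/timeouts are a different problem than 404s
    return broken
```

To use it, fetch your homepage's HTML, run `extract_links` on it, then pass the result to `find_404s`; a real crawl would also follow internal links recursively and respect robots.txt.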
Even on the best-optimized website, users end up on pages that no longer exist. The key to a good user experience, then, is a useful 404 page.
Four ways to create a useful 404 page are:
● Tell visitors clearly that the page they’re looking for can’t be found
● Use language that is friendly and inviting
● Make sure your 404 page uses the same look and feel (including navigation) as the rest of your site
● Consider adding links to your most popular articles or posts, as well as a link to your site’s homepage, as Moz does on its 404 page
Whether or not your site has been impacted in a noticeable way by these Google updates, every site has things to clean up and to optimize in a modern way.
The sooner you understand why Google is sending you less traffic than it did last year, the sooner you can clean it up and focus on proactive SEO that starts to impact your rankings in a positive way.
Don’t let your site get bogged down by old-school SEO tactics. Contact us today and see if your site meets Google’s best practices!