This article provides a guide to checking your website for potential over-optimization: tactics that may be causing more harm than good in the search engines.
With the frequent deployments of Google's Panda and Penguin algorithm updates, many SEOs and website owners are scrambling to keep up with the search engine's mood swings and avoid penalties for tactics that used to be legitimate. Many of the algorithm updates seem to center on over-optimization, or "over-SEOing," of websites, to prevent sites with poor content but lots of search engine optimization from outranking sites that have higher quality, deeper content but may not have the same SEO resources. But how do you know if your website is over-optimized and a potential target for Google's frequent algorithm updates? Where is the line drawn between making your website search-friendly and over-optimizing it to the point of attracting penalties?
Too Good to be True
The short answer is that if you are focusing on SEO instead of producing the highest possible quality content for your site visitors, and if you are chasing only traditional, tunnel-vision SEO metrics like ranking #1 for a single short-tail keyword, there is a good chance you are over-optimizing your website. One basic piece of advice I give my clients is that if a tactic is "too easy" (like buying thousands of backlinks, or stuffing the exact keyword on the page 25 times within 150 words of content), then it not only fails to accomplish the goal at hand, it likely does more harm than good. Such tactics may have worked 5 to 10 years ago, but Google and other search engines are extremely sophisticated, and getting smarter every day. It's naive to think these outdated, junky tactics still work.
Is Your Website Over-Optimized?
To make website over-optimization easier to spot, I have compiled a list of criteria to start your investigation if you suspect penalties following algorithm updates from the last two years or so. (May I recommend the Panguin Tool from Barracuda Digital, which connects to your Google Analytics and correlates the algorithm update dates with your drops in traffic? It can be very helpful for determining which updates you're up against.) The Google algorithms remain riddles shrouded in mystery, and the search engines are hardly forthcoming with the details. (Although, as of August 9, Google now helps identify the cause of manual penalties in Google Webmaster Tools - an extremely helpful new feature that can finally guide marketers to improving their websites in a more targeted way!) Aside from the fact that the algorithm is proprietary information that keeps Google the top search engine in the world, revealing its specifics would make it simple to game, thus defeating the purpose of having an allegedly objective ranking algorithm in the first place. The list of over-optimization criteria below is not a hard-and-fast, proven checklist you can follow to magically fix your website's algorithm problems overnight. These are simply the areas I check first for potential over-optimization when assessing websites for opportunities to improve and, if penalties seem to have occurred, to regain standing in the search engines. Do you have something you check for that I haven't listed here? Add it in the comments!
- Does your content suck? This goes beyond whether you are subjectively considered a decent writer, though that could also come into play. There are a number of reasons your content may suck in the eyes of both visitors and search engines. Scraped. Aggregated. Short and thin. Unoriginal. Duplicate. If you're using any of these techniques to hammer out weak content as quickly as possible, you could be over-optimizing your website. If your focus is on search engine optimization before producing killer content that wows readers and makes them want to share it, you are at risk of over-optimizing poor content that won't serve you in the end. What a waste of time and resources! Ask yourself, which is better: spending 10 minutes on each of 25 pieces of quick, shallow content that do nothing for you, or spending 250 minutes on 1 incredible piece of content that answers all of the user's questions, has breadth and depth on a topic, can't be found elsewhere with a quick search, is highly linkable and shareable, and positions you as an expert on the topic? You know the answer.
- Are you keyword stuffing? Do you have short body content that's packed to the rafters with the page's target key terms? "Keyword density" used to be something SEOs aimed for: mentioning your exact keyword X number of times for every 100 words of body content in order to rank for that keyword. If the content reads unnaturally, or is outright difficult to read for all the keyword cramming going on, you're doing it wrong. When it comes to the ideal length of your content, I will defer to a saying one of my journalism professors told me: "Like a woman's skirt, the article should be long enough to cover everything but short enough to keep it interesting." There is no set ratio of keyword mentions to content length that will yield the perfect level of optimization for any given page; a ton of other variables (subject matter, breadth and depth of the topic, quality of content, competitiveness of the key term, availability of peripheral key terms, etc.) will all come into play. Well-optimized content just isn't as simple as keyword density.
- Is your internal linking structure unnatural and excessive? Interlinking between relevant pages is an important and useful practice to help guide your readers through the content on your website to pages that might interest them. However, many people make a few common interlinking mistakes: excessively linking to a single page with identical, exact match anchor text every time, linking to the same page multiple times on one page, or only linking to the home page with exact match anchor text instead of to deeper internal pages that may be a better fit for the user. If the anchor text of your links is nearly identical every time, it's obviously a ploy to persuade the search engine to rank the page for that key term. Be more deliberate about your interlinking, and vary the anchor text so that it's relevant to the target page but not deceptive or attempting to manipulate search engines.
- Are you doing garbage old school black hat stuff like keyword stuffing, cloaking, hidden text or doorway pages? Doorway pages and cloaking involve serving one page to search engines and an entirely separate page, image or Flash video to users. Cloaking can also mean inserting the keywords only when search engine bots request the page, but not for visitors. Hidden text and links within the content are intended to manipulate the search engines (for example, using the same color for text and background, putting text behind images, or using CSS to position the hidden text off-screen so it's not visible to users). These tactics used to work surprisingly well, and sometimes they may still work briefly before search engines catch on. But search engines continue to evolve and adapt to defend the integrity of their search results against these manipulative techniques, and they're sophisticated enough to spot old school black hat tricks now. Given that these algorithm updates are rolling out continuously, the benefits do not outweigh the risks or the ethical questionability. Not worth it.
- Do you have tons and tons of ads, possibly even more than you have valuable content on the page? If you have numerous blocks of AdSense, banner ads and other paid advertising on a page that overshadow the legitimate, unique content the page is supposed to provide for searchers, you may be at risk. This is over-optimization to generate advertising revenue. If visiting your website is like getting blasted in the face with a fire hose of ads to get to the tiny piece of content visitors clicked through to begin with, you are robbing visitors of a quality experience with your site and brand, and when they bounce immediately and navigate elsewhere, that will be taken into account. (It could also be argued that you are diminishing the value of clicks away from your site because people often click things they didn't know were ads and quickly bounce when they realize it, thus lowering your value as a publisher displaying ads, and in turn your revenue-per-click.) Your website shouldn't look like the Las Vegas Strip or Times Square on acid if your primary purpose is to deliver content to readers.
- Does your linkbuilding strategy suck? This can be a very broad topic, so allow me to break it down a little further. Are you using your own web properties to create a chain of links to one another? Are you buying tons of garbage links from spammy linkbuilding services, or participating in press release or article spinning to have cheap, weak, questionably written articles with exact match backlinks published on poor quality scraper sites? Are you only targeting one type of website or page when requesting links so all your backlinks come from a small pool of nearly identical sites? Are you only targeting the home page of your site instead of deeper, internal, more relevant pages? Any of these (and more I haven't listed here) done with significant volume or extremely concentrated timing can contribute to off-page website over-optimization. Use your brain and your marketers' intuition. If it looks unnatural, it probably is unnatural.
- Do your backlinks come almost entirely from easy-to-manipulate places like open forums, blog comments or Wikipedia citations? Remember, search engines look at backlinks essentially as "votes" for your pages' quality and value. If the links are too easy to place yourself and you're placing them en masse, it looks like you are stuffing the ballot box with votes for yourself. Used strategically and in moderation, these techniques are fine ways to be present in relevant spaces on the web and market yourself. But by no means should they be paid for or automated, with spikes of thousands of links at once. Besides, the value these links pass is minimal, if any at all, so they should never be the core of your linkbuilding strategy. Work on earning backlinks the hard way, and it will pay off in the end.
- Do all your backlinks have the exact same anchor text? Anchor text helps search engines associate keywords with the destination page, and thus can help that page rank for related key terms. Exact keyword usage every single time used to be the target when building links, but it has become clear that a large quantity of these exact match anchor text links is an unnatural, easily manipulated linking pattern. Nowadays it is more acceptable to have variations of target key terms, phrases like "Click Here," and even just the hyperlinked URL. You know good and well that this is the more likely natural linking pattern. (Virante has a pretty neat tool called the Anchor Text Over Optimization Report that analyzes the number of root domains linking to any page on your site and the proportion of your backlinks carrying identical anchor text. It's not perfect, but it can give you a quick glance at potential threats.)
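To make the keyword-stuffing check from earlier in this list concrete, here is a minimal, illustrative Python sketch (my own example, not a tool from this article) that measures how often an exact phrase appears relative to total word count. Treat the output as a rough smell test, not a number to optimize toward.

```python
import re

def keyword_density(text, keyword):
    """Return (phrase_count, total_words, density_percent) for an exact
    keyword phrase. A crude heuristic only -- search engines weigh far
    more signals than raw density."""
    words = re.findall(r"[a-z']+", text.lower())
    total = len(words)
    kw_words = keyword.lower().split()
    n = len(kw_words)
    # Slide a window over the word list, counting exact phrase matches.
    count = sum(1 for i in range(total - n + 1) if words[i:i + n] == kw_words)
    density = 100.0 * count * n / total if total else 0.0
    return count, total, round(density, 1)

sample = ("Buy blue widgets here. Our blue widgets are the best blue "
          "widgets because blue widgets beat all other blue widgets.")
print(keyword_density(sample, "blue widgets"))  # (5, 20, 50.0)
```

Content like the sample above, where half of all words belong to the target phrase, is exactly the "packed to the rafters" pattern to avoid.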
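The internal-linking mistakes described above can also be spot-checked programmatically. This hedged sketch (standard library only; the class, function names, and threshold of 3 are my own choices, not an established tool) collects anchor texts from a page's HTML and flags any phrase that repeats suspiciously often.

```python
from collections import Counter
from html.parser import HTMLParser

class AnchorCollector(HTMLParser):
    """Collects (href, anchor text) pairs from <a> tags."""
    def __init__(self):
        super().__init__()
        self._href = None
        self._text = []
        self.anchors = []  # list of (href, anchor_text)

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href", "")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.anchors.append((self._href, "".join(self._text).strip()))
            self._href = None

def repeated_anchor_texts(html, threshold=3):
    """Return anchor texts used `threshold` or more times -- a possible
    sign of over-optimized internal linking."""
    parser = AnchorCollector()
    parser.feed(html)
    counts = Counter(text.lower() for _, text in parser.anchors if text)
    return {text: n for text, n in counts.items() if n >= threshold}

page = """
<p><a href="/widgets">best blue widgets</a> and
<a href="/about">about us</a> and
<a href="/widgets">best blue widgets</a> plus
<a href="/widgets">best blue widgets</a></p>
"""
print(repeated_anchor_texts(page))  # {'best blue widgets': 3}
```

A varied, user-focused page would return an empty dictionary here; a wall of exact match links to one URL would not.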
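Some of the old school hidden-text tricks mentioned above leave fingerprints in inline styles. The following naive sketch (my own heuristic, and prone to false positives, since styles like display:none have many legitimate uses) scans inline style attributes for common hiding patterns as a quick first pass; a real audit would need a CSS- and DOM-aware crawler.

```python
import re

# Patterns that often accompany hidden-text tricks. Each can also appear
# in perfectly legitimate markup, so treat hits as leads, not verdicts.
SUSPICIOUS = [
    r"display\s*:\s*none",
    r"visibility\s*:\s*hidden",
    r"text-indent\s*:\s*-\d{3,}px",  # text pushed far off-screen
    r"font-size\s*:\s*0",
]

def flag_hidden_text(html):
    """Return the suspicious style patterns found in inline style attributes."""
    hits = []
    for style in re.findall(r'style\s*=\s*"([^"]*)"', html, re.I):
        for pattern in SUSPICIOUS:
            if re.search(pattern, style, re.I):
                hits.append(pattern)
    return hits

snippet = '<div style="text-indent:-9999px">blue widgets blue widgets</div>'
print(flag_hidden_text(snippet))
```

If a scan like this lights up across your templates, ask whether the hidden content exists for users or only for crawlers; only the latter is the problem.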
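As a rough companion to the anchor text discussion above, this small sketch (my own illustration, not the Virante tool) computes what share of a backlink list is carried by its single most common anchor text. A very high share across many referring domains suggests the unnatural pattern described above.

```python
from collections import Counter

def anchor_text_profile(backlinks):
    """Given (source_domain, anchor_text) pairs, return the most common
    anchor text and the percentage of all links that use it.

    Illustrative only; there is no single 'healthy' threshold."""
    counts = Counter(anchor.lower() for _, anchor in backlinks)
    total = sum(counts.values())
    top_anchor, top_count = counts.most_common(1)[0]
    return top_anchor, round(100.0 * top_count / total, 1)

# Hypothetical backlink export: four exact match links, two natural ones.
links = [
    ("blog-a.example", "cheap blue widgets"),
    ("blog-b.example", "cheap blue widgets"),
    ("news-c.example", "cheap blue widgets"),
    ("forum-d.example", "cheap blue widgets"),
    ("site-e.example", "click here"),
    ("site-f.example", "https://widgets.example"),
]
print(anchor_text_profile(links))  # ('cheap blue widgets', 66.7)
```

A natural profile usually shows the brand name, bare URLs, and generic phrases sharing the top spots, rather than one commercial key term dominating.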
You may or may not receive link warnings from Google Webmaster Tools if you are employing some of the strategies above. It's up to you to monitor your inbound links and maintain a legitimate linkbuilding strategy; it is not the search engines' job to alert you before you're held accountable.
The Bottom Line
Truthfully, "over-optimization" is a bit of a misnomer. Once you cross that threshold between optimal and being "over-optimized," you've really entered the realm of black hat, spam and other poor techniques that can hardly be called "optimization" since they stray far beyond optimal. We shouldn't be surprised that these too-good-to-be-true techniques no longer work to get web pages ranking. These techniques make up a narrow, short-term view that rarely pays off in the long run.
Website owners and SEOs must realize that search engine engineers are not stupid; they are enormously capable and sophisticated, and the algorithms are increasingly adept at detecting shoddy work. (Come on, Google is one of the largest multinational corporations in the world. Believe me, its engineers have the resources and intelligence to identify and devalue websites whose owners think they're being very clever employing these weak techniques. And they can afford to adapt.)
At the end of the day, the one key question you should ask yourself is, "Am I trying to appeal to inanimate bots and crawlers before real human customers?" If the answer is yes, I have one core recommendation: Stop it. You know what pleases the bots and crawlers? Pleasing people. You must make an immediate commitment to doing the right thing by your audience, and follow through on that commitment. Shift the way you think from "What's going to work right this second to get this page ranking number one for this keyword?" to the more stable and reliable, "What's the right thing to do for my customers?"