4 Common Search Engine Optimization Practices That Do Not Work

Don't Be Part Of The Website Spam Problem.

Understanding Where Future Customers Come From is Easy With Proper SEO and Website Analytics.

For those just now dabbling in search engine optimization (SEO), it's easy to get confused. There's plenty of information—and misinformation—floating around about how search algorithms operate, and it doesn't take long before your SEO efforts drift toward strategies that are dated or ineffective. At best, you get an SEO strategy that is relatively inefficient. At worst, you can end up having pages on your site, or even your entire site, blacklisted by major search engines.

To help you sift through the near-infinite advice a search for SEO turns up, we're going to share four of the most common SEO practices that, for various reasons, do not work—despite what you may have read or heard, or what others continue to say. Keep these cautions in mind when writing website content or blog articles for your interior design firm, architectural website, design-build blog or landscape architect marketing initiatives.

Four Common Search Engine Optimization Practices To Avoid


1. Search Engine Submission

Webmasters once used search engine submission forms to help optimize their sites, tagging sites and pages with various keywords. These submissions were crawled by search engines as part of their indexing process. Not surprisingly, submissions were soon full of spam: they focused on keywords that boosted rankings, not words that accurately reflected the site's content. Search engines eventually dropped submission in favor of purely crawl-based indexing.

Submission has played a minor role since 2001. Few search engines offer or advise completing any submission forms. Take note: Any SEO consultant who lauds the benefits of submission or offers search engine submission services is either grossly inept or intentionally trying to waste your marketing dollars. Either way, you should start seeking advice from someone else.


2. Meta Keyword Tags

The meta keywords tag has met a fate similar to search engine submission's. Once a critical element of SEO, it held the self-chosen keywords for which you wanted your site to rank. Just like search engine submission, it proved fertile ground for spammers, and it has had little SEO impact since.

Other meta elements remain important. Your title tag, which is sometimes lumped together with other meta tags, is still an important factor in your SEO. Further, the meta description tag, while not considered in search engine algorithms, remains an important way to win the click for your site on a search engine results page (SERP). Your meta description appears below the linked blue title text and serves as a 160-character pitch for why any user should click on your link.

Additionally, the meta robots tag still plays a role in managing crawler access to your site.
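As an illustration, here's a minimal sketch in Python, using the standard library's html.parser, that pulls the title, meta description, and meta robots values from a page and checks the description against the roughly 160-character snippet budget. The page markup and business name are invented for the example.

```python
from html.parser import HTMLParser

class MetaAudit(HTMLParser):
    """Collects the <title> text and named <meta> tag values from a page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and "name" in attrs:
            self.meta[attrs["name"].lower()] = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

# Hypothetical page markup for illustration.
page = """<html><head>
<title>Oklahoma City Interior Design | Example Studio</title>
<meta name="description" content="Award-winning interior design for Oklahoma City homes and offices.">
<meta name="robots" content="index, follow">
</head><body></body></html>"""

audit = MetaAudit()
audit.feed(page)
description = audit.meta.get("description", "")
print(audit.title)
print(len(description) <= 160)  # within the rough 160-character snippet budget
print(audit.meta.get("robots"))
```

A quick audit like this is an easy way to confirm that every page has a unique title, a description short enough to display in full, and the robots directive you intended.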


3. Keyword Stuffing

At its worst, keyword stuffing is nothing more than a nonsensical combination of popular search terms. (Almost everyone has come across a keyword-stuffed site whose pages contain little more than strings like “Real estate Oklahoma best homes real estate housing Oklahoma.”) In its more subtle form, it is the slight overuse or awkward insertion of keywords under the belief that just one or two more appearances will push the keyword density to some ideal value.

There are various theories about the ideal keyword density, usually expressed as a ratio of keyword occurrences to total words. The problem is that, even if some magical number is programmed into Bing's or Yahoo!'s algorithm, the value of hitting it—at the expense of developing the best, most readable content you can create—is small compared to the boost in rankings you would receive from one high-quality editorial link. It's a perfect example of the fuzzy border between SEO and simply creating the best possible content.
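If you want a rough sense of your own copy's ratio, a quick sketch in Python can count it. The sample copy below is invented, and no particular density target is implied; the point is the measurement, not a number to chase.

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Percentage of the text's words accounted for by occurrences of `phrase`."""
    words = re.findall(r"[a-z']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    # Count every position where the phrase appears as consecutive words.
    hits = sum(words[i:i + n] == phrase_words for i in range(len(words) - n + 1))
    return 100.0 * hits * n / len(words) if words else 0.0

# Invented sample copy, deliberately a bit stuffed.
copy = ("Our Oklahoma real estate team helps buyers find homes. "
        "We list real estate across Oklahoma and guide you at every step.")
print(round(keyword_density(copy, "real estate"), 1))  # prints 19.0
```

A result like 19% would read as stuffed to both visitors and crawlers; well-written copy rarely needs the check at all.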


4. Paid Search

How much of an impact do you think paid search (pay-per-click advertising campaigns, etc.) has on your site's SEO? There's only one correct answer: none.

All the main search engines put forth considerable effort to separate their organic search results from paid search. While many doubt that search engines treat organic results and paid-search clients impartially, the engines have strong motivation not to let the lines blur. After all, while paid advertisements are their primary means of making money, the amount they can charge for those ads depends largely on their search market share. And providing crummy results to search users—by falsely elevating companies with deep pay-per-click pockets—is the quickest way to diminish that share. Just as the best advice for sites is to develop great content, the best advice for search engines is to provide excellent results.

There is only one proven connection between pay-per-click advertising and improved performance in organic results. For companies that already rank high organically for a given keyword, bidding on that same keyword in pay-per-click campaigns can help increase click-through rates on the organic result. Obviously, this has nothing to do with search engine favoritism and everything to do with the traditional marketing value of repetition, brand recognition, and consumer trust.

Why These Four Practices Do Not Work

While search engine algorithms always contain some exploitable weakness, however small, they are making considerable progress. With each update, they eliminate one more loophole and send spammers searching for another path. New, complex signals such as TrustRank and historical site data make it harder and harder to build a new spammy site capable of fooling the algorithms.

In particular, Google's Panda update, which has rolled out in several iterations over the last few years, has targeted not just obvious spam but other thin content—the type publishers generate when their only goal is to avoid being categorized as spam. Because search engines rely on market share and the accuracy and utility of their results, they will always defer to the results that most benefit the end user.

Even successful deployment of a black hat SEO technique is, at best, a short-term benefit with long-term liabilities.

More About Search Spam and Your Website

While we're at it, it's also worth spending a little time learning why black hat SEO practices exist and how search engines have tried to ferret them out over the years. Here's an overview of search spam, how search engines find it, and how to help your site recover from any search-engine-levied penalties.

Why Search Spam Will Always Be Used By The Unscrupulous

What would lead someone to intentionally deploy bad SEO practices? Most search-engine updates respond to existing ways webmasters are gaming the algorithm. It's the endless battle between search engines and nefarious webmasters and developers, and it shows no signs of diminishing; if anything, it has intensified since the 1990s.

Here's the ongoing motivation: Just a single day at the top of an SERP for “Buy Viagra” could net some $20,000 in affiliate revenue. Owning that top spot—no matter how you get there—can be an incredibly valuable piece of real estate. If, by some stroke of genius, you could retain that top spot for “Buy Viagra” during an entire calendar year, you would gross more than $7 million. Needless to say, a payday like that keeps plenty of unscrupulous people interested.

Black Hat SEO Techniques At The Page Level and Site Level

Search engines continue to become more efficient at identifying spammy Web practices. Presently, they identify spam on two different levels: the page (URL) level and the site (domain) level. Understanding how search engines identify spam is important in the development of your site. You can't protect your content until you know how it may become compromised, even unintentionally.


Page-Level Spam

At the page level, search engines look to uncover keyword stuffing, manipulative linking, cloaking, and low-value pages. Keyword stuffing, as noted above, is one of the easiest practices to spot, especially in its most egregious forms. It's nothing more than the repetition of high-value search terms without regard to how they add content value to your site. Even in milder forms, studies have demonstrated that keyword stuffing has minimal impact on your ultimate location within the SERPs, and certainly not enough to justify compromising the quality of your content.

Additionally, keyword stuffing is one of the easiest ways for search engines to flag your page for black hat SEO techniques—it's just as obvious to search-engine crawlers as it is to your customers.

In contrast, one of the more sophisticated black hat SEO practices is manipulative linking. Manipulative linking exploits search-engine valuation of backlink structures. (Backlinks are links on other sites that point to your site.) Manipulative linking can be achieved in several ways, with the shared goal of creating a more robust backlink structure for your site, albeit an artificial one.

With reciprocal link exchanges, sites point to and from one another as part of a mutual agreement to boost their backlink structure. While this may make sense if the two companies have shared interests, its black hat version might link a plumber and a podiatrist—clearly, visitors to either site have no good reason to see links to the other. Reciprocal link exchanges are the easiest manipulative link tactic for search engines to uncover.

Paid link structures also exist. Sites with high standing charge other companies for the right to place a link on their highly rated pages. While not illegal, paid link practices are much maligned by search engines, which continue to work to uncover and devalue such sites. Like paid link sites, low-quality directories offer pay-for-placement to sites looking to boost their backlink structure. When it finds them, Google often decreases or eliminates the PageRank (i.e., backlink) benefits these low-quality directories pass along. Some persist, but it's only a matter of time before the sites, and those relying on them, are taken down.

Link farms and link networks create entire sites designed solely to serve as backlinks for other sites. Content on these websites is thin and spammy, intended only to serve the needs of other sites' backlink structures. Search engines can uncover many of these link farms by examining patterns in backlink structures and site registrations. Your best investment remains in the development of long-term, high-quality backlinks.

Cloaking refers to a deliberate mismatch between what search engines see and what regular visitors see on your site. In its black hat form, cloaking uses HTML to hide text from visitors that is nonetheless crawled and indexed by search engines, allowing your page to rank for words that are not representative of your site's contents. Not all hidden text is penalized: some hidden text, intended to improve user experience, is bypassed by crawlers.
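To make the hidden-text idea concrete, here's a simplified sketch in Python, using the standard library's html.parser, that flags text inside elements hidden with an inline display:none style. Real crawlers examine stylesheets and rendering far more thoroughly, and the markup below is invented.

```python
from html.parser import HTMLParser

class HiddenTextFinder(HTMLParser):
    """Collects text nested inside elements hidden with inline display:none."""
    def __init__(self):
        super().__init__()
        self.hidden_depth = 0  # how many hidden ancestors we are currently inside
        self.hidden_text = []

    def handle_starttag(self, tag, attrs):
        style = dict(attrs).get("style") or ""
        # Once inside a hidden element, count every nested open tag too.
        if self.hidden_depth or "display:none" in style.replace(" ", ""):
            self.hidden_depth += 1

    def handle_endtag(self, tag):
        if self.hidden_depth:
            self.hidden_depth -= 1

    def handle_data(self, data):
        if self.hidden_depth and data.strip():
            self.hidden_text.append(data.strip())

# Invented markup: visible copy plus a stuffed, hidden block.
page = ('<p>Welcome to our studio.</p>'
        '<div style="display: none">cheap flights best deals</div>')
finder = HiddenTextFinder()
finder.feed(page)
print(finder.hidden_text)  # prints ['cheap flights best deals']
```

If a scan like this turns up keyword-laden text your visitors never see, assume a crawler can find it just as easily.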

Uncovering low-value pages was one of the primary goals of Google's Panda update. Low-value pages, rising just above spam levels, carry content that has no intent of passing along valuable information to readers; it exists instead to attract interest from robotic crawlers. These “thin” pages often contain duplicated or dynamically generated content. The sophistication of modern search-engine algorithms has given them the ability to spot many instances of such low-value content. One way they achieve this is by measuring how long it takes before a user clicks the “back” button in a browser. If it happens almost instantly, search engines perceive it as a bounce, a sure sign that the content did not meet user expectations and was unworthy of a high ranking.


Site-Level Spam

Not all crawler activities focus exclusively on the page level. Search engines are also capable of identifying domain-wide practices that make sites targets for search engine penalties. The three primary ways search engines uncover spammy site-level practices relate to linking, content value, and trust.

Linking practices, when viewed at the domain level, are an aggregation of page-level practices. Search engines keep track of the links and backlinks connected to a site, not just a single page. When patterns emerge site-wide, search engines may choose to penalize an entire domain, not just a single page. Overstock.com and JC Penney are two examples of businesses that, after exploiting loopholes in search-engine algorithms, were vigorously penalized for their transgressions. (Overstock offered special deals to universities in exchange for schools placing links to Overstock on their sites; the high-value university backlinks, in turn, artificially inflated Overstock's SERP rankings for hundreds of everyday products.)

As with linking practices, search engines can take into account the content value of your entire site. A site full of unoriginal, uninformative content can rank low on SERPs even if traditional on- and off-page SEO techniques are well deployed. Search engines designed this domain-level scan in part to prevent carbon copies of sites like Wikipedia, which, since its content is freely licensed, could be duplicated relentlessly without legal repercussions.

For better or for worse, search engines—especially Google—have decided to place added value on large brands. From their perspective, big brands mean added trust, which search engines translate into higher ranking on SERPs. For small businesses or new sites, this can feel more like a penalty.

Google pioneered brand preference, arguing that consumers do the same thing. It means that any black hat SEO practice you employ, intentionally or not, will have a far greater and more negative effect if you're a small company than if you're a multinational corporation. As a result, duplicate or thin content is far more likely to be overlooked on an important corporate site than on your personal blog. However, if you're able to earn good editorial links from major news sites or universities, you, too, can enjoy the benefits of being a big brand name, even if your brand is known mostly through your backlink structure.

How To Check Your Site For Bad SEO Practices

While “black hat” SEO tactics can poison your site's ranking, it's also important to remember that you may have dropped in the rankings simply because other sites improved. Not every drop reflects a search engine penalty. That said, there are ways to check your site's performance to better diagnose any fall in the rankings.

Often, your site may have dropped because a Web development error kept a crawler from accessing it. Google provides webmaster tools to help you uncover any such mistakes. Changes to your site may also have altered how crawlers perceive it. After all, search engine crawlers are computer programs, not humans, and they're prone to seeing a seemingly slight change as dramatic or even catastrophic. This includes both on-page content and internal link structure, among other aspects of your site.
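One common accessibility mistake is a robots.txt rule that blocks more than its author intended. Python's standard-library urllib.robotparser can check which URLs a given rule set allows; the rules and URLs below are hypothetical.

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that blocks an entire section of the site.
robots_txt = """\
User-agent: *
Disallow: /portfolio/
"""

rules = RobotFileParser()
rules.parse(robots_txt.splitlines())

# Crawlers obeying these rules will skip every portfolio page.
print(rules.can_fetch("*", "https://example.com/portfolio/kitchen-remodel"))  # False
print(rules.can_fetch("*", "https://example.com/blog/seo-tips"))              # True
```

A check like this, run against your live robots.txt, quickly confirms whether the pages you care about are even reachable by crawlers.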

Because search engines rely on backlinks as a primary way to gauge the authority of your site, your ranking may also fall if the sites that host your backlinks lose authority. There's not much you can do about it, but checking other sites that may share a similar backlink profile is a good way to figure out if that may be the cause of any fall in the rankings.

Another key cause of a fall in the rankings is duplicate content. This is a particularly important issue for large commerce websites that have hundreds or thousands of pages, many with similar descriptions of the company or item being sold.
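Search engines' actual duplicate-content detection is proprietary, but for a rough self-audit you can compare page descriptions with Python's standard-library difflib. The product descriptions and the 0.8 threshold below are invented for illustration.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Character-level match ratio between two strings, from 0.0 to 1.0."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Invented product-page descriptions.
desc_a = "Handcrafted oak dining table, seats six, free delivery in Tulsa."
desc_b = "Handcrafted oak coffee table, seats none, free delivery in Tulsa."
desc_c = "Modern landscape design consultations for Oklahoma City homeowners."

print(similarity(desc_a, desc_b) > 0.8)  # near-duplicate boilerplate copy
print(similarity(desc_a, desc_c) > 0.8)  # genuinely distinct pages
```

Pages whose descriptions score near 1.0 against each other are good candidates for rewriting with copy unique to each product.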

How To Recover From Search Engine Penalties

Maybe you're just now realizing you've made a few mistakes. These could be the result of outdated or flat-out bad advice, or perhaps you thought you'd push the boundaries of white hat SEO and ended up in black hat territory. Regardless, it's not the end of the world. There are ways to recover, though not every attempt is successful, and you may never get a full explanation of what happened. Here are some tips to give you the best shot at recovering from a search engine penalty.

For one, you can register your site with webmaster tools through Google or Bing to create added trust and a more formal relationship between your site and the search engine. This is also a good time to remind you that, while you may think you've been penalized, there's a good chance that the drop in rankings resulted from coding errors or other accessibility problems. Check your webmaster tools again to make sure that's not the case.

Webmaster tools are also how you can send a re-inclusion or re-consideration request to the search engine. While you can submit a similar request through a public form, using the forms within webmaster tools is more likely to generate a positive response from the search engines.

When you do get in touch with the search engine, be prepared to give an entirely open and honest account of your SEO practices, white hat or black hat. This is partly because they'll value your honesty, which makes them more likely to re-include you, but also because they have a vested interest in learning how you were able to game the system; the information you provide will inform future algorithm adjustments.

Typically, search engines deal with a considerable backlog of re-inclusion requests—they often receive hundreds or even thousands each week—so you may end up waiting for weeks or months to hear back about your re-consideration request. If you're fortunate enough to work for a large brand-name company, you may be able to get in touch with individual engineers directly at major conferences like SMX, SES, and PubCon. Many search engine engineers are present at these conferences, and a quick conversation may be well worth the price of admission.

Remember that search engines are under no obligation to include your site; they're perfectly within their rights to reject it for any reason. It's just one more reason to avoid black hat SEO techniques and stick with what works: great content and quality design. Starting with a website platform that is optimized for SEO out of the box, like Squarespace, gives your online marketing a good foundation to build upon.

About Michael Conway and Means-of-Production

My firm builds Squarespace websites, Houzz profiles, and content marketing and advertising solutions for architects, interior designers, design-build contractors and landscape design firms. Our all-in-one tactics attract the right clients with exceptional architectural photography and brand messaging that sets you apart from the competition. Contact me for a free-of-charge consultation and marketing review. It takes about 40 minutes and you'll be provided a list of actionable improvements designed to solve your specific marketing problems.
