Chapter 9. Black Hat Tactics

I can point to many instances where I myself have made mistakes, and many more where others fell before me and I learned from their mistakes. In this chapter we will talk about a number of what are known as black hat tactics, along with other practices you should avoid:

· Flash

· JavaScript Menus

· Using Dynamic URLs

· Dynamic Code On Pages

· Bad XML Sitemaps

· Abnormal Keyword Placement

· SEO Spam

· Doorway Pages

· Meta Jacking

· Agent Delivery

· IP Delivery/Page Cloaking

· Link Farms

· Spamblogs

· Page Hijacking

· Link Bombing

· What to Do If You Have Been Banned

· Problem Pages and Work-Arounds

· Validating Your HTML

Flash

When used correctly, Flash can enhance a visitor’s experience, but it is not compatible with most mobile devices. The non-mobile side of your website shouldn’t be built entirely in Flash, nor should your site navigation be done only in Flash. Search engines have claimed for a couple of years now that they’re better at crawling Flash, but it’s still not a substitute for good, crawlable site menus and content.

Using Dynamic URLs

A “dynamic URL” is most simply defined as one that has a “?” in it, like

http://www.yourdomain.com/page.src?id=3456

That’s a very simple dynamic URL and today’s search engines have no trouble crawling something like that. But when dynamic URLs get longer and more complicated, search engines may be less likely to crawl them (for a variety of reasons, one of which is that studies show searchers prefer short URLs).

So, if your URLs look anything like this, you may have crawlability problems:

http://www.yourdomain.com/page.src?id=3456&xid=9765487&cid=333394445&vid=34521456&session=875694875

Google’s webmaster help page says it well: “…be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.”
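
If your platform allows it, a common work-around is to rewrite long dynamic URLs into short, static-looking ones at the server level. Here is a minimal sketch, assuming an Apache server with mod_rewrite enabled; the /products/ pattern and the page.src parameter are illustrations, not a drop-in rule:

# Map a short, static-looking URL onto the underlying dynamic page
RewriteEngine On
RewriteRule ^products/([0-9]+)$ /page.src?id=$1 [L]

With a rule like this, visitors and spiders both see http://www.yourdomain.com/products/3456, while the server quietly serves the dynamic page.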

Dynamic Code on Pages

Pages whose code is held in a database and whose output is generated dynamically, so that each visit can deliver unique output, can have issues being indexed by the search engines. Some pages also have what is termed “code bloat.”

Code bloat describes situations where the code required to render your page is dramatically more substantial than the actual content of the page. In many cases this is not something you’ll need to worry about; search engines have gotten better at dealing with pages that have heavy code and little content.

Robots.txt Blocking

First, you are not required to have a robots.txt file on your website; millions of websites are doing just fine without one. But if you use one (perhaps because you want to make sure your Admin or Members-only pages aren’t crawled), be careful not to completely block spiders from your entire website. It’s easy to do with just a simple line of code.

Under no circumstances should your robots.txt file have something like this:

User-agent: *
Disallow: /

That code will block all bots, crawlers, and spiders from accessing your website. If you ever have questions about using a robots.txt file, visit robotstxt.org.
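
If your goal is simply to keep private areas out of the index, scope the Disallow rules to those directories instead of the whole site. A minimal sketch (the /admin/ and /members/ paths are placeholders for your own private directories):

User-agent: *
Disallow: /admin/
Disallow: /members/

Everything not listed stays open to bots, crawlers, and spiders.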

Bad XML Sitemaps

An XML sitemap lets you give a list of URLs to search engines for possible crawling and indexing. It is not a replacement for correct on-site navigation, and it is not a cure-all for situations where your website is difficult to crawl.

If implemented properly, an XML sitemap can help search engines become aware of content on your site that they may have missed. But, if implemented incorrectly, the XML sitemap might actually deter bots, crawlers, and spiders from crawling.

If you’re curious, I’ve only once recommended that a client use XML sitemaps, and that was a website with upwards of 15 million pages. If you need a professionally created XML sitemap, you can have one created at www.smcreator.com.
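
For reference, a well-formed XML sitemap is nothing more than a list of <url> entries following the sitemaps.org protocol. A minimal sketch (the URL and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yourdomain.com/page.html</loc>
    <lastmod>2015-06-01</lastmod>
  </url>
</urlset>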

Abnormal Keyword Placement

The reason I am discussing this spam tactic first is that it is the most common occurrence. I read all the time on SEO blogs that a person got his website on the first page of Google by placing his keywords a hundred times at the bottom of his landing page in text that was the same color as the background of his website. The person is so excited that he discounts and ignores all the professionals who comment that he shouldn’t do this. Sometimes the blog creator or commenter who has his site on the first page of Google even badmouths the SEO professionals trying to help him.

I come back and read all the time how, just a few short months later, the same guy who was so excited that his website was on the first page of Google has written, “You guys were right. Google banned my URL for doing this.” It is easy to temporarily be the most relevant site on Google the wrong way, but Google and the other major search engines have ways of figuring it out quite quickly. And once you are banned, you had better call in the pros to help you. There is a section dedicated to this at the end of this chapter.
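
For the record, the hidden-text trick usually looks something like the following in the page source. It is shown here so you can recognize and remove it, not use it; the keyword and colors are placeholders:

<!-- White keyword text on a white background: invisible to visitors, but not to search engines -->
<div style="color: #ffffff; background-color: #ffffff;">
cheap widgets cheap widgets cheap widgets cheap widgets
</div>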

SEO Spam

SEO spam is the SEO version of email spam. Email spam pops up in your inbox where it’s least wanted and those who are sending it believe that the law of averages is on their side. They think if you send out enough messages, eventually someone will respond.

SEO spam uses the same principle, except SEO spam fills the search engine results pages with results that have little or no value to the searcher. This can get you banned quickly. Imagine if you went to Google and every time you did a search you got results for something totally different from what you searched for, or the top 10 spots of the search results were filled by the same company. If that happened you would switch to Bing in a heartbeat to get better results. Right?

Google and the other major search engines don’t want this to happen, so if you do something that a search engine sees as spamming, your search rankings will be penalized. It is now even more likely that you will be removed from the search results entirely. If Google bans your URL, it is as if your website has been removed from the internet. It is almost invisible.

Some SEO spam goes by the name black hat SEO. Black hat SEO refers to the use of aggressive SEO strategies, techniques, and tactics that focus only on search engines and not on a human audience. Some examples of black hat SEO techniques include keyword stuffing, link farms, invisible website text, and doorway pages, which we will learn about later in this chapter.

Black Hat SEO is more frequently used by those who are looking for a quick financial return rather than a long-term investment in their website.

NOTE: Black Hat SEO will most likely result in your URL being banned from major search engines. However, since the focus is usually on quick high return business models, most experts who use Black Hat SEO tactics consider being banned from search engines a somewhat irrelevant risk. Black Hat SEO may also be referred to as Unethical SEO or just spamdexing.

To make things a little more perplexing, search engines change their definitions of spam regularly. What works and is acceptable today may well be classified as spam tomorrow. This can have a profound effect on your rankings. One day you may be ranked high and on page one of a search, and the next you may find that you’re on page 8 of the results, all because of the links you maintain.

The easiest way to monitor search engine changes is to keep up with what’s happening in SEO on the Google Webmaster Central Blog (http://googlewebmastercentral.blogspot.com/).

The general rule is, if you’re doing something on your website that you have to worry might get you banned from major search engines, you probably shouldn’t be doing it. If you read anywhere that SEO spam techniques are okay, or that you won’t get caught because search engines don’t pay attention, ignore this advice at all costs. The penalties differ according to the search engine, but if you are caught spamming even once, most search engines will delist you from search results immediately.

I have seen it a hundred times and have had it happen to me by accident when I went overboard experimenting. Yes, even the experts make a mistake or two. But we learn from them.

Here’s what is definitely considered spam:

· Trying to make your site appear more relevant to the search engines by embedding hidden keywords in your website.

· Artificially generating links to your website from unrelated sites for the purpose of increasing your ranking based on link analysis.

· Artificially generating traffic to your website so that it appears more popular than it really is.

· Submitting your website repeatedly for inclusion in the rankings.

NOTE: You should submit your site once and then wait at least six weeks before submitting it again.

Creating Doorway Pages

Doorway pages are created to do well for particular phrases. They are also known as portal pages, jump pages, bridges, gateway pages, and entry pages, among other names. Search engines have developed ways to easily identify these pages, because they exist primarily to make a page seem more relevant for search engines, not for human beings.

You should always have your pages designed for human eyes and not just for a search engine. There are various ways to deliver doorway pages. The low-tech way is to create and submit a web page that is targeted toward a particular phrase or keyword.

These pages tend to be very generic. It's easy for people to copy them, make minor changes, and submit the revised page. Sometimes these are so similar that the search engines consider them duplicates and automatically exclude them from their listings.

Another problem is that users sometimes arrive at the doorway page. Say a real person searched for "welding supplies" and the doorway page appears. They click through, but that page probably lacks any detail about the welding supplies that you sell. To get them to that content, webmasters usually propel visitors forward with a prominent "Click Here" link or with an automatic page redirect.
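
The automatic redirect is usually a single tag in the head of the doorway page that forwards the visitor the instant it loads. A sketch of the pattern, with a placeholder destination URL:

<!-- A zero-second refresh: the visitor never really sees the doorway page -->
<meta http-equiv="refresh" content="0; url=http://www.yourdomain.com/welding-supplies.html">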

Some search engines no longer accept pages using any redirects (sometimes referred to as fast Meta refresh). This has led to some black hatters doing a bait-and-switch, or “code-swapping,” on the search engines. Some black hat webmasters submit a real web page, wait for it to be indexed and then swap it with a doorway page.

The trouble is that a search engine may revisit at any time and figure out what you have done.

Meta Jacking

Meta jacking is taking the Meta tags from one page and placing them on another page in the hope of obtaining good rankings and relevance. However, simply taking Meta tags from a page will not guarantee the copying page will do well. In fact, sometimes resubmitting the exact same page from another location does not gain the same position as the original page.

Agent Delivery

When you target a single doorway page to a single search engine, this is called “agent delivery.” Each search engine reports an "agent" name, just as each browser reports a name.

Agent delivery pages are tailored pages that direct users to the actual content you want them to see. Agent delivery also has the added benefit of "cloaking" your code from everyone except the search engine you are targeting. The major search engines have gotten wise to this, though; they change the agent names they report specifically to help keep people honest.

IP Delivery / Page Cloaking

To get around the agent name changing, you can instead deliver pages to the search engines based on IP address. If a bot, crawler, or spider visits your doorway page and reports an IP address that matches the search engine’s, or the IP resolves to a certain host name, it is shown the version of the page meant for the search engine.
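
To illustrate how this works (so you can recognize it, not copy it), here is a minimal sketch assuming an Apache server with mod_rewrite; the IP prefix below is a documentation placeholder, not a real crawler range:

# Serve a special version of the page only to a specific IP range
RewriteEngine On
RewriteCond %{REMOTE_ADDR} ^192\.0\.2\.
RewriteRule ^welding-supplies\.html$ /bot-version.html [L]

Every other visitor gets the normal page, and that deception is exactly what search engines penalize.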

Link Farms

Link farms are simply pages of links that are created just to artificially boost a linking strategy, in an effort to speed the website’s climb into the top search ranking positions. Typically you pay money to join them or buy software that lets you mass-submit your links.

Spamblogs

These are software- or machine-generated blogs that have only one purpose: increasing search engine rankings.

Page Hijacking

Page hijacking occurs when the coding of a very popular webpage is stolen and used to represent your website to the search engines. When users perform a search and see what looks like that popular page in the search results, they click through the link only to be taken to your actual page.

Link Bombing

One anchor text tactic to avoid is link bombing. Link bombing refers to the methods black hat SEOs use to artificially inflate a website’s ranking by pointing many links with the same, often unrelated, keyword anchor text at a specific website. For link bombing to work, more than one website designer must be willing to participate in a link exchange.

What to Do If You Have Been Banned

If you have been banned from the major search engines, you may find it one of the worst experiences of your life. The internet is now the official yellow pages for most people, and being banned from the major search engines is like locking your business doors and taking the phone off the hook.

Most of the time you will know what was done that got you banned. In this case, it may require explaining to the search engine the tactic you employed, why you employed it, and how you fixed it. Google, for instance, allows you to send in an explanation through a reconsideration request in Google Webmaster Tools, covered below.

Look Up Your Links and Disavow the Bad Ones

The Disavow Links tool in Google Webmaster Tools is there to save you when your website gets banned. There are other reasons you might get banned, but most of the time it is the links pointed at your website.

The Disavow Links tool should be submitted with a list of the linking URLs that are not related to your website, contain purchased links, or are simply of bad quality (a sketch of the file format appears below). After that, you need to send Google a Reconsideration Request.
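
The disavow file itself is a plain text file with one entry per line, uploaded through the tool. A minimal sketch (the domains and URL are placeholders):

# Lines starting with # are comments
# Disavow every link from an entire domain
domain:spammylinkfarm.com
# Disavow a single bad page
http://www.baddomain.com/paid-links.html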

Google Reconsideration Request

In Google Webmaster Tools, found at https://www.google.com/webmasters/tools, locate your verified website and click the “Site Reconsideration” link on the left-hand side. This will take you to a web page with written and video instructions for resubmitting your website.

It could take a couple of months to a year to be reindexed into the search engine. In some cases it is quicker and easier to ditch the old URL and start a brand new one. Then you have to work your way back to the top of the rankings again. Just make sure that you have fixed or stopped doing whatever was done to get your URL banned in the first place.

Outgoing Link Issues

Linking out is crucial for blogs and even static websites. Many webmasters stopped linking out in order to hoard PageRank and keep others in their industry from getting a better rank. Google engineers absolutely discourage this practice; so much so that Google now really wants to see two-way, industry-related links not only on the homepage but on the deep pages as well. Linking out can be risky, though. Here are some things to look out for.

Broken links

Too many broken links on a page raise a red flag in the Google algorithm, so you need to make sure your outgoing links are always good. This might not be a penalty in the strictest sense, but you can drop suddenly in rankings once more than one or two links on the same page are broken.

Website in a Bad Neighborhood

Bad neighborhoods are server IP addresses that host websites Google finds objectionable, such as porn sites, hate sites, some political sites, spammers, etc. Most of these links happen naturally as part of link decay: sites disappear, and domain grabbers buy them up to display ad-loaded “domain parking” pages. These pages can be bad as well.

Too Many Outbound Links Can Be Bad, As Well As None At All

This figure changes, but currently our testing shows Google is fine with up to 76 outbound links on a page and degrades the page’s ranking after that number. A website that has more outgoing links than content itself can lose its search visibility. This might not happen overnight like the typical penalty you’d expect, but it can amount to one in its effects. Also, pages without a single outbound link are now being termed “dead ends,” and Google has been placing a severe penalty on these pages.

Hidden Links

This may sound crazy. How do you hide a link? Well, in add-ons, in CSS coding, etc. I am adding this section because of a counter added to a WordPress website that contained links to other websites. The links were not only hidden but were what you would consider spammy links. In fact, we couldn’t figure out the source of the problem until an AccuQuality.com report we ran on the website revealed there were over 180 outbound links, which was the reason Google was penalizing the website.

Content

Google consistently stresses that “content is king,” but content can also mean trouble. If there is no king in your kingdom, or the king is dressed in rags, or you borrowed your king from somewhere else, you look bad to Google.

Here are some content issues you should look out for:

· Duplicate content – duplicate content on your own site or even elsewhere can result in a significant ranking drop. While Google does not consider this a penalty, most webmasters who experience the problem do.

· Low quality content – Google’s high-quality update, dubbed Panda, and its later updates focus on low quality content. Shallow, keyword-rich content on some pages can make your whole site drop in Google.

· Scraped content – Scraped content, text taken from other sites and displayed on yours, is a surefire way to get deranked very quickly.

· Illegible content – Content that is written in broken English or is misspelled can hurt you badly; so much so that AccuQuality.com now spell-checks your entire website in its report. As a general rule, your content needs to sound as good to the human visitor as it does to the search engine crawler that is visiting.

Ads

Google is not really a search engine but an advertising company, as almost all of the Google Corporation’s revenue stems from ads displayed in the search results themselves and on third-party sites. Nonetheless, the pressure on Google has grown over the years to tackle the problem of so-called MFA (Made for AdSense) sites that pollute the Google index. With “Panda,” the search giant finally did tackle this issue.

So this means there are some new rules to follow:

· Too many ads (low content-to-ads ratio) – Ever since Google “Panda” became the talk of the town, most pundits have pointed out that too high a number of ads, especially Google AdSense ads, may lead to a penalty.

· Affiliate sites with no value – Google has always explained that affiliates are not an issue, but only as long as they offer some good content and additional value beyond the actual affiliate offer. Be sure to add different content, products, or services, or you will face a penalty sooner or later.

Google Calling You Out

The issue of so-called “SEO outing” has been a hot one since 2011 and continues into 2016. Numerous high-profile websites have been outed, and along with them their SEO teams or companies, for using bad SEO tricks. Many SEO practitioners argue on moral grounds that outing is a despicable practice. They might be right, but as long as there is nothing to out, you fare best. So you’d better manage your reputation online and check from time to time what your SEO team is doing.

NYT and WSJ

High-profile old media outlets like the NYT (New York Times) and the WSJ (Wall Street Journal) like to scandalize SEO, so if you get a call from a journalist, you’d better not brag about your great SEO tactics. Google, in most cases, reacts to high-profile outings, a.k.a. bad press.

Third-party trust metrics like Blekko, WOT, McAfee SiteAdvisor

If you don’t show up in Blekko because you are banned there, or sites like WOT and SiteAdvisor list your site as deceptive or dangerous, this might mean you are heading toward a Google penalty. Google does not use these sites’ data, but it has other means to screen the Web for the same issues.

Making Google look stupid

You do not need an NYT article, an SEO blogger, or a Google employee to get penalized for a bad rep. Publicly showing off your black hat SEO successes makes you vulnerable to the “making Google look stupid” penalty. Leading SEO specialists agree that from a certain point on, Google can’t keep quiet about it and will penalize you in order to save face.

Technical issues

Not every sudden drop in rankings and traffic is a penalty; some are stupidity or gross negligence. You can shoot yourself in the foot by messing with some technical aspects of web development.

Robots.txt

A robots.txt file is not really needed to improve SEO, but it can break a lot of things. Just recently I accidentally blocked one of my own blogs from being indexed by Google. Of course I suspected a penalty at first, but then I checked Google Webmaster Tools and found out I had made the mistake myself.

Nofollow

I’ve seen leading blogs barred from the Google index because they activated the WordPress privacy mode. It simply meant that the entire blog was set to noindex, nofollow, which equals blocking it in the robots.txt.
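
That privacy mode boils down to a single tag in the head of every page. If you find this on pages you want indexed, remove it:

<!-- Tells crawlers not to index this page and not to follow its links -->
<meta name="robots" content="noindex, nofollow">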

Duplicate titles and descriptions

When your site uses the same or a very similar page title and description for every single page, it’s no wonder most of them won’t show up in search results. This isn’t a penalty either; it’s just logical.
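
Each page should instead carry its own title and meta description in its head. A minimal sketch (the wording is a placeholder):

<!-- Unique to this page; every other page gets its own versions -->
<title>Welding Supplies for Hobbyists | YourDomain</title>
<meta name="description" content="Our full range of welding supplies for hobby welders, with prices and shipping details.">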

Non-crawlable links in JavaScript

There are still JavaScript site menus out there that can’t be crawled by Google. Always check whether your menu uses real HTML links with <a href="..."> in them, or at least that the full URL shows up in the code.
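
A quick sketch of the difference (the URLs are placeholders):

<!-- Crawlable: a real HTML anchor with the URL in the href -->
<a href="http://www.yourdomain.com/products.html">Products</a>

<!-- Risky: the URL exists only inside a script handler, so a crawler may never see it -->
<span onclick="window.location='/products.html'">Products</span>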

Neither a Penalty Nor Your Fault

In some cases a loss of rankings or search traffic has nothing to do with you or your site. Something else changed instead, and that’s why you get outranked all of a sudden.

Algorithm changes

Google changes and refines its algorithm all the time. Major changes are called updates, and sometimes mean dramatic shifts in search results. Just search for “Google Panda”. The only thing you can do then is to find out what changed and why your site does not match the new ranking factors.

Competition got better

A common “problem” is also that your competition does more SEO work than you do and one day they outrank you. A ranking change from #1 to #2 on Google can mean a traffic loss of 60 to 80%.

Current events

Sometimes breaking news may push your site down. Google News results get displayed on top, and for less competitive phrases news media start to rank in regular results as well. Most of these ranking changes will vanish after a few days.