Technical SEO - 500 SEO Tips: Essential Strategies To Bulldoze Through Google's Rankings, Increase Traffic and Go Viral (2015)

500 SEO Tips: Essential Strategies To Bulldoze Through Google's Rankings, Increase Traffic and Go Viral

Chapter 6: Technical SEO

Google need to be able to read your site and index it without getting stuck. If your website is not optimized for crawlers (the programs Google uses to scan your website), you will find that Google cannot index your content. If your website provides a poor user experience, Google may not rank you as highly as you hope.

All your SEO can go down the drain should your technical aspects let you down.

This chapter will help you get a handle on the technicalities you need to get indexed and ranking well. You'll also learn how to make sure that Google can read, index and rank your site with ease.

196. Make Sure You Join Google Webmaster Tools

To spot technical errors and get a constant status update on how Google crawled your site - and any issues it found - sign up your website to Google Webmaster Tools. It gives you practically everything you need to make sure your website is working well technically and that Google is crawling and indexing your website correctly.

197. Join Bing Webmaster Tools

Most webmasters focus on Google alone and don't put as much effort into other search engines. However, depending on your demographics (an older audience, for example), you may still need to cater for Bing. Bing also handles around 2.5% of the world's daily searches, so it's still worth optimizing for, and could bring you hundreds of readers a day should you do it right.

198. Understand The Crawling and Indexing Process

When your website is live, Googlebots will head over to your website's pages. They will gather information such as the keywords used, the length of content and linking patterns. They basically inspect each and every page. If all checks out, they will add your pages to Google's index. Soon after, your website will be ranked for its targeted keywords.

199. Get Indexed Before Anything Else

If you're starting a website, you'll find that it can take up to three months for the Googlebot to arrive on the scene and crawl your site. Make sure that your website is indexed by following the next few tips.

200. Sign Up For A Google Webmasters Account – Don't wait months for the Googlebots to find your site.

There is a shortcut to getting indexed quickly, in the space of about a week: join Google Webmaster Tools. This is where you can get crawled by the Googlebots and find out if your pages have been indexed. There is a repository of other tools too, which eases the workload of maintaining a healthy, Google-friendly website.

Google Webmasters contains everything you need to make sure your website is fully functioning technically and fulfils all the requirements for crawling and indexing. You'll get full reports on your website including indexing reports, search traffic, keywords sending traffic, linking patterns, updates and more. Google may even send you messages about unusual linking patterns or sharp upticks in traffic. (www.google.com/webmasters)

201. Submit URL to Google

Google allow webmasters to submit their URL for inclusion on Google's index. Once submitted, Googlebots will come to your site and crawl it. If all goes well (working technically, good content and no malicious spam), your site will be indexed. (https://www.google.com/webmasters/tools/submit-url)

202. Check For Crawl Hazards

These are hazards that block Google’s spiderbots from crawling and indexing your webpage. Should you have any of these attributes on your pages, they may keep crawlers away and this leads to lower rankings in the long term. Google have said that poor navigability in a website signals no authority or expertise in the area. A poor website probably means a poor user experience.

Avoid the following pitfalls by making sure they aren’t included on your website.

203. Flash Can't Be Read By The Googlebot

Google can’t read any Flash based webpages. They have become less and less common but are still found. Never use Flash for any part of your webpages because Google cannot read, understand or crawl Flash code. If you built a page entirely on Flash, you might as well have built a white, empty page for all Google cares. To you, it looks like a beautifully designed animated page but to Google, it’s a confusing mess that the spiderbots will just label “empty”.

204. Password/Form Protected Pages Don't Get Indexed

If a page can only be accessed by filling out a form or entering a password, spiderbots cannot access it. For the majority of pages this is an advantage, since you don't want password protected pages ranking on Google anyway. However, if there are any protected pages you do want indexed, remove the protection or link to them from an unprotected page.

205. Javascript Links Are Dodgy

If you've embedded any links on your page in Javascript, Google may not crawl or take any notice of these links. They may not even pass PageRank to other pages. Use plain HTML links instead.

206. See How The Googlebots View Your Page

Think your website looks awesome? Wait until you see what the Googlebots actually see. They don't take CSS or Javascript into consideration when crawling your site, since all they see is just the bare HTML elements.

Head over to www.seo-browser.com and input one of your website's URLs. You can see if all your desired elements appear (such as text and images) or if search engines don't see anything at all. It's a real detector that will show you what Google truly sees when it crawls your site. You just might be surprised.

207. Remove Ugly URLs – There's nothing worse than an ugly URL to ruin a Googlebot's day. Crawlers cannot read these URLs or understand what they mean. Keywords found in a URL signal to Google what you are trying to rank for and that your content is relevant to those keywords. Take a look at the following examples of an ugly URL and a friendly URL:

Ugly URL: www.website.com/56shvj3

Friendly URL: www.website.com/technical/googlebot-basics

From a crawler’s point of view, the ugly URL looks extremely dodgy and very irrelevant. They cannot understand what the URL means, what keywords it is targeting and what it is about.

The friendly URL on the other hand, is understandable and easy to follow. You’d want to read this, wouldn’t you?

Check your pages for any ugly URLs and keep them as clean and clutter-free as possible.
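If your site runs on Apache, a mod_rewrite rule can serve a friendly URL from the real, parameter-driven page behind it. The paths and parameter below are hypothetical, borrowed from the example above – adjust them to match your own site:

```apache
# .htaccess – a sketch: show friendly /technical/googlebot-basics
# while the real content still lives at a script with an ugly id
RewriteEngine On
RewriteRule ^technical/googlebot-basics/?$ /article.php?id=56shvj3 [L]
```

With a rule like this in place, visitors and crawlers only ever see the friendly address, while your server quietly fetches the original page.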

208. Merge Multiple URLs Into One

Take a look at the URLs below. They all refer to the homepage of a website. However, notice that every URL is different. Whilst they all point to essentially the same page, search engines view them as five separate pages.

1. http://mydomain.com/

2. http://www.mydomain.com/

3. http://www.mydomain.com/default.asp

4. http://mydomain.com/default.asp

5. http://Mydomain.com/Default.asp

This means that Google will view all the pages as duplicates, resulting in a rather unfair slap from Google Panda.

Firstly, check if these duplicate URLs actually exist. Not all websites have this problem, so don't get caught up in a fictional obstacle. Type them into your browser and see if your homepage loads up. If more than one of them loads, you need to set your default homepage! Simply follow the steps below.

You can solve this mishap by defining a “canonical” link tag in your HTML code. This tag basically tells Google what your default/desired homepage URL is and the URL that you want to load when a searcher clicks on it.

This should clear up any confusion that search engines would have with the URLs. Below is the code you need to add to make the fix. Place it inside the <head> section of your page.

<link rel="canonical" href="http://example.com" />
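Alongside the canonical tag, you can also 301-redirect the variants to your preferred version at the server level. On Apache, a sketch like this (assuming www is your preferred host – swap the domain for your own) does the job:

```apache
# .htaccess – send every non-www request to the www version
RewriteEngine On
RewriteCond %{HTTP_HOST} ^mydomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.mydomain.com/$1 [R=301,L]
```

A 301 is a permanent redirect, so search engines consolidate the duplicate URLs into the one you chose.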

209. Watch Out For "Printer Friendly" Pages, As They're Duplicates

You don't want duplicate pages on your site. Watch out for any printer-friendly pages you have, because if you update the normal page, you'll have to update the printer friendly version too. Also, Google only indexes one page of the same content, so if your printer friendly page is considered more relevant than the normal page, your visitors will be sent to your no-CSS, no-Javascript, completely plain printer page. Not good.

Either get rid of printer friendly pages altogether or get them de-indexed by following the tip below.

210. How To Keep Printer Friendly Pages By Getting Them De-indexed

If you want to keep your printer friendly pages, you can get them de-indexed by adding the following code inside the <head> section of the page.

<meta name="robots" content="noindex">

Once the Googlebots arrive on your site, they'll read the code, drop the page and move on to the next page.

Add the above code to any pages you don't want indexed by Google.

211. Create A Robots.txt File

If you want to block Google from crawling and indexing certain pages (login or members-only pages, for example), you need to create a robots.txt file. This file tells the Googlebots not to crawl those pages. You should also use your robots.txt file to block malicious spambots from scanning your website for private data. The easiest way to create a robots.txt file without coding is to head over to this link: www.yellowpipe.com/yis/tools/robots.txt

It's the best tool for the job since it also blocks over 130 known spambots from your website. Spambots are well-known to scrape email addresses, phone numbers and other private information from websites, so prevention is the key to making sure they don't do damage.
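If you'd rather write the file by hand, a minimal robots.txt is just a plain text file placed at your site's root. The paths below are examples – swap in your own private directories:

```
User-agent: *
Disallow: /login/
Disallow: /members/

Sitemap: http://www.mydomain.com/sitemap.xml
```

The `User-agent: *` line means the rules apply to every crawler, and the Sitemap line points bots straight at your sitemap.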

212. Check Your Robots.txt File

Getting your robots.txt file messed up (even one tiny mistake) can prevent crawlers from accessing your site entirely. This tool checks if your robots.txt is functioning correctly and can be read by crawlers (www.frobee.com/robots-txt-check). It's always worth a check, right?

213. Create An XML Sitemap

Sitemaps give crawlers more accessibility, and are one darn useful resource to have.

XML is one difficult language to code in, let me tell you! That’s why there are tons of websites that will generate sitemaps for your website. You can use the sitemap creator in Google Webmaster Tools.

Or, head over to www.xml-sitemaps.com to generate your sitemap in minutes.
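For reference, a sitemap is just an XML file listing your URLs, one <url> entry per page. The addresses and dates below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.mydomain.com/</loc>
    <lastmod>2015-06-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.mydomain.com/technical/googlebot-basics</loc>
  </url>
</urlset>
```

Only <loc> is required for each entry; <lastmod> and <changefreq> are optional hints to the crawlers.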

214. Check Your Sitemap

Again, just like the robots.txt file, you don't want your sitemap getting botched up. It would be fatal.

Make sure your sitemap is working well and is fully readable. You don’t want the Googlebots to get bogged down in your sitemap. Use the sitemap checker in Webmaster Tools or the one here: www.sitemapinspector.com/

Simple Ways To Get Your Website Loading Faster

215. You Need A Fast (Or Fairly Fast) Website, Fact

Fast is the new cool. Pages that load at lightning speed have a HUGE advantage over snail-speed pages. Even with fast Internet, you may still come across a page that takes minutes to load (or may never load at all). I have come across websites with about five HD film-length videos, tons of CSS style sheets, JavaScript animations, a complex website design and about thirty hi-res, super HD images.

According to Aberdeen Group, a 1 second delay in your page load time results in 11% fewer page views, a 7% loss in conversion and a 16% decrease in customer satisfaction.

For Amazon, page load time is an essential part of their business. For every 100 milliseconds improvement in page load time, they report a 1% increase in revenue.

Sure, more multimedia can make a page richer, but going overboard often means that many people with medium or slow internet speeds will probably never see your website at all. Google take PageSpeed very seriously. If your website is lagging behind, it could face extreme consequences such as lower rankings, and it will be a real challenge to get to page one or two, even if you do everything else that's required.

216. Use Google PageSpeed Insights For More Personalized Tips

Luckily, there are a few tips that you can employ to become like lightning. Google find PageSpeed so important that they developed a tool for the job.

Head over to https://developers.google.com/speed/pagespeed/insights

and input either your website's home page link or the piece of content you want ranking. You will get a score from 0 to 100, with 0 being snail speed and 100 being thunderbolt fast. Striking a balance is key; try aiming for 60-80. You can achieve this by following the personalized tips (tailored to your page) that the tool gives you.

217. Get Even More Speed Tips With Pingdom

Here's yet another site that you can effectively use to get more tips for speeding up your site. Whilst I can list out the most common issues that slow down websites, there are always a few issues that aren't so obvious. You might get some great ideas on speeding up your site from this tool. (http://tools.pingdom.com/fpt/)

218. Web Page Test

You might as well get even more personalized tips for improving your site speed, right? Here is the third and final website which will check your website's page load time and offer suggestions. It's fast, easy to use and I love it. (www.webpagetest.org)

219. Supercharge Your Site's Speed and Security With CloudFlare

This is probably one of the greatest developments in internet history. CloudFlare makes your site load twice as fast, blocks attackers and uses a Content Delivery Network to allow anyone across the world to access your website in a quicker time. The best part? It's free.

If you want to speed up your website even more, but don't want to pay for extremely expensive hosting, you should give CloudFlare a try. It takes five minutes to set up (literally) and doesn't require any technical expertise at all. I've tried it before and I have found a huge benefit from it. (www.cloudflare.com)

220. Upgrade Your Hosting

If you're paying five to ten dollars for your hosting, don't expect to break any speed records. If your website is getting over ten thousand visitors a month, it's time to start considering more expensive hosting options. Even an upgrade to twenty dollars a month is enough to get a faster page load time. If you've implemented all the PageSpeed strategies mentioned above and your website is still taking over 7 seconds to load, you seriously need to think about upgrading your hosting.

221. Get Rid of HTML "Junk" To Speed Up Load Time

Check your HTML files and make sure that you get rid of comments, white spaces and anything that doesn't directly impact on your HTML code. This will speed up your page load time a little, and since you don't really need comments and white spaces, you can afford to take them out. You can make a saving of up to 28% on your page load time by "minifying" your CSS and HTML files.
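To get a feel for what "minifying" actually does, here is a deliberately simplified Python sketch that strips comments and collapses whitespace in an HTML string. A real minifier (like the ones built into web build tools) handles many more cases – <pre> blocks, inline scripts, conditional comments – which this ignores:

```python
import re

def minify_html(html: str) -> str:
    """Strip comments and collapse whitespace in an HTML string.

    A simplified sketch: real minifiers also handle <pre> blocks,
    inline JS/CSS and conditional comments, which this ignores.
    """
    # Remove HTML comments entirely
    html = re.sub(r"<!--.*?-->", "", html, flags=re.DOTALL)
    # Collapse any run of whitespace (spaces, tabs, newlines) to one space
    html = re.sub(r"\s+", " ", html)
    # Drop the remaining spaces between adjacent tags
    html = re.sub(r">\s+<", "><", html)
    return html.strip()

page = """
<html>
  <!-- navigation starts here -->
  <body>
    <h1>Hello</h1>
  </body>
</html>
"""
print(minify_html(page))  # → <html><body><h1>Hello</h1></body></html>
```

Run your real pages through a proper minification tool rather than this sketch, but the principle – same markup, far fewer bytes – is identical.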

222. Check For Broken Links

Pages come and go, but links are forever.

That's why you need to check for broken links (links that point to pages which no longer exist) on your website. They're a real pain in the backside, from both a UX and technical point of view.

Luckily, you can easily find broken links with Xenu's Link Sleuth (http://bit.ly/1cU7Fmx), for Windows. It needs to be downloaded to your computer and set up properly.

If you're using a Mac, try Integrity instead (www.peacockmedia.co.uk/integrity). Similarly to Xenu's Link Sleuth, it needs to be downloaded to your computer.
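If you'd rather see how these link checkers work under the hood, here is a minimal Python sketch using only the standard library: it pulls every link out of a page's HTML, then fetches each one and reports the HTTP status code (404 meaning broken). It's a toy compared to Xenu or Integrity – no crawling, no retries – but the idea is the same:

```python
from html.parser import HTMLParser
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

class LinkExtractor(HTMLParser):
    """Collect every href found in <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html: str) -> list:
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

def check_link(url: str, timeout: int = 10) -> int:
    """Return the HTTP status code for a URL, or 0 if the request fails."""
    try:
        req = Request(url, method="HEAD")
        with urlopen(req, timeout=timeout) as resp:
            return resp.status
    except HTTPError as err:
        return err.code      # e.g. 404 for a broken link
    except (URLError, ValueError):
        return 0             # DNS failure, refused connection, bad URL

# Usage sketch: fetch a page's HTML yourself, then
# for url in extract_links(html): print(url, check_link(url))
```

A dedicated tool is still the better choice for whole-site audits, since it follows links page to page and handles relative URLs for you.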

223. Compress Your CSS Files

Once you have taken unnecessary comments out of your HTML and CSS files, it's time to "compress" the files. This takes things to the extreme: it removes all the line breaks and indentation, compressing everything into one huge block of code. Yes, it may be hard to read, but it can speed up your site hugely. Compress your CSS files at www.csscompressor.com. Keep a copy of your old CSS file just in case you need to make changes in the future. You can always do your edits on the old, uncompressed file and compress it again later.

Whatever you do, make sure to always have a backup copy!

224. Put Javascript Files At The End of Your Code

Loading scripts last allows the browser to render the basic HTML elements first and download the Javascript files afterward. This makes sure the browser doesn't get stuck loading a Javascript file, risking the user leaving the page completely. (Stylesheets are a different story: keep your CSS links in the <head>, or visitors will see a flash of unstyled content.)
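A common page layout keeps the stylesheet link in the <head> (so the page never flashes unstyled) and pushes scripts to just before the closing </body> tag. The file names here are placeholders:

```html
<!DOCTYPE html>
<html>
<head>
  <title>My Page</title>
  <!-- CSS stays in the head: the browser needs it to paint styled content -->
  <link rel="stylesheet" href="styles.css">
</head>
<body>
  <h1>Content loads first</h1>
  <p>Visible text and images are parsed before any script downloads.</p>

  <!-- Scripts go last so they can't block the content above -->
  <script src="app.js"></script>
</body>
</html>
```

This way, even on a slow connection the visitor sees your content while the Javascript is still downloading.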

225. Compress Your Images

Compression reduces the size of an image without degrading its quality to an unacceptable level. Compression algorithms scan the image's data for recurring patterns and store each repeated pattern only once. Make sure that all your images are compressed. This will greatly improve your site's page load time.
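The "recurring patterns" idea is the same one general-purpose compressors use. This little Python demo (zlib standing in for the image-specific algorithms inside JPEG and PNG) shows repetitive data shrinking dramatically while random data barely compresses at all:

```python
import os
import zlib

# Flat, repetitive data – like a large single-colour area in an image
repetitive = b"\x00\xff" * 5000          # 10,000 bytes
# Noisy data – like photographic grain, with no patterns to find
noisy = os.urandom(10000)                # 10,000 bytes

compressed_repetitive = zlib.compress(repetitive)
compressed_noisy = zlib.compress(noisy)

print(len(compressed_repetitive))  # a few dozen bytes
print(len(compressed_noisy))       # roughly 10,000 – nothing to squeeze
```

This is why simple graphics and logos compress so well, while busy photographs gain less – and why lossy formats like JPEG throw away fine detail to create more patterns to exploit.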

226. Use Image Compression Tools

There are a handful of image compression tools which I recommend, falling into two types: web-based tools and local tools. Web-based image compression tools offer less customization than local tools, but are quicker and easier to use. Local image compression tools give you more functionality and allow you to customize other settings, such as output format and quality.

Here is a list of the best local and web-based image compression tools:

Web-based Image Compression Tools

· Yahoo! Smush.It – www.smushit.com/ysmush.it

· TinyPNG – www.tinypng.org

Local Image Compression Tools (Need To Be Downloaded)

· ImageOptim – www.imageoptim.com – For Mac only – JPEG, PNG and GIF

· Caesium – www.caesium.sourceforge.net – For Windows only – PNGs, BMP and JPG

· JStrip – www.davidcrowell.com/jstrip - For Windows only - JPEG

227. Use External Javascript Files

Javascript is a lot of code to digest. Downloading all that Javascript code on every page can really slow things down. Use external Javascript files, just like your CSS files, and link them to the HTML file. That way, if the user triggers the need for Javascript, the file will kick into action. It's a lot better than downloading the code every time, when users mightn't even need it.

228. Flash Should Get The Boot

Sorry, folks, but if your website uses Flash for animation, it's being slowed down massively. That, and the Googlebot can't read the code properly either. Take out your Flash completely and get rid of it, or else use it in very tiny amounts when absolutely necessary.

229. Don't Use More Than One Analytics Program

Using five analytics programs? Every time you need to embed tracking code into your website, you are slowing it down. You really don’t need more than two analytics programs. Google Analytics contains pretty much everything you need and provides all the advanced functionality that other analytics programs have.

If you feel the need to use another analytics program as well as GA, you may have enough leeway to do this. However, do not try to embed more than two tracking codes, because you will seriously slow down your website.

230. Watch The Number of Videos You Have Per Page

Whilst one video or even two don't slow down your page drastically, keep an eye on the number of videos you have per page, especially on pages that matter. For example, your Home Page is essentially a landing page for potential customers. If a video of customer testimonials slows that page down even by a few hundred milliseconds, that's still precious time lost where conversions matter most. Instead, use videos on pages where you don't rely so heavily on converting searchers into customers.

Mobile Optimization

231. Use The TouchSwipe jQuery Plug-In

Optimize for common mobile manoeuvres such as swipe, zoom and pinch. If users can't do these seemingly simple tasks on your website, you could lose them. Google will also check for this when ranking your site. You can optimize your site for this by installing the TouchSwipe jQuery plug-in.

232. Get Responsive

A responsive site adapts your desktop site to become mobile-friendly by enlarging text, shifting the grid around to make sure all the elements fit together, and making buttons larger. It uses the same content as the desktop website, so it isn't a duplicate of it.

Basically, it's the same website but it changes slightly for a mobile device. It also alters to different mobile devices such as tablets or smaller mobile devices. So the website would appear differently on an iPhone than on an iPad.

You can either make your website responsive or make a separate mobile website (when a mobile device gets to the website it is redirected to the mobile website). If you want to take Google's word for it, they recommend a responsive website. Also, it makes things easier since you don't have an entirely separate website to look after.
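At its simplest, going responsive means adding a viewport meta tag and a few CSS media queries. The breakpoint and class names below are illustrative, not a standard:

```html
<!-- In the <head>: tell mobile browsers to use the device's real width -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<style>
  body { font-size: 16px; }       /* readable without zooming */

  /* Illustrative breakpoint: simplify the layout on narrow screens */
  @media (max-width: 600px) {
    .sidebar { display: none; }
    .content { width: 100%; }
  }
</style>
```

The same HTML serves every device; the media query simply rearranges it when the screen is narrow, which is exactly why Google favours the responsive approach.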

233. Don't Create A Separate Mobile Domain

Many businesses go mobile by creating a separate mobile domain using the "m-dot" technique, e.g. www.m.mydomain.com. I strongly advise against this, for the sole reason that maintaining two separate websites is a nightmare. When you publish a blog post on your desktop website, you always need to remember to publish it to the mobile site too.

Any small changes you make on the desktop website will have to be done on the mobile website too. As you can see, this provides you with nothing but more work and less time for other activities.

234. Use Icons Instead of Text Buttons

When you want to include buttons in your website such as in Call To Actions and navigation bar buttons, use icons instead of text. So instead of the "Home" text button in the navigation bar, you could use an icon of a home instead.

This is very simple but is a technique that will bring user experience to an all-time high. Going mobile is all about taking out the noise and bringing out the best in your website.

235. No Zooming Should Be Required

Design your mobile friendly website in such a way that no zooming should be required to view your text or buttons. Text size should be a minimum of 16 pixels, if you want it to be easily read.

236. Don't Cluster Buttons Together

Users can make a lot of misplaced clicks when you have a lot of buttons clustered together. When they select one button, a nearby button may be clicked instead. Make sure that your buttons are spaced out evenly, making it easier for users to select the right button. Also, keep buttons as large as possible. There's nothing more infuriating for mobile UX than a bunch of tiny buttons.

237. When Going Mobile, Keep Things Simple

There's no need to get too complex with the design of your mobile website. Retain the navigation bar, sidebar and on-page content, but don't include too many flashy features, because a mobile screen is small and doesn't offer the space a typical desktop does. Simplicity is also the key to a quick page load time.

238. Avoid Flash, Frames and PNG-24 Formats

These nasty additions to your mobile site could result in some nasty user experiences. Since most smartphones and tablets can't display Flash or frames, it would be dumb to include them. You can use HTML5, CSS, jQuery, JPG and GIF files for your mobile site, which all work perfectly on a mobile device.

239. Don't Use Pop Ups

Mobile phones don't have the same leverage as a desktop, so remember – no pop ups. It's common sense really, but just in case you migrated your desktop to mobile, you may have brought the popups too.

240. Don't Delay On Going Mobile, It's Urgent

Google have rolled out their April 21st mobile algorithm change, they've convinced practically every webmaster as part of their #MobileMarch campaign, and more than half of all web traffic now comes through mobile devices. You need to go mobile, and fast. Sites without a mobile-friendly version will face penalties and lower rankings as a result.

241. Take The Mobile Friendly Test

How mobile friendly is your website? Are you doing enough or is your site still lagging behind? Get these answers and more from the experts themselves, Google. Simply input your website URL and get the results! Do the test at: www.google.com/webmasters/tools/mobile-friendly/

242. Use The "Click To Call Feature"

Want someone to give you a phone call? By placing a "Click To Call" button on your site, people can ring you immediately if they have questions or want to become a customer.

This is one of the few features specific to mobile devices. It works whether you have a responsive site or a separate mobile domain – a simple link using the tel: scheme does the job. You'll mainly need this feature if you have a local business and want to get customers for your business.
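On a typical mobile page, a click-to-call button is just an anchor using the tel: scheme. The phone number below is a placeholder – swap in your own:

```html
<!-- Tapping this on a phone opens the dialler with the number filled in -->
<a href="tel:+15551234567" class="call-button">Call Us Now</a>
```

On desktop browsers the link simply does nothing useful, so many sites show the button only in their mobile layout.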

243. Make It Local, If You're A Business

Enhance the whole "local" aspect of your website for mobile devices. Include your phone number, a click to call button, address and opening hours. Research has shown that over 50% of mobile searches are carried out to get more information about a local business or company. Make sure to optimize for this by having all your business details ready.

Also, incorporate rich snippets such as your phone number and join Google My Business. This will give you an entire section in the SERPs if someone searches your company's name. Details such as your opening hours, phone number, map with location and review stars are all included as part of the Google My Business feature. You'll find more on this in Chapter 7.

244. Make It Easy For Someone To Share

Research has found that 72% of Facebook users access it on a mobile device. Many websites fail to optimize for this by abandoning social share buttons completely. Make sure share buttons appear somewhere obvious, either at the top or bottom of the page. When 72% of your potential audience are using a mobile device, there's no reason why you shouldn't be optimizing for this.

245. Minimize The Number of Share Buttons You Use

Keep your social sharing buttons to a bare minimum. You seriously don't need social sharing buttons for Digg or Stumble Upon because chances are, if someone wants to share your content – they're probably on Facebook, Twitter or Pinterest already.

Having more than six social sharing buttons really starts to slow things down big time. Also, that's not to mention the sheer ugliness all those social buttons create. Keep it to four social sharing buttons maximum, and no more than that.

246. Get All The Tech Specs With Screaming Frog

When you're checking whether your website has all the necessary technical requirements (proper redirects, on-page SEO, working links and so on), you need Screaming Frog. Download it at www.screamingfrog.co.uk/seo-spider/

This crawls your entire site and grabs data on pages, links, page titles, redirects, 404s, heading tags, status codes, attribute lengths, anchor text, alt text, the number of internal and external links and so much more. You can also export all the data to Excel and check it there.