Must Have SEO Tools Toolkit

By now your understanding of search engine optimization should be improving dramatically. Luckily, SEO is a common pursuit among website owners (or at least it should be), so here are your must-have SEO tools. Keep in mind that the majority of these tools still require you to do some manual work.

These are tools that many professional SEOs use on a daily basis to research their own websites, client websites, and competitors' websites. We have also included several technical SEO "must have" documents that may not be considered actual tools, but to an SEO they can be.

Copyscape

Search engines detest duplicate content: to them it's proof that someone is not contributing meaningful information to the Internet, and they penalize accordingly. Whether the content was originally yours or someone else's, if it appears in more than one place, it's not good. Protect your site by using a service like Copyscape to check your content frequently, making sure no one is stealing it and that you aren't accidentally duplicating someone else's work.

Keyword Research Tools

Keyword research has always been, and will always be, a crucial aspect of SEO marketing, and if you're missing the basic tools of the trade, it will be much harder to reach the audience you want. Although you may think you know what words to target, your efforts should really be based on facts, like the data Google provides. Google's keyword tool shows you both what the competition is like (paid competition, not organic – high, medium, or low) and how many searches are being conducted monthly. For more tools and a basic starting strategy, read our keyword research guide or check out the list below.

Robots.txt File

The robots.txt file tells web crawlers – sent out by search engines (or sometimes bad guys who want to spam or harvest email addresses) – whether or not they are invited to index all or part of a site. The standard code looks like this:

User-agent: *
Disallow: /

… with the * indicating that the directive applies to all robots, and the / that it applies to all of the site's content. You can modify the former to exclude specific robots, or the latter to exclude specific files or directories. The robots.txt file can be used to hide pages that are still under construction, pages with a heavy image load (since images without alt attributes are useless to crawlers anyway), or private directories.
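
For example, here is a modified robots.txt that blocks two directories for all robots and shuts out one particular crawler entirely (the directory names and the "BadBot" user-agent are placeholders, not real values):

User-agent: *
Disallow: /under-construction/
Disallow: /private/

User-agent: BadBot
Disallow: /

The blank line separates the two rule groups, and each group applies only to the robot(s) named in its User-agent line.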

Note that the file does not prevent the robots from crawling; it merely asks them not to. Since they can still choose to do so, a good rule of thumb is to exclude only content that is not relevant to your site. Remember, sites get indexed – and therefore found by humans – by allowing robots to do their job, so use the file minimally or forgo it altogether.

Meta Robots

The meta robots tag works similarly to the robots.txt file. By inputting

<meta name="robots" content="noindex, nofollow">

in the head of a site or a page, you will tell the robots not to index that content or follow any of the links on it. The reasons for and against using such a directive are almost identical to those for the robots.txt file.
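
As a quick sketch of where it goes (the title text here is just filler), the tag sits inside the page's head section:

<head>
<title>Example Page</title>
<meta name="robots" content="noindex, nofollow">
</head>

You can also mix the directives: content="noindex, follow" asks robots to skip the page itself but still follow the links on it.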

Rel=nofollow

Again, the nofollow attribute tells the robots not to follow (or pass ranking credit through) particular links on your page. The main reason to do this is if you have a problem with spammers. If, however, you curate your site carefully and have an intentional link hierarchy, then you don't want to prevent the search engines from following and indexing your links.
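
For illustration (the link target here is just an example), the attribute is added directly to an individual anchor tag:

<a href="http://example.com/" rel="nofollow">a link you don't vouch for</a>

This is why many blog platforms automatically add rel="nofollow" to links left in comments: spammers get no ranking credit for them.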

Rel=canonical

Remember how search engines hate duplicate content? Unfortunately, having different links that reach the same content, or links that vary slightly (consider http://freenosejobs.com versus http://www.freenosejobs.com), can mislead the robots into thinking this is happening. Avoid it by setting a canonical URL for your site with this tag in the head section:

<link rel="canonical" href="http://www.freenosejobs.com/" />
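
To sketch how that looks in context (the title text here is just filler), every variant of the page carries the same tag in its head section:

<head>
<title>Free Nose Jobs</title>
<link rel="canonical" href="http://www.freenosejobs.com/" />
</head>

Whether a robot arrives via the www or non-www URL, it now knows which address you consider the real one.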

Sitemaps

Sitemaps might seem like Ancient Greek, but all they are is a tool that allows search engines like Google to index your site more easily. Put simply, a Sitemap is what it sounds like: a hierarchical map showing how the pages of your site link and relate to one another. This can really help search engines understand what’s going on in your link hierarchy so that they can help viewers find all of your pages instead of just some.

There are two languages you can use to create a sitemap: HTML (HyperText Markup Language) and XML (Extensible Markup Language). While the first can be handy for visitors, who can look at it to understand your site's hierarchy, XML is a much better language for search engines. It exists expressly for them; they understand it well and can access it immediately for indexing purposes. If you want your site indexed quickly and accurately, XML is the way to go. Luckily, there's no need to learn the language: Google offers a sitemap generator to send you on your way.
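
To give you a feel for the format (the date and values below are illustrative), a minimal XML sitemap for the example domain above would look something like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.freenosejobs.com/</loc>
    <lastmod>2013-06-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>

Each <url> entry describes one page; <lastmod>, <changefreq> and <priority> are optional hints that help robots decide how often to revisit.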

Webmaster Tools

Though Google dominates the search engine market, two other search engines still hold considerable market share: Yahoo and Bing. Google and Bing both offer their own sets of webmaster tools (Yahoo does not), and utilizing both will help ensure better performance across all search engines.

Google Webmaster Tools: Kind of like the Army, Google wants to help websites be all they can be. That's why they offer webmaster tools, where you can see your site the way Google sees it. This can help you address issues with being found and indexed, as well as keep an eye on what's actually happening on your site day to day. Find out which search terms are driving traffic to you, and which links – both within and outside your site – are doing the same. Lastly, you can use webmaster tools to submit a Sitemap to the search engine. Bing's toolset allows you to accomplish similar tasks.

Screaming Frog SEO Tool

In essence, Screaming Frog allows you to troubleshoot like a pro without having to pay for the privilege. Via an easily installed desktop program for Mac or PC, it collects and collates SEO information and allows you to filter it by certain parameters, or export it to a spreadsheet for manipulation. You can then use the data to examine your:

  • Internal Links
  • File Sizes
  • Pictures (Alt Attribute)
  • Page Titles (length, duplicates, etc.)
  • Meta Descriptions (length, duplicates, etc.)
  • Meta Keywords (no longer recommended, but allows you to see which pages might still have them present)
  • Redirects
  • Similar URLs that may need the rel=canonical tag
  • and so much more.

SEOmoz Open Site Explorer

Open Site Explorer (OSE) is a great tool for figuring out how to get more inbound links to your site. Not only can you find out which sites are already linking to yours, you can see who is linking to your competitors. That gives you a great heads-up about who in the market might be willing to link to you, so you can start building a relationship with them. Even if you can't, you can still look at the links your competitors are receiving and try to duplicate their approach (though never their content).

WordPress SEO by Yoast

For WordPress users who care about SEO, no plugin is more useful than WordPress SEO, which lets you easily update titles and descriptions, edit the robots.txt file, manage your links and meta tags, create sitemaps, and optimize for RSS.

Do You Know Any Must Have SEO Tools?

We plan to keep adding to this list as new tools and ideas come up. Since most SEOs have their own specific tool sets, we ask you to help us create the ultimate list of must-have SEO tools. Head on over to our Online Marketing Experts page and fill out the form at the bottom of the page!

 Check Out Lesson 13 – How To Measure SEO