Some of these are obvious and well known, others are obscure and brand new. All of them solve problems – and that’s why tools should exist in the first place. Below, you’ll find 20+ tools that tackle serious issues in smart, powerful ways.
#1 – Generating XML Sitemap Files
The Problem: XML Sitemap files can be challenging to build, particularly as sites scale beyond a few hundred or few thousand URLs. SEOs need tools to build these, as they can substantially improve a site’s indexation and its potential to earn search traffic.
GSiteCrawler: Downloadable software to create XML Sitemaps
Google Sitemap Generator: Download a few files from Google Code and install on your webserver
Looks like Google Webmaster Tools, doesn’t it? 🙂
Both GSiteCrawler & Google Sitemap Generator require a bit of technical know-how, but even non-programmers (like me) can stumble their way through and build efficient and effective XML Sitemaps.
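If you’d rather not install anything, the sitemap format itself is simple enough to generate directly for smaller sites. Here’s a minimal, illustrative Python sketch (the URLs and dates are placeholders, not real pages) that builds a sitemaps.org-compliant file:

```python
# Minimal XML Sitemap builder following the sitemaps.org protocol.
# The URL list below is purely illustrative -- substitute your own pages.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return sitemap XML for a list of (loc, lastmod) tuples."""
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod  # W3C date: YYYY-MM-DD
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap([
    ("http://www.example.com/", "2010-01-15"),
    ("http://www.example.com/blog", "2010-01-20"),
])
print(sitemap_xml)
```

Write the output to `sitemap.xml` at your site root and reference it in robots.txt; for sites beyond the 50,000-URL limit per file, the protocol calls for a sitemap index file instead.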
#2 – Tracking the Virality of Blog/Feed Content
The Problem: Even experienced bloggers have trouble predicting which posts will “go wide” and which will fall flat. To improve your track record, you need historical data to show you where and how your posts are performing in the wild world of social media. What’s needed is a cloud-based tracking tool that can sync up with the Twitters, Facebooks, Diggs, Reddits, Stumbleupons & Delicious’ of the web to provide these metrics in an easy-to-use, historical view.
Tools to Solve It: PostRank Analytics
PostRank’s nightly emails keep me racking my brain for better blog post ideas
PostRank sends me nightly reports on how the SEOmoz blog performs across the web – numbers from Digg, Delicious, StumbleUpon, Twitter, Facebook and more. By using this, I can get a rough sense of how posts perform in the social media marketplace and, over time, hopefully train myself to author more interesting content.
#3 – Comparing the Relative Traffic Levels of Multiple Sites
The Problem: We all want to know not only how we’re doing with web traffic, but how it compares to the competition. Free services like Compete.com and Alexa have well-documented accuracy problems and paid services like Hitwise, Comscore & Nielsen cost an arm and a leg (and even then, don’t perform particularly well with sites in the sub-million visits/month range).
If a site has been “Quantified,” no other competitive traffic tool on the web will be as accurate
Since both sites are “Quantified,” I can be sure the data quality is excellent
I’ve complained previously about the inaccuracies of Alexa (as have many others). It’s really for entertainment purposes only. Compete.com is better, but still suffers from plenty of inaccuracy, data gaps, directionally wrong estimates and a general feeling of unreliability in the marketplace. Quantcast, on the other hand, is excellent for comparing sites that have entered their “Quantified” program. This involves placing Quantcast’s tracking code on every page of the site, so you’re basically peeking into their analytics.