Probably the most common request I get in the SEO field is: “Can you check my website for errors and problems?” People often seem paranoid that some feature of their website is causing their rankings to suffer, or that some hidden piece of code is hurting their ability to rank. Usually this paranoia is unfounded, and they simply need better content, more links, and so on.
But in some cases, I’ve found problems with sites that do hurt their rankings. Usually these have to do with a search spider’s ability to crawl the site. I call this “Search Engine Friendliness”.
The Most Common Website Errors
The most common errors that lead to search engine ranking problems are:
- Use of Flash and Java – the search engines can’t read text that appears inside these website features.
- Errors in robots.txt – some site developers try to get fancy with this file to allow or disallow certain search engine spiders, but often make mistakes that cause the site to turn away the search engines entirely.
- Use of noindex in the meta tag section – this tells search robots not to index the page, and it is usually not intentional unless a developer doesn’t know any better. I’ve seen cases where WordPress plugins accidentally turn it on.
- Spam links can often draw a penalty from the search engines. This is another one that isn’t always intentional. Sometimes a site gets quietly hacked and spam links are placed in hard-to-find places without the owner knowing. This can cause a search engine penalty, because the sites you link to can be viewed as an association or endorsement from you.
- Large database-driven sites with poorly written URL structures can run into ranking and indexing problems, because the search engines can’t navigate dynamic pages and URLs very well.
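The robots.txt and noindex mistakes above can be caught with a few lines of code. Here is a minimal Python sketch using only the standard library — the robots.txt rules and page source below are hypothetical examples, not taken from any real site:

```python
from urllib import robotparser
from html.parser import HTMLParser

# Hypothetical robots.txt contents -- a stray "Disallow: /" is a
# classic mistake that turns away every search spider.
ROBOTS_TXT = """\
User-agent: *
Disallow: /
"""

rules = robotparser.RobotFileParser()
rules.parse(ROBOTS_TXT.splitlines())
# With the rules above, Googlebot is blocked from the whole site.
print(rules.can_fetch("Googlebot", "http://www.example.com/index.html"))  # False

# Scan a page's <meta> tags for an accidental noindex.
class NoindexFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in a.get("content", "").lower():
                self.noindex = True

# Hypothetical page source with the tag accidentally switched on.
finder = NoindexFinder()
finder.feed('<head><meta name="robots" content="noindex, nofollow"></head>')
print(finder.noindex)  # True
```

In practice you would fetch your live robots.txt and page source instead of hard-coded strings, but the checks themselves are the same.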
The 4 Step Website Error Checkup
So how do you know if you have any of the above problems, and how do you check for common errors? Try these 4 steps:
- Step 1 – Do You Have a Penalty? Has your search engine ranking suddenly dropped in a drastic way? The best sign of this is a drop in search traffic. Did you go from 2,000 search engine visitors per month to 0? If so, there is most likely a search engine penalty in place that dropped you from the rankings. Proceed to step 2…
- Step 2 – Do you link to any spammy websites? This could be intentional or unintentional. Either way, it could bring a penalty… and if you are seeing problems like in step 1, you may want to check your outbound links. I use 2 tools for this that both work very well: the Link Validation Spider, and Link Sleuth (more advanced). Both are mainly for checking for broken links on your site, but they can also be used to find hidden links and spam links. You might also try a simple query on MSN. Visit MSN.com and type this in the search box: “linkfromdomain:www.yourdomain.com”. It will return all outbound links from a domain.
- Step 3 – Don’t Drive Away the Search Spiders. More often than not, serious ranking problems come down to a robots.txt issue or a meta data problem like the use of noindex. Try this tool from Submit Express to test for meta tag issues. To check for robots.txt issues, try this robots.txt checker. Make sure to use the address of your robots.txt file (example: www.searchingsolutions.com/robots.txt)
- Step 4 – Check For Meta Tag Duplication. Duplicate meta tags won’t trigger a penalty or block indexing, but they can seriously hurt your ranking. Page titles should always be unique to avoid keyword and page cannibalization. SEOmoz has a great Crawl Test tool that can help in this area.
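The duplicate-title check can also be sketched in a few lines of Python with the standard library — the URLs and page sources below are hypothetical stand-ins for a real crawl:

```python
from html.parser import HTMLParser

class TitleGrabber(HTMLParser):
    """Extract the contents of the first <title> tag on a page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title" and not self.title:
            self.in_title = True

    def handle_data(self, data):
        if self.in_title:
            self.title += data

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

def find_duplicate_titles(pages):
    """pages maps URL -> HTML source. Returns {title: [urls]} for
    every title that appears on more than one page."""
    seen = {}
    for url, html in pages.items():
        grabber = TitleGrabber()
        grabber.feed(html)
        seen.setdefault(grabber.title.strip(), []).append(url)
    return {t: urls for t, urls in seen.items() if len(urls) > 1}

# Hypothetical crawl results: two pages share the same title.
pages = {
    "/": "<title>Acme Widgets</title>",
    "/about": "<title>Acme Widgets</title>",
    "/blue": "<title>Blue Widgets</title>",
}
print(find_duplicate_titles(pages))  # {'Acme Widgets': ['/', '/about']}
```

A crawler tool will do the fetching for you, but this is essentially all a duplicate-title report does: collect every page’s title and flag the ones that repeat.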
I hope these 4 steps are useful. You may not need them often, but when you do, you’ll be glad you bookmarked this post… 😉