Why should you worry about a website content audit?
A website audit is typically done to make sure every resource on your website is set up correctly and efficiently. In the past you only had to worry about audits when you were moving your website, but today there are many reasons to run one on a regular basis:
- Search engines like Google regularly change their ranking algorithms so you need to make sure that your content is presented in the best way.
- Google is also favoring quality content more, partly in response to dodgy back-link creation.
- Issues with duplicate content (more about this later) can harm your search rankings.
I mention Google a lot, but other search engines use similar algorithms, so if you get your content in shape for Google you should be okay with the rest. You will rarely get a notification from a search engine when it has issues with your site, so it is up to you to discover them. If you do get something like a manual spam action applied to your site, it can take months if not years to get it lifted. Better to do regular audits than get hit with something like this.
What tools do you need to audit your website?
When it comes to doing an audit you need to simulate the Google website crawl process as closely as possible. Google's spiders crawl the web on a regular basis looking for changes to content, and rankings are updated accordingly. A crawl of your website creates an inventory of what is on your site: an audit, in other words. I recommend three tools for doing manual crawls, and you should use all three.
- Google Search Console (formerly Webmaster Tools)
- ScreamingFrog SEO Spider
- SEMrush or other online web crawling service
The Google Search Console will not give you a lot of detail, but you should check it for crawl errors. Resolve these first, before running any other web crawling tools.
If you have a large website you may need to pay for ScreamingFrog or SEMrush; both provide free tiers for websites with low page counts. No single tool seems to find every problem, which is why you should run at least two. Once you run a crawl you will get a report showing where you have issues. You can also get a list of all the good content, which is useful if you want a complete inventory of every resource on a website. A sample high level report is shown below.
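To make the crawl step less abstract, here is a minimal sketch of the inventory these tools build for each page: collect every link and every image (with its ALT text, if any). It uses only Python's standard-library HTML parser, and the sample page is made up purely for illustration.

```python
from html.parser import HTMLParser

class PageInventory(HTMLParser):
    """Collects the links and images a crawler would inventory on one page."""
    def __init__(self):
        super().__init__()
        self.links = []    # href targets found on the page
        self.images = []   # (src, alt) pairs for every <img>; alt is None if missing

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "href" in attrs:
            self.links.append(attrs["href"])
        elif tag == "img":
            self.images.append((attrs.get("src"), attrs.get("alt")))

# Hypothetical sample page, purely for illustration
sample = """
<html><body>
  <a href="/about">About</a>
  <a href="https://example.com/blog">Blog</a>
  <img src="/logo.png" alt="Company logo">
  <img src="/banner.png">
</body></html>
"""

inventory = PageInventory()
inventory.feed(sample)
print(inventory.links)   # ['/about', 'https://example.com/blog']
print(inventory.images)  # [('/logo.png', 'Company logo'), ('/banner.png', None)]
```

A real crawler repeats this for every page it discovers, following the links it finds; the commercial tools above add reporting, scheduling and much deeper checks on top.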
Common issues found during website audits
The output of the crawl tools may differ depending on what search engines are currently demanding. In 2015 the most common issues include:
- Images with no ALT attributes
- Duplicate content
- Broken links
- Issues with headings and page titles
Most of these are easily fixed; you just need to spend time setting a unique title on each page and making sure each image has a text description. Remember that search engines cannot ‘see’ images, so they rely on the descriptions you give them to understand what they are.
One of the newer problems is duplicate content. This first became an issue with the Google Panda algorithm update. Google recently covered the subject on their weekly Webmaster Central hangout; the general advice is that you should check your site for duplicate content.
While you may think your site could not have duplicate content, your content management system (CMS) may create extra pages based on existing content. An example of this on WordPress is the automatically generated blog index page: if your blog posts are very short, their content may be duplicated in full on the index pages. Tools like SEMrush will highlight this.
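One simple way these tools can detect duplicates is to normalise each page's extracted text and compare fingerprints; the sketch below shows the idea with made-up page texts (the short blog post appearing again on the index page, as in the WordPress example).

```python
import hashlib

def fingerprint(text: str) -> str:
    # Collapse whitespace and lowercase so trivial differences don't hide duplicates
    normalised = " ".join(text.lower().split())
    return hashlib.sha256(normalised.encode()).hexdigest()

# Hypothetical extracted page texts: a short post duplicated on the blog index
pages = {
    "/blog/short-post": "A quick note about our new widget.",
    "/blog/":           "A quick   NOTE about our new widget.",
    "/about":           "We make widgets.",
}

seen = {}        # fingerprint -> first URL seen with that content
duplicates = []  # (duplicate URL, original URL) pairs
for url, text in pages.items():
    key = fingerprint(text)
    if key in seen:
        duplicates.append((url, seen[key]))
    else:
        seen[key] = url

print(duplicates)  # [('/blog/', '/blog/short-post')]
```

Real tools are smarter (they catch near-duplicates, not just exact matches after normalisation), but the report they give you is the same shape: a list of URL pairs sharing the same content.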
Other issues outside of your control
One of the main things that comes up during website audits, and which is somewhat out of your control, is back-links. A back-link is a link on another site pointing back to your content. Most of the time these are good and will increase your website's ranking, especially if the link is on a well-known website. Read up on follow and nofollow links if you are interested in building up links to your website.
You need to watch out for links associated with link farms, or anything that generates hundreds of back-links from a single website. You should learn to protect your site from link spam, which can result in negative SEO (a drop in rankings). If you spot a suspect link, contact the webmaster of the linking site and ask them to remove it. If you are not successful, or they ignore your request, you can use the Google Disavow Tool. Be careful with this: Google wants you to do your best to get links removed yourself first, and improper use of the tool could result in the loss of good links.
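For reference, the file you upload to the Disavow Tool is plain text: one URL or domain per line, with `#` for comments. The domains and URLs below are made up for illustration.

```text
# Requested removal from these sites in March; webmasters did not respond
# Disavow every link from an entire domain
domain:spammy-linkfarm.example
# Disavow a single page
http://another-site.example/page-with-bad-link.html
```

Disavowing a whole `domain:` is the blunt option; prefer single URLs when only one page on a site is the problem, so you do not throw away legitimate links from the same domain.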
Also watch out for issues with site speed and mobile device optimization. A lot of your website visitors may be viewing your content on mobile devices, so your site needs a responsive design. Large images can take a long time to download, so look at reducing their dimensions or saving them at a lower resolution setting.
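A quick check worth doing here: responsive sites include a viewport declaration in the page head, and audit tools will flag pages without one. It is a single tag:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
```

This tells mobile browsers to render the page at the device's width instead of a zoomed-out desktop layout; without it, even a responsive stylesheet will not display correctly on phones.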
The best advice is to run a website audit a few times per year; most tools like SEMrush can do this automatically and send you reports. When creating content, you should not focus too much on trying to target a specific keyword. Create posts and articles of 800 words or more that focus on a particular subject.
It’s also worth tuning into the weekly Google Webmaster Central hangout mentioned above. You can ask questions and get good tips for keeping your website in good shape.
Do you have any tips for running website audits? Comments welcome.