At the most basic level, a content audit is a review of the information on your website. Working page by page, you assess how your content is performing against qualitative criteria and quantitative metrics. The objective is to understand what content you have, what state it’s in, and what you can do to improve it.
It sounds straightforward. And in principle, it is. But spend five minutes planning one and the complexity starts to reveal itself:
AI offers a very appealing response to this complexity. There are loads of tools out there now that promise to crawl your site, flag errors, and spit out a report in minutes. So why would you bother doing a content audit the old way at all?
The honest answer is that those tools can be useful – but they have limits. They can tell you a page has too few words, but not whether those words are accurate. They can flag a low-traffic page, but not recognise that the page exists for a small, high-value audience, so low traffic is absolutely fine. You need human intelligence for those kinds of insights.
If anything, all the change in the last few years makes a content audit more valuable, not less. You need to know what you’ve got, whether it’s still working, and where to focus your energy. Plus, if you want to optimise your website to be found and referenced by AI search and large language models (what’s being called generative engine optimisation, or GEO), a content audit is a great first step. That’s what this guide is for. It will walk you through the process, from planning, to execution, to turning your findings into action.
Paid product
This guide covers the essentials, but if you want a complete, step-by-step system for auditing your website content, the content audit toolkit has everything you need – from your first inventory to your final report.
Find out more about the audit toolkit
Most websites grow in the same organic, unplanned, slightly chaotic way. Someone adds a page for a campaign. A team member leaves and their content stays. A strategy changes but the old pages don’t. Marketing absolutely must have a microsite. Years pass. Before long you’re sitting on thousands – or hundreds of thousands – of pages, and nobody is entirely sure what’s working, what’s embarrassingly out of date, or what’s quietly doing damage.
A content audit changes that. It gives you an honest, evidence-based picture of what you’ve actually got – not what you think you’ve got – and a clear sense of what to do about it. In my experience, most organisations are surprised (both happily and unhappily) by what they find.
The benefits tend to fall into four areas:
Content audits aren’t one-size-fits-all. You need to choose an approach that matches your goals, resources, and timeline. In my experience, there are 4 main ways you can approach an audit:
A comprehensive review of all (or nearly all) pages on your site. This works well for smaller sites (100-500 pages), or when you need a complete picture before a redesign.
A narrow audit of pages related to a specific user journey, product, service, or campaign. Use this when you’re launching something new, optimising a particular section, or working with limited time.
An ongoing process where you audit a few pages every week or month. This suits organisations that want to maintain content quality continuously rather than in big, infrequent projects.
A representative sample of content from across the site. This works for large sites (500+ pages) where a full audit isn’t feasible, or when you need directional insights quickly. A well-designed sample of 100-200 pages can reveal patterns across thousands of pages.
You can combine these approaches. For example, you might use a tool to do a quantitative-only full audit of all your content, a qualitative sample audit of your highest-traffic pages, and then move to a rolling audit for ongoing maintenance.
Choosing a sample of content is tough, and it’s one of the things I get asked about most. It’s part science, and part art. You can use sampling methods and make some rational choices in advance about what to focus on and what you want to cover. But you’ll also want to leave yourself enough flexibility to dig into anything interesting or surprising that comes up.
Some different sampling methods you can use are:
Most effective audits combine several methods. You might audit all top-traffic pages, a representative sample from each section, and all pages along your top 3 user journeys.
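If you want to script part of the selection, a stratified random sample is straightforward to build from your inventory. This is a minimal sketch, not a tool recommendation: the section names and page counts are invented for the example, and the fixed seed just makes the sample reproducible so colleagues can regenerate the same list.

```python
import random

# Hypothetical inventory rows -- in practice these come from your crawler export.
inventory = (
    [{"url": f"/blog/post-{i}", "section": "blog"} for i in range(400)]
    + [{"url": f"/services/page-{i}", "section": "services"} for i in range(80)]
    + [{"url": f"/about/page-{i}", "section": "about"} for i in range(20)]
)

def stratified_sample(pages, per_section=10, seed=42):
    """Pick up to `per_section` random pages from each site section."""
    rng = random.Random(seed)  # fixed seed -> the same sample every run
    by_section = {}
    for page in pages:
        by_section.setdefault(page["section"], []).append(page)
    sample = []
    for section_pages in by_section.values():
        k = min(per_section, len(section_pages))
        sample.extend(rng.sample(section_pages, k))
    return sample

sample = stratified_sample(inventory, per_section=10)
print(len(sample))  # 30 -- 10 each from blog, services, and about
```

You'd still layer your judgment on top: swap in pages you already know matter, and keep room to chase anything surprising, as discussed above.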
If you’re an experienced content professional, you could probably look at a page and immediately identify what’s not working. But instinct alone won’t give you an audit you can stand behind. Without a defined set of criteria, your judgments will shift from page to page, your scores won’t be comparable, and it becomes very hard to explain your decisions to stakeholders. Criteria give you a system. They give you objectivity. And they make the difference between an audit that drives change and one that just reflects your opinions on a given day.
I use 10 heuristics – shortcuts, rules of thumb, guidelines – to help me evaluate content quality:
You can read more about the thinking behind these heuristics here.
It’s a good idea to build out guidance on each heuristic you want to use, to give yourself key signals and red flags to look out for as you audit.
Once you’ve chosen your heuristics, you need a consistent way to apply them. Rather than a simple yes/no, I recommend scoring each heuristic on a 0-3 scale – it gives you enough nuance to distinguish between content that’s genuinely failing and content that just needs a light touch.
Those scores then feed into the most important decision of your audit: what to do with each piece of content. Everything gets categorised as one of the following:
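To keep the scoring-to-decision step consistent across hundreds of pages, it can help to script a first-pass suggestion that a human then overrides. A minimal sketch using the keep/improve/delete/archive categories; the thresholds and the traffic cut-off are invented for the example, not a prescription:

```python
# Turn 0-3 heuristic scores for one page into a suggested action.
# Thresholds are illustrative assumptions -- tune them to your own audit.
def suggest_action(scores, monthly_views):
    """scores: dict of heuristic name -> 0-3 score for one page."""
    avg = sum(scores.values()) / len(scores)
    if avg >= 2.5:
        return "keep"
    if avg >= 1.5:
        return "improve"
    # Low-scoring content that still gets real traffic is archived for
    # review rather than deleted outright; truly unused pages get deleted.
    return "archive" if monthly_views >= 50 else "delete"

page_scores = {"accuracy": 3, "usefulness": 2, "findability": 3, "readability": 2}
print(suggest_action(page_scores, monthly_views=120))  # avg 2.5 -> "keep"
```

The point isn't to automate the decision – it's to make your defaults explicit, so every deviation from them is a conscious, explainable choice.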
Before you can audit content, you need to know what content exists. A content inventory is a complete list of your website’s pages, along with metadata like URLs, titles, publication dates, and performance metrics.
The fastest way to create an inventory is to use a crawler like Screaming Frog, Sitebulb, or one of the newer generation of AI tools. These tools spider your site and extract data from every page. Some tools can pull in data from other platforms like GA4 and Search Console, calculate readability scores, spot missing alt text, and lots of other very helpful things.
You can also export a list of pages from your CMS, or from Google Analytics or Search Console, if you don’t have the budget for a crawler.
The goal is a master spreadsheet where each row represents one page, and the columns contain all the data you need to make informed decisions. This inventory becomes the foundation for your audit.
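As a sketch of what assembling that master spreadsheet can look like, here's a minimal example merging a crawler export with an analytics export, keyed on URL. The file contents and column names are invented for illustration – adapt them to whatever your crawler and analytics tool actually produce:

```python
import csv
import io

# Stand-ins for real export files (io.StringIO keeps the sketch self-contained).
crawl_csv = io.StringIO(
    "url,title,word_count\n"
    "/pricing,Pricing,420\n"
    "/blog/hello,Hello world,150\n"
)
analytics_csv = io.StringIO(
    "url,pageviews\n"
    "/pricing,1200\n"
)

# Index both exports by URL, then fold traffic data into the crawl rows.
crawl = {row["url"]: row for row in csv.DictReader(crawl_csv)}
traffic = {row["url"]: row for row in csv.DictReader(analytics_csv)}

inventory = []
for url, row in crawl.items():
    # Pages missing from analytics get "0" rather than a blank cell.
    row["pageviews"] = traffic.get(url, {}).get("pageviews", "0")
    inventory.append(row)

for row in inventory:
    print(row["url"], row["pageviews"])
```

One row per page, all the data in one place – exactly the shape the rest of the audit builds on.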
Here’s where most guides make it sound easy. Open each page, score it against your heuristics, move on. And yes, that’s essentially what you’re doing – but the reality is that auditing is slow, repetitive, and surprisingly tiring. If you go in expecting that, you’ll be fine. If you don’t, you’ll burn out halfway through and end up with a half-finished spreadsheet that haunts you for months.
Before you start, set up your audit spreadsheet. It should include:
Then work through row by row, page by page:
I find that it takes me an average of 10 minutes to audit a page. Some pages will only take a minute or two, others will take 30 minutes or more. At that rate, 100 pages adds up to nearly 17 hours – more than two full working days.
Here are a few tips that help me maintain quality and sanity while doing this detailed, repetitive work:
I’m including this section with a big caveat: AI tools have real limitations when it comes to content audits (and everything else, tbh). I’ve tried out lots of different ideas and prompts, with varying degrees of success. The issues I’ve run into are:
That said, here are some things that have worked reasonably well for me, and that you might want to experiment with:
AI tools might help you get more out of your crawler exports, CMS data, and other sources. Rather than cleaning data manually, you can ask an AI to:
Example prompt:
‘Here is a spreadsheet of pages exported from our website crawler. Please identify any pages that are missing a title tag, H1, or meta description, and flag any duplicate title tags. Return the results as a table.’
Note: Tools like Sitebulb already do a lot of this, and can handle large page volumes with ease.
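If you'd rather not hand your data to an AI tool (or want a deterministic double-check of its answers), the same checks are easy to script. A minimal sketch over an invented crawler export – column names are assumptions, so match them to your tool's actual output:

```python
import csv
import io
from collections import Counter

# Stand-in for a real crawler export file.
export = io.StringIO(
    "url,title,h1,meta_description\n"
    "/a,Home,Welcome,Our homepage\n"
    "/b,,Products,\n"
    "/c,Home,Contact,Get in touch\n"
)
rows = list(csv.DictReader(export))

# Count non-empty titles so we can spot duplicates.
title_counts = Counter(r["title"] for r in rows if r["title"])

flagged = {}
for r in rows:
    problems = [col for col in ("title", "h1", "meta_description") if not r[col]]
    if title_counts[r["title"]] > 1:
        problems.append("duplicate title")
    if problems:
        flagged[r["url"]] = problems

for url, problems in flagged.items():
    print(url, "->", ", ".join(problems))
```

Unlike an LLM, this gives the same answer every run and never hallucinates a missing tag – which matters when the output feeds your audit spreadsheet.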
AI might be able to help you spot patterns that you miss when reviewing pages one at a time:
Example prompt:
‘Here are 30 pages from our website, with their titles and a brief description of each. Please group any pages that cover similar topics and could potentially be merged, and explain your reasoning.’
If you’re doing a sample audit, deciding which pages to include is one of the most time-consuming parts of the process. AI might be able to help you build a sampling plan based on your site structure, audit goals, and available time.
To get a useful recommendation, you’ll need to give the AI some context upfront:
With this information, AI can identify which content types are templated (where auditing one or two examples covers all similar pages), suggest how many pages to audit from each section, flag content that can likely be excluded, and give you a realistic assessment of whether your goals are achievable in your available time.
Example prompt:
‘I’m planning a content audit and need help selecting which pages to audit. I have [X] hours available across [X] people. My audit goal is [taking stock / focused audit / redesign prep]. Here’s my site structure: [paste your spreadsheet — columns: URL, Section, Subsection, Page type]. Please recommend how many pages to audit from each section, identify any content types I can assess as templates, flag content I can safely skip, and tell me whether my goals are achievable in the time I have.’
AI might be able to help you pull your findings together once the audit is complete:
Example prompt:
‘Here is a summary of the results from my content audit. [Paste summary data.] Please draft a findings section that identifies the three or four most significant patterns in the data, and suggest a list of quick wins based on these findings.’
Finishing your page-by-page audit is probably going to feel pretty satisfying. But this isn’t the end – it’s the start of stage 2. That spreadsheet full of scores and notes isn’t an insight – it’s raw material. The analysis phase is where you turn that data into something you can actually act on, and it’s worth giving it proper time rather than rushing straight to recommendations.
Start with the big picture. Which heuristics scored lowest across all your content? Are the problems concentrated in one section or spread across the whole site? A simple pie chart showing the distribution of keep/improve/delete/archive decisions, or a bar chart of average scores by heuristic, can tell a compelling story very quickly. And it will be useful when you come to present your findings.
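That big-picture rollup is a few lines of code if your audit lives in a spreadsheet. A minimal sketch with three invented rows standing in for your real data:

```python
from collections import Counter, defaultdict

# Invented audit rows -- in practice, read these from your audit spreadsheet.
audit = [
    {"section": "blog", "decision": "improve",
     "scores": {"accuracy": 1, "readability": 2}},
    {"section": "blog", "decision": "delete",
     "scores": {"accuracy": 0, "readability": 1}},
    {"section": "services", "decision": "keep",
     "scores": {"accuracy": 3, "readability": 3}},
]

# Distribution of keep/improve/delete decisions -- your pie chart data.
decisions = Counter(row["decision"] for row in audit)
print(decisions)

# Average score per heuristic -- your bar chart data.
totals = defaultdict(list)
for row in audit:
    for heuristic, score in row["scores"].items():
        totals[heuristic].append(score)

for heuristic, scores in totals.items():
    print(heuristic, round(sum(scores) / len(scores), 2))
# Here accuracy averages lowest -- that's where the story starts.
```

Those two summaries – decisions by count, heuristics by average – are usually enough to anchor the opening slide of your findings presentation.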
Then try cutting the data in different ways. Do blog posts perform better than service pages? Is the About section stronger than the Resources section? These comparisons reveal where you should focus improvement efforts and – just as importantly – where things are actually working well. It’s easy to forget to say that in a report, but you should take any opportunity you can to highlight good work.
One comparison is particularly worth making: the relationship between content quality and performance. Do your highest-scoring pages also get the most traffic and engagement? Sometimes yes, sometimes no – and both answers are interesting. High-quality content that nobody finds suggests a findability problem. Low-quality content that drives significant traffic means you need to improve it carefully, not delete it.
Finally, go back through your notes. Scores tell you what, but notes tell you why. Look for recurring themes, patterns you noticed across multiple pages, and anything that surprised you. Group similar observations together. That’s where your most useful insights will come from.
The most thorough audit in the world is pointless if nothing changes as a result. And getting recommendations implemented requires just as much thought as doing the audit itself.
Start with the recommendations themselves. Vague ones like ‘improve content quality’ won’t get you anywhere. Good recommendations are specific, explain why the change matters, and make clear who should do what. I use this formula:
In order to [outcome], we should [specific action] because [evidence/reasoning].
For example, rather than ‘improve product pages’, you’d write something like:
'In order to improve organic search performance, we should add structured product information to all product templates, because 73% of product pages are currently missing this information and pages with complete product data rank on average 12 positions higher.'
Once you have your recommendations, prioritise them using an effort/impact matrix. The idea is simple: plot each recommendation on two axes – how much effort it will take to implement, and how much impact it’s likely to have. Where a recommendation falls on that matrix determines how urgently you should act on it. I use a version of this called the How/Wow/Now framework:
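Plotting each recommendation on those two axes can itself be scripted as a first pass. A minimal sketch: the 1-3 scales, the cut-off of 2, and the generic quadrant labels are all my assumptions for the example, not the How/Wow/Now definitions themselves:

```python
# First-pass triage of recommendations on an effort/impact matrix.
# Scales, threshold, and labels are illustrative assumptions.
def triage(effort, impact):
    """effort and impact scored 1 (low) to 3 (high)."""
    if impact >= 2 and effort < 2:
        return "quick win: do now"
    if impact >= 2:
        return "big bet: plan properly"
    if effort < 2:
        return "fill-in: do when convenient"
    return "deprioritise"

recs = [
    ("Add structured product data", 3, 3),   # (name, effort, impact)
    ("Fix broken internal links", 1, 2),
    ("Rewrite entire blog archive", 3, 1),
]
for name, effort, impact in recs:
    print(name, "->", triage(effort, impact))
```

As with the scoring script earlier, the value is in making the defaults explicit: when stakeholders disagree with a placement, you're arguing about effort and impact estimates, not vibes.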
When it comes to presenting your findings, tailor what you share to who you’re sharing it with. Senior leadership need the headline story: impact, risk, and resource requirements. Content teams need the detailed recommendations they’ll actually implement. Keep your quick wins visible and start delivering them early – it builds credibility and momentum for the bigger asks.
Finally, assign ownership to every recommendation. Without it, even the best recommendations become suggestions that sit in a document and go nowhere.
Content audits aren’t one-and-done projects. Content degrades over time. Sites grow. Strategies evolve. User needs change. If you don’t build in a way of keeping on top of it, you’ll find yourself back where you started in a year or two – sitting on a pile of content nobody is quite sure about.
The most sustainable approach is to make auditing part of how you work rather than a big, occasional event. Larger organisations might do comprehensive audits annually or every two years, with focused audits in between. Smaller teams often find a rolling audit – reviewing a portion of content every month – more manageable. Either way, the important thing is that it’s on the to-do list and someone owns it.
The other shift worth making is upstream: building your audit criteria into your content creation process. If you know you’ll eventually measure content against these heuristics, it changes how you create it in the first place. That’s not a small thing. Done well, it means your content starts better and stays better for longer.
A lot has changed since I first started auditing content, and a lot has changed since I first wrote this guide in 2022. The web is harder to navigate than it used to be – for organisations publishing content and for the people trying to find it. But that’s exactly why this work matters. An audit won’t fix the internet, but it will give you something increasingly rare: a clear, honest picture of what you’ve got, what it’s doing, and what to do next. That’s worth a lot.
If this guide has given you enough to get started, the free content audit planner will help you structure your approach before you dive in.
Get the free content audit planner
And if you want a complete, step-by-step system – templates, scoring guides, worked examples, and everything else – the full toolkit has it all. Find out more about the content audit toolkit.
First published: 12th July 2022