Transcript for:
Website Content Audit and Optimization

So, a quick recap: what we're trying to do here is determine the list of pages, and the tool we're going to use for that is Screaming Frog. If you have fewer than 500 pages it's pretty straightforward, you can use the free version; if you have more than 500 pages, having a tool like Screaming Frog will really save you some time. Another alternative, and we keep licenses for both, is Sitebulb. From a technical SEO standpoint I actually find Sitebulb is a lot more powerful and a little more user friendly; you can get a lot more insights a little quicker with it, but having both tools going is a pretty strong play. I think Sitebulb runs about 50 bucks a month, and Screaming Frog I want to say 200; they bill in pounds and I haven't done the exchange rate since I paid in January. It's not a whole lot, and it's only billed annually. So: crawl with Screaming Frog and filter for /blog, then, like I said, remove taxonomy pages, pagination pages, author pages, pages that generally don't rank. Another hack we'll use: in Screaming Frog you can load up a list of URLs, and one thing we've been doing, just so we don't have to filter down, is if you've got something like Rank Math or Yoast running, you can go to your sitemap index, pick the posts XML sitemap, and crawl that instead of crawling all the URLs in one go and then filtering down. That will give you all the pages. This isn't a pile of content, there are 18 pieces; the only one I would filter out is /blog itself, because even if we determined that the /blog root page didn't drive any traffic, it's still the index of all the pages, so it serves a purpose outside of just SEO or general traffic.

I'll quickly share what Screaming Frog looks like if we were going to load this up. I've taken my license off Screaming Frog so you can see what it looks like if we were dealing with the non-paid version. At the top (on the Mac) you can switch your mode from Spider, which is the default, to List, and then we're going to download from XML sitemap and paste in that posts XML sitemap. It will include a lot of the images, because they're referenced there as well, but that's straightforward to filter out: when the crawl finishes, on the right-hand side, and again this is the free version, you don't have to pay for any of this, you can go down to Internal, click HTML, and you're only going to get blog posts. If you go back to All you'll see all the individual images included as well, but we're not digging into those, so we're just going to export that HTML information. I'll stop sharing and jump back to the other screen. We're going to use this Screaming Frog crawl not just to determine the list of pages but also to feed in other information like the H1; you can plug in things like title and meta description too, which I don't have filtered in right now, but when you're getting ready to refine you can do that. What I tend to do is export this and plug it into "SF Raw", that's the tab label I'm using; I'll post my steps, and I refer to everything by what I have it labeled as. We're going to copy this list of URLs, and another option is to use the UNIQUE formula, a really simple Excel or Google Sheets formula that pulls out all the unique URLs, so if you had any that were overlapping, which technically doesn't tend to
happen with Screaming Frog, but this will just pull the clean list of those URLs, and then we can start referencing that data. So these are the H1s we're pulling; I'm using the H1 more or less as a label so I can quickly identify a post. If I was talking to somebody above me who doesn't know or care about the URL slug, they'll remember the "SEO URL Structure" post; mentioning that instead of saying "the blog post at /seo-url-structure" just makes it a little easier to talk about this type of information. We can pull anything in here; common ones are title and meta description, and anything that shows up in the Screaming Frog crawl can be referenced. So here's your list of titles, if you want to get into title length and see which ones are too long; meta descriptions are over here; there are no meta keywords, which is a good thing; and the additional heading structure will come in as well if you ever need to parse it. But for this exercise we're just using the H1. Any questions so far, is it all making sense? Okay. I've been told I talk fast, so I'm trying to slow things down and take pauses here.

All right, so we've got our list of pages, and we're initially pulling in this H1 data just to have it. The next step is to get some free data to make some informed decisions. A couple of pieces of information I like to pull; I'll go into date first. The date is a date stamp that's actually in the XML sitemap, so what I ended up doing was importing the XML sitemap; I'll show you what that looked like. The formula for that was this, and I'll keep plugging these into the chat; what it did was pull the XML sitemap and break out the individual tags we're paying attention to. So you've got date stamps here, and the reason we want to look at date stamps: I pull a year of information, usually, to compare. If something went live, or had its last modification, November 8th, the last thing we want to do is kill it; we can't compare it against something that's had 12 months just to exist on the blog. It may not have as much traffic, but it hasn't had time to mature. So that's one of our considerations when we're looking at these summaries: was this content published recently? We don't want to kill it if it hasn't had a chance to perform. That data gets pulled in here, and I'll quickly filter through, particularly if I'm dealing with a lot of blog posts, and look for anything older than 2019. You can just do a find, or a filter, for anything that contains "201", and that'll give you basically everything from 2019 and before; it's a quick way to find the old posts and see how they're performing.

"So Mike, sorry, just to clarify: anything from 2020 onwards you're gonna keep, regardless of how it's performing?"

No, but it would have a better chance of surviving if it's 2020 onwards.

"So when you say recent, like if you've made tweaks to it recently or published it recently, what is recently for you? Three months, six months?"

I would say three months, but it all depends on the data set you're looking at. Comparing something that's existed for 12 months to something that's existed for only three isn't really fair. It's just another factor that would tell me it's not ready to go, that it still needs a little more time to perform. It's one of the last things I look at; I like to look at traffic, conversions, and authority earlier on, because I find those are much better indicators than date alone, but I want to make sure date is considered too, so we're not killing
something we just posted a month ago, or, say we've had juniors work their way through this and they want to kill a blog post that went live a week ago; that's the last thing we want to happen. Great question though, thank you.

The next concept I like to look at is Search Console, mainly clicks: how much traffic we're getting from the search results. And this isn't based on anybody opting in to tracking. Then impressions as well: how often are we showing up? Because we could be getting few clicks but a ton of visibility, and we really want to lean into that; there might be an opportunity to take that visibility, where we're showing in eighth position, and if we got to fifth that would really increase traffic, and the top three would amplify it further. This data is straightforward to pull: come into Search Console, click on the Performance report for search results, and I'll go for the last 12 months, so we're accounting for a little seasonality. Particularly if you're dealing with e-commerce and they have a spike at Christmas: if they sell Christmas widgets they're not selling a lot of them in April, so if we looked at six months that didn't include Christmas it wouldn't be a fair comparison. So we're looking at the last 12 months, and I do a little custom regex, mainly to filter down to the blog: it filters out the blog root but keeps all the blog structure. If a URL has /blog/ and then a slug afterwards it'll be included; if there's no slug after that, the root page, it won't be. Now, where things get a little messy is if you don't have this subfolder set up for blog content, and I highly recommend it. It has no real SEO value other than letting you go back, look at things from a macro level, spot bigger trends, and segment this information a lot more cleanly. If your blog hasn't really evolved, or you're not getting a ton of traffic, this would be the time to switch to that blog structure so you can parse things out a lot quicker and make bigger decisions around the blog posts; for these types of exercises it just makes everything cleaner. You can use tools like Screaming Frog and look for unique identifiers in the blog templates, but that's getting pretty advanced and I wouldn't recommend it. When we get this information out, we've got all the blog posts here, about a year of data; we export that, and from the output what we want is the Pages tab. You'll get a bunch of queries and search appearance data too, but what I really care about is which pages, how many clicks, and what kind of impressions they're getting. That information gets copied into "GSC Raw", and then we use some VLOOKUPs to look up this information, and if a page has no traffic, tag it as no traffic. Then I do some conditional formatting as well so I can quickly spot the top-performing posts and which ones are not. Like "How to Add Keywords to a Website": take a look at this one, because you'll see it's got over a hundred thousand impressions, it gets the most visibility in search results, but that's only led to 77 clicks. So there's probably a big, strong opportunity for us to refine it and get it into a stronger position, or to pay attention to how it looks in the search results and how we can increase that click-through rate. Any questions about getting the organic clicks and impressions into the spreadsheet?

All right, moving on: conversions. This can be a mixed bag with our client base; sometimes they're not tracking
conversions in GA. I pulled data from GA, and we're honestly not tracking conversions on this site, so I faked some goal completions in here, if you're wondering why the conversion rate doesn't track with the completions, just so we'd have some numbers to play with. We have clients that pull this data in from multiple sources, a CRM or anything along those lines, and then we'll pull in things like MQLs and SQLs and use that information; you can scale this up however you see fit. Just by copying this structure over we can create multiple conversion buckets, but for simplicity's sake I kept it to one and faked a little conversion data. We basically went into Google Analytics, looked at Landing Pages, and filtered down for /blog. Why landing pages? We want that first point of entry, and the report also includes goals, so when we export this list we get all that information. One thing you need to pay attention to: it will only export as many rows as you have shown. So if you look at the number down here and see 60, set Show Rows to 100 so one CSV export lists all the pages. Then we pull that information and plug it in here. Now, there's an interesting quirk in lining up GA and GSC data: GA tends not to include the domain name, so there isn't a clean VLOOKUP between the two. I have to do a little fanciness to make this work, and I'll copy the formula into the chat as well. The cool thing about Sheets, and maybe Excel does this too, I just never work in it, is that we can do a REGEXREPLACE. If you're not familiar with VLOOKUP, we're vertically looking up this URL, but we're replacing my domain with nothing so that it lines up with the way Google Analytics pulls the information out. That makes the variables line up, and then we can pull the information in; sorry, I meant to be on conversions, that's what I was looking at. So we're pulling from A2 to I, I being the last column, the one with the completion data, and then we can pull those numbers in. The last thing we want is to kill something that's converting but doesn't have a lot of traffic. Say this post: we wouldn't want to kill it, because it could be high-intent, bottom-of-funnel traffic. It's not going to drive a ton of queries or traffic, but the traffic that's coming is qualified. The other thing I'll say is GA4 does not do the landing-page attribution as cleanly, where you can get that conversion data, so this aspect of tracking conversions will not work if you're pulling from GA4; you'll have to come up with another way of dealing with that, or track conversions differently. I've been exploring Heap Analytics as a potential replacement for my own personal site, and I think I dropped the link in the chat; great team doing some really cool stuff, and they have a free package at 10K monthly sessions, so if your personal site isn't driving a lot of traffic it could be a starting point to work with. Our clients have their own systems in place: they're either using GA4, a combination of things, or they're already using Heap to track this kind of information.

Okay, all right. Another aspect we like to look at, so I've talked about traffic, visibility, and conversions, is page views. Again we're using a Google Analytics metric, and what we're trying to determine is whether a page has some other purpose we may not be aware of. For example, this page right here, "Every Decision Is a Marketing Decision" by Joel
Kelly, had 800 page views. So even though it only drove three organic clicks, it's part of the user's journey, and we don't necessarily want to kill it. To pull this type of information we're back in Google Analytics, but looking at the All Pages report. I'll let this load; we're filtering down for /blog again, we've got a year's worth of data, and then we'll make the row count big enough: we've got 63 here, so we can go up to 100. We'll export that information and pull it back into a tab called "GA Raw". If you want to reference any of the other information you're more than welcome to, but I'm looking at page views here. I want to make sure we're not killing something that's part of the user's experience or has another purpose. Did it drive a pile of social traffic we weren't aware of? We want SEO to inform things, but we don't want SEO to take over. All right, so that's page views; any questions about that?

All right, the next two things I'm going to look at are authority and top keywords. I've used Ahrefs data for this; it's a paid tool, I think about 100 bucks a month US. You can do a lot of this work manually; I do know that Links is part of Search Console, and I find it's not as accurate as Ahrefs, but you can export that information and cross-reference the links coming in. The number of links doesn't necessarily equal the authority a page holds, though. The thought process behind all this: if a page has gained authority but isn't necessarily driving traffic, it's serving another purpose; you can transfer that authority from that page to other pages, so it has a purpose that just isn't apparent at first sight. We don't have a whole lot driving authority here. Our URL ratings: "How to Add Keywords to a Website" is one, and then our Google Search Console performance-for-beginners post; those are the two driving some traffic, and both look to be performing as well, so we wouldn't want to kill those. To pull that information in Ahrefs, you come down to Best by links and just export it. What you want to use is the URL Rating: what strength does this one page on your website have? The higher it is, the better. The other one I like to pull is Referring Domains, because it lets you know how many domains have backlinks pointing at the page. And then the other piece of information I like to pull from Ahrefs is the top keyword. The thought behind this: say the Allison post, about creating a website with clickable elements, which really came from a guest talk from Allison K Consulting, has its top keyword sitting in 13th position. I don't necessarily think I'm going to optimize for that one, because I can't see the value in it and the query isn't that big, but I might want to filter it down and look a little deeper. If you're dealing with a smaller data set you can also take this information and reverse-engineer it in Search Console to find out what kind of keywords are driving it, but first I'll run you through how I pull it from Ahrefs. So we can pop that up, and, oh, I don't think I showed you, but this is the authority export we're using for Ahrefs, and the next tab we'll get into is "Ranking Raw". We're looking at Top Pages, and if you want to get fancy you can start filtering: if you really only care about the US, you can take this down from all countries to just the US, or Canada, or if they're very geographical, say you're focusing on India, you can filter for that, and then it'll only
focus on the ones you want to focus on. We export this information, plug it in here, and then do the same kind of VLOOKUPs: look up the unique identifier and fill in the top keyword, volume, and position. But let's say we don't have Ahrefs data and we really want to know what something's performing for. I'll run this one through, and what I'd recommend is: get rid of some of the filters, add just this page, still looking at 12 months, and look at Queries to find out which ones are driving the most traffic. "gsc report", "search console performance report", and "gsc performance": those are the three big ones driving it. You can click up here to get average position and find out where they rank. Now, this is averaging worldwide, so we want to consider countries as well; we focus a lot on the American market, and just by clicking on these you can filter down to a specific country. I've got it filtered for the US now, and jumping back into Queries, "gsc performance" and "google search console performance report" are the two driving the most traffic in the US. We show up a lot more for the GSC report query, but we're also a lot higher in position, like fourth. This one we might want to pay attention to because we're in eighth position; I actually checked this in another tool and we had hit first. Now, this is looking at the last 12 months; what I find with rank is you really want to look at the last seven days to get an idea, because otherwise it aggregates everything over the last 12 months, so how we performed in January informs that average. Looking here, "gsc performance" isn't looking as good as it did before, but 5.2: we know we're in fifth position on average in the US for this page over the last seven days, and if we can get that into the top three it's really going to drive some traffic. So I wouldn't want to kill this post, I'd want to optimize it; no worries Kenny, thanks for dropping by, man; when we're looking at something like fifth position, getting up to the top three is really just going to amplify that traffic. So yeah, that's how I'd dig into keywords with a smaller set of information. If we were dealing with something like more than 500 posts, I'd do this at scale with Ahrefs; Moz and Semrush both have equivalent reports that do the same thing. Any questions so far?

"I have a question, Mike. When you say you're figuring out which ones to kill, do you mean totally taking them off your website? What are you killing, and why are you killing them?"

I should stop using the word kill. Purge? How's purge? No, it means the same. Yeah, we're basically deciding which posts we want to get rid of, and that was the point of this exercise: going through and finding the deadweight content, which is another common term for this kind of exercise. Great question, Kelly, because now we're transitioning over into which posts I want to get rid of in 2023. If I look at this information I can start filtering down; let's say, just for the sake of argument, we only want things that have driven some sort of click. Actually, let's clear that and see if these posts make sense to get rid of. We've got three posts that have not driven any organic traffic, and I would likely recommend we get rid of them. Like the coronavirus one: it's going to be a little topical, and I don't think anybody needs to know anything else about remote work and coronavirus; we've already been there. "How
Limitations Led to Creativity", and "Managing Client Expectations on Local Search Results": we don't even deal with local anymore. Those three posts, to me, are initial candidates for removal, and all these other data points will inform that. We want to make sure none of them has a URL Rating, so they can go; no referring domains; they've had a little bit of page views, so they are part of the buyer's journey, but not a big part of it, so I'd say we're safe there; no conversions, so they're not driving anything there. It looks like we did an update on them in May, but not to a point where it turned into traffic. The last thing I'd check is whether there's a top keyword in a position we'd like to hold, and right now I'm seeing no. So in this spreadsheet I'd say go for all three, then go through and find a page that makes sense to redirect each of them to, flag that in there, and pass it off to whoever's handling redirects, be it yourself or somebody else if you have to leverage a third party. So we can remove this piece of content and redirect it. Does that answer your question, Kelly?

"Kinda. Is there any harm in keeping it on your website? What's the harm in keeping it there?"

A couple of blog posts isn't going to make a big difference; it's when you're dealing with 10,000-plus pages, or a couple hundred. We had a client back at the end of summer with about 500 blog posts, and doing a process like this we determined 85 of them drove all the value for the website. There were also some issues, because the content team had kind of run amok with it. If you've ever heard the old classic SEO joke, "an SEO walks into a bar, a pub, a watering hole...": they had created pages for every possible variant of a subject, and Google's algorithm is smart enough to understand that all those things mean the same thing, so they hadn't concentrated their efforts. So, identifying which ones drove the same type of traffic for the same type of queries, we found the one that performed best, had the others reviewed by a content strategist to determine whether there were pieces that could be amalgamated into the main post we're keeping, and then deleted the other posts. Say there were three in total: we redirected the other two to the main one we're keeping, and now we've got a unified approach for those queries and queries like them. So in that case, yes, having multiple pages more or less talking about the same thing was not solidifying our approach and was weakening the signals. Another argument, and I can get SEOs up in arms here, is crawl budget; I find it only really matters when you're getting into 10,000-plus-page sites. That being said, if you can focus Google's effort on what you actually want to rank and what actually performs, if you had 500 blog posts and got it down to 80, and those 80 actually resonated with your end users, it helps position your website as a subject matter expert, with the rest of the content supporting that main subject.

"Okay, I understand that. And what about the tools in Ahrefs and Semrush and such, where you can go check the most popular pages and it shows a whole bunch of data with that? How accurate is that compared to what you have in your spreadsheet? Is it similar, or totally out of whack?"

It's all similar. When I pull Ahrefs data I don't use their traffic indication, because their traffic is an assumption based on position and the average clicks for that position. Say something got a thousand monthly searches
and you had third position for it: they would assume, I forget the exact ratios, but let's say 12% of the traffic went to you. That doesn't actually translate. It's a leading indicator, not an absolute, and something like Search Console gets you a little closer, because it's showing you who's actually clicking through based on position. Now, Search Console is still sampled, so it's not a hundred percent, but it's a stronger leading indicator that things are going to perform. So we use some of those pieces of information, but we also use pieces we have directly from Search Console and Google Analytics to inform our decision making. Something in third position that's getting a lot of clicks but hasn't moved up to first position yet will eventually get there; even if it's at the bottom of the first page, if it sticks out to people, resonates with the end user, and drives people to click through, we'll know it's a driver, whereas in Ahrefs, Moz, or Semrush, eighth position would not look like a lot of traffic. And the other thing: if we're in eighth position and getting a lot of clicks relative to how many queries there are, with time we're going to go up; at least in our experience we've seen the position increase, and that just amplifies things. Great question, thank you.

"That makes sense, okay, awesome. Any other questions about this process or content audits? You talked about which ones to get rid of; are you going to talk about which ones to tackle first in terms of improving, for example?"

Yeah, great question. So what I'll do is take position as our main filter. One thing I'll look at: filter by condition, "between", let's do three and eleven. There we go: these posts have a top keyword in third to eleventh position, which tells me they're getting eyes on them because they're in the search results. Particularly since our clients are mostly desktop-based users, I consider a lot of this in desktop search; if it was mobile it would probably be only the top three positions. This is what I consider optimizing for click-through rate: I'd focus on what the search results for these types of queries look like and how I can differentiate us. We actually looked at this one on Monday, and if I run "gsc report" now, we do better in Canada than in the US; you'll see we're up here. I was thinking I could break this out with HowTo schema and get some cards down here, and what that would do is draw the eye, particularly in the US where we're in more of a packed scenario. I'm not necessarily optimizing the piece of content itself; I might still explore what the other search results look like, but the big opportunity, when it's on the first page but not driving clicks, is getting eyes on it in the search results so it gets more clicks and a higher position. So that's the first thing I'd look at. The other thing I'd look at: I'd go from 10 to 20.
and these ones we want to optimize A Little Bit Stronger so if we get that Allison one back again Allison wants to create a website with clickable elements if we were gonna opt we were gonna We would optimize content like this that's sitting on the second page and Beyond to try to get it into the first page we'd also look at Authority opportunities to build up the backlink portfolio so that we can strengthen those signals to get it from the second page onto the first page then we get it onto the first page we can focus on click-through rate so we're optimizing to get to the first page and then we're optimizing content and those ranking signals to get into the first page and then we're paying attention to what it's how it shows in the search results to increase its click-through rate when it's on the first page to try to get into a better position and the same argument can be made for a second page and Beyond from an optimization standpoint it's just going to be a longer play so that's how I would prioritize things I'd look at click through rate first because you're going to get some strong returns right off the bat then I would look at Striking Distance so second page onward so we can optimize to get them on the first page and then monitor to see when they hit that first page and then go back and look at click-through rate and then I would then I'd start looking at the ones that are second page Beyond can I ask another question oh 100 please do with this example here you said you'd go back and you need to optimize it more and I'm assuming when it was posted you optimize it are you going back in changing some of what you optimize it for like you choosing a different are you changing something that you chose originally or what are you optimizing now that it's up there or are you just refreshing it uh refreshing it optimizing it but looking to see kind of what like is there exact phrase queries we could go for um is there um other variants of the the query that we can 
mention in there. One thing I will say about this type of content: because this is an output from somebody doing a talk, we didn't optimize it beyond general SEO best practices. We released it into the wild as is, because the content had already been created, and it's also a guest spot with somebody else writing it, so it gets a little messier when we're trying to optimize these ones; I don't want to misrepresent what they're doing. If I were to optimize this, I would tie it into an authority play as well: reach out to the original author saying, "Hey, we're trying to get this post better positioned. We're going to change a few things and wanted to see what you think. Would you like to have some input on this post as well?" That way it's a bit of an ego-bait play: they're part of the content process and more likely to share it on their own social media when you're ready to publish the new version.

But yeah, in the best-case scenario we would optimize as it goes out. If it's user-generated content, though, or something like material where we've talked to the sales team and we're just trying to figure out what the common questions are, I would release it as is, go back a month later to see what traction it's getting, and work with what Google is associating it with instead of trying to set that up at the get-go.

So you're adding content to it, refining it?

Yeah, it depends what the search results look like. If it's a 500-word post and there's an opportunity to make it more involved, or there's other subject matter the top results cover that we don't, we would create those as new parts of it, or provide direction for people to create it that way.

Okay, great question. Any other questions?

So did I hear you right, Mike, that for the stuff showing up on the first page you want to look at which ones have decent click-through
rates? But what do you do if it doesn't have a good click-through rate?

We're trying to find ways to increase its click-through rate: how can we draw the eye to it? For this one, as an example, I was talking about schema, but it could also be questions like: are we getting truncated in the results? What do the other results look like? Our result in here is doing well because Google's own page is just churning out reports at a glance; it's more or less automated, where ours speaks more to the end user. Google's documentation is geared toward people who already know how to use Search Console. I don't know if anybody's ever used Google support, but I find it frustrating coming in net new; it's only really useful if I already know the product. Our post targets somebody at an earlier stage with Search Console: how do I read this GSC performance report? The fact that it speaks to that how-to aspect could be leading more people to click through.

There's an opportunity for me to make a video on this and mark it up with schema. We could get a potential thumbnail in the results, and that thumbnail will draw the eye and increase click-through rate. Star ratings are another one we've leveraged in the past: if you get star ratings on your search result, you get little yellow stars, and against a wall of blue and black text on white, those little yellow stars really draw the eye. So even in third or fourth position, people go, "Oh, what's that?", read more on it, and click through. That increased click-through rate tends to lead to a higher position.

Yep. And for the stuff hitting the second page, you said look for authority opportunities, and you said something else as well that I didn't catch.

General optimization. I'd be more involved in my optimization: taking a look at that content and finding out what differentiates us from the pack. What can we do that the first page isn't doing? What do the first-page
results look like? What can we answer? What are they doing that we're not, and what can we do that they're not? I want a mixture of all that.

Okay, yeah, awesome. How often do you do the audit?

We've been trying to get into an annual cadence, so this is my time of year to do it. December is slow, and there's a new content initiative in January, so this can inform what's going to get refined, tell us what we're going to kill (sorry, purge), and also let us know where we might have gaps we want to fill. It's a good way to kick off January, particularly if you come into January with a bunch of refinement projects: people are just getting off the Christmas break, they're not quite into it yet, and they can go in and do what I call podcast work. You've got a three-hour podcast on and ten blog posts that need some simple tweaks, so you can do refinements, whereas net-new content is a bit more involved and requires teams like yourself, Emily, or in-house people to generate it. So maybe a little less podcast-friendly and more actual-podcast-listening focused. Not that podcasts aren't a good content initiative. So yeah, I like to run the audit in December when it's a little slower and use that information to inform the early work in January as we're ramping up other campaigns.

And do you do something very similar when you're reviewing on a monthly or quarterly basis to see what needs to be tweaked?

I would still look at a full year, even if I was doing it quarterly. The main reason is to account for seasonality: it's not fair to look at November and December and compare them to, say, September. I don't know if that's a great analogy, but you just don't want to look at the wrong time frame and kill something that could be performing well. I did a project a while ago
for a client that did a lot of recipes, and a lot of them were holiday recipes. We couldn't look at any information except from pretty much American Thanksgiving until December; that's when all the traffic was driven. But we always looked at the full year too, because they also had a few summer recipes that would really kick things off. I do use a similar process, and I'll pull in other data sources; I like to have three or four. I wanted to simplify this version of it so people could use it without a whole pile of data sources. There are a few tools, like SEOTesting.com, that take the GSC data and parse it out to give you queries per page, which out-of-the-box Search Console doesn't do. I like to have that source in there, along with Ahrefs, because then we can associate actual clicks with each query and sort by the top query by clicks, not just by position and an assumed traffic amount derived from position, search volume, and the average click-through rates for those positions.

Yeah, okay, great.

That's the general gist of how I perform a content audit. Wow, I've never come this close on timing. Any other questions from either of you?

That's it for me. Thank you very much, Mike.

My pleasure. Awesome to have you both in here, and I'll give you eight minutes of your day back. Oh, I've got to do the general spiel of thanking Digital Nova Scotia for sponsoring us. If you're not
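The prioritization discussed earlier (first look at first-page results with weak click-through rate, then "striking distance" pages sitting on the second page) can be sketched as a simple triage over a Search Console export. This is a minimal illustration, not the speaker's actual tooling: the column names, the 3% CTR cutoff, and the position ranges are assumptions you would tune to your own data.

```python
# Hypothetical GSC export rows: page, query, clicks, impressions, avg position.
def prioritize(rows):
    """Split rows into CTR fixes (page one) and striking-distance targets (page two)."""
    ctr_fixes, striking_distance = [], []
    for row in rows:
        ctr = row["clicks"] / row["impressions"] if row["impressions"] else 0.0
        if row["position"] <= 10 and ctr < 0.03:
            # Already on page one but under-clicked: work on titles,
            # descriptions, schema, anything that draws the eye in the SERP.
            ctr_fixes.append(row)
        elif 10 < row["position"] <= 20:
            # Second page ("striking distance"): deeper content work and
            # authority/backlink signals to push it onto page one.
            striking_distance.append(row)
    return ctr_fixes, striking_distance

sample = [
    {"page": "/gsc-report", "query": "read gsc performance report",
     "clicks": 12, "impressions": 900, "position": 6.2},
    {"page": "/clickable-elements", "query": "website clickable elements",
     "clicks": 3, "impressions": 400, "position": 14.1},
]
ctr, sd = prioritize(sample)
```

Pages that fall in neither bucket (ranking well with healthy CTR, or beyond position 20) would be left alone or queued for the longer-play optimization described above.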
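The two rich-result types mentioned in the click-through-rate discussion, a video thumbnail and review stars, are both driven by schema.org structured data embedded in the page. Below is a minimal JSON-LD sketch with placeholder names, URLs, and values; it shows only a simplified subset of the fields Google's rich-result guidelines ask for, not a complete markup.

```python
import json

# Placeholder VideoObject markup: this type is what can earn a video
# thumbnail next to the result in search.
video_schema = {
    "@context": "https://schema.org",
    "@type": "VideoObject",
    "name": "How to read the GSC performance report",  # hypothetical title
    "description": "Walkthrough of the Search Console performance report.",
    "thumbnailUrl": "https://example.com/thumb.jpg",   # placeholder URL
    "uploadDate": "2023-12-01",
}

# Placeholder AggregateRating markup: this is what can earn the yellow
# star rating in the result snippet.
rating_schema = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Product",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": 4.7,
        "reviewCount": 132,
    },
}

# Either block would be embedded in the page as:
# <script type="application/ld+json"> ... </script>
print(json.dumps(video_schema, indent=2))
```

Valid markup makes the page eligible for the rich result; Google decides whether to show it, so this is a "draw the eye" opportunity rather than a guarantee.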
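The seasonality point (compare a month to the same month last year, not to an adjacent month) is simple arithmetic, but it is the difference between keeping and killing a seasonal page. A toy example with made-up click counts:

```python
# Made-up monthly click totals keyed by (year, month).
monthly_clicks = {
    ("2022", "11"): 4200, ("2022", "12"): 5100,
    ("2023", "09"): 900,  ("2023", "11"): 4600, ("2023", "12"): 5400,
}

def yoy_change(month, this_year, last_year, data):
    """Percent change for the same month, year over year."""
    prev, cur = data[(last_year, month)], data[(this_year, month)]
    return (cur - prev) / prev * 100

# November vs last November is a modest, healthy change, whereas November
# vs September of the same year would look like a misleading seasonal spike.
change = yoy_change("11", "2023", "2022", monthly_clicks)
```

With the numbers above, the year-over-year November change is about +9.5%, which is the signal you actually want for a holiday-heavy page.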
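The per-page query grouping described at the end, which tools like SEOTesting.com layer on top of GSC data, amounts to a group-and-sort: collect every query under its page, then rank queries by actual clicks rather than by position with an assumed traffic figure. A toy sketch with made-up rows, not the tool's actual implementation:

```python
from collections import defaultdict

# Hypothetical GSC rows, already joined on page and query.
rows = [
    {"page": "/gsc-report", "query": "gsc performance report", "clicks": 40},
    {"page": "/gsc-report", "query": "search console tutorial", "clicks": 25},
    {"page": "/clickable-elements", "query": "clickable website elements", "clicks": 7},
]

def queries_by_page(rows):
    """Group queries under each page, sorted by real clicks, descending."""
    grouped = defaultdict(list)
    for r in rows:
        grouped[r["page"]].append((r["query"], r["clicks"]))
    # The top query for each page is the one with the most actual clicks.
    return {page: sorted(qs, key=lambda q: -q[1]) for page, qs in grouped.items()}

top = queries_by_page(rows)
```

The first entry in each page's list is the "top query by clicks" the speaker sorts on, which can differ from the query with the best average position.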