How much Googlebot time are you wasting?
by, 02-24-2009 at 06:13 PM (7247 Views)
[B]Are you making the best use of Googlebot's time?[/B] If not, you are negatively impacting your SEO. One way to ensure that the time Googlebot spends on your forum is not wasted is to get rid of inefficient, non-unique content. My focus here is on [B]showpost pages[/B].
While they are not "exact" copies of any other page on your site, all of the post content is already displayed on your thread pages. So they are [I][U]partial duplicates[/U][/I].
[B]If you want to find out how to remove them, check out this article:[/B]
[I]Before you go ahead and do it, you should understand how much of Googlebot's time you are actually wasting. Let's work through an example first.[/I]
[B]Let's pretend you have a forum with:[/B]
[LIST][*]10,000 threads[*]1,000,000 posts[*]20 posts max per thread page[/LIST]
In this example, including multi-page threads, we could [I]guesstimate[/I] there would be about 50,000 thread pages in total. [U]You can determine this exactly using your sitemap generation reports[/U].
[B]What percentage of total pages are actually thread pages with unique content?[/B]
[LIST][*]50,000 / (1,000,000 + 50,000) * 100 = [COLOR=Red][B]4.8% Unique Only![/B][/COLOR][*]In this example, the crawling efficiency is ONLY 4.8%. [U][B]It is 95.2% INEFFICIENT![/B][/U][/LIST]
[B]How does that inefficiency delay SEO results for you?[/B]
[LIST][*]Let's say Googlebot crawls 1,000 of your pages every single day.[*]If you enable showpost pages, how long would it take to complete?[/LIST]
[quote]1,050,000 pages / 1,000 Per Day = 1,050 DAYS... [B][COLOR=Blue]OR ALMOST 3 YEARS![/COLOR][/B][/quote]
[LIST][*]If you remove showpost pages, how long would it take to complete?[/LIST]
[quote]50,000 pages / 1,000 Per Day = 50 DAYS... OR [B][COLOR=Blue]JUST UNDER 2 MONTHS![/COLOR][/B][/quote][U][B]Now you can understand the impact that 95.2% inefficiency (in this example) can have on your SEO. Check out some other factors here.[/B][/U]
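The crawl-time arithmetic above can be sketched in a few lines of Python. The function name and the 1,000-pages-per-day crawl rate are illustrative assumptions, not anything Google documents:

```python
# Hypothetical helper: how many days Googlebot needs to visit every page,
# assuming a constant crawl rate (pages per day). The rate is an assumption
# for illustration; real crawl rates vary per site.
def days_to_full_crawl(total_pages, pages_per_day=1000):
    return total_pages / pages_per_day

# With showpost pages: 1,000,000 posts + 50,000 thread pages
print(days_to_full_crawl(1_000_000 + 50_000))  # 1050.0 days (almost 3 years)

# Without showpost pages: 50,000 thread pages only
print(days_to_full_crawl(50_000))              # 50.0 days (under 2 months)
```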
[B]Formula for Your Forum's Average Crawling Efficiency[/B]
[LIST][*]P = Total Posts (Public)[*]T = Total Thread Pages (Including multi-page threads). Get from sitemap generator report.[/LIST]
Once you have both numbers, use the following formula:
[SIZE=4][COLOR=Blue][B]Crawling Efficiency = T / (T + P) * 100[/B][/COLOR][/SIZE]
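The formula is easy to check in Python. This is just the expression above wrapped in a function; the name is mine, not part of any API:

```python
# Crawling Efficiency = T / (T + P) * 100
# T = total thread pages (from your sitemap generator report)
# P = total public posts (i.e. potential showpost pages)
def crawling_efficiency(thread_pages, posts):
    return thread_pages / (thread_pages + posts) * 100

# Example forum from above: 50,000 thread pages, 1,000,000 posts
print(round(crawling_efficiency(50_000, 1_000_000), 1))  # 4.8

# vBSEO.com figures (if showpost pages were left enabled):
print(round(crawling_efficiency(19_571, 112_200), 1))    # 14.9
```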
[LIST][*]At vBSEO.com we use 15 posts per page.[*]Our sitemap generator reports 19,571 showthread pages total (public).[*]We'll guess that 60% of our 187,000 posts are public, so 112,200.[/LIST]
[B]Our efficiency is 100%[/B], because [I][U]we replace showpost links with Permalinks[/U][/I].
However, if we did NOT do this, our efficiency would be approximately:
[quote]19,571 / ( 19,571 + 112,200) * 100 = [B][COLOR=Red]14.9% Crawling Efficiency[/COLOR][/B] OR 85.1% INEFFICIENCY.[/quote][B]If Googlebot hits only 1,000 of our pages a day:[/B]
[LIST][*]We would be completely re-indexed within [B][COLOR=SeaGreen]20 DAYS[/COLOR][/B] since we have Permalinks enabled (replacing showpost).[*]If we kept the showpost pages in place, it could take [B][COLOR=Red]4 to 5 Months![/COLOR][/B][/LIST]