Understanding How PageRank Decays & The Importance of Depth
PageRank, named after Google co-founder Larry Page, who began developing it during his time at Stanford, is the backbone of Google's algorithm. The methods for calculating PageRank have certainly expanded over the years, and by progressively addressing its weaknesses, Google has remained the most successful search engine ever created. Among SEOs, I find it very common for people to misunderstand PageRank or to have gaps in their understanding of it.
If you aren't already familiar with how PageRank works, the best place to start is by reading and understanding the original papers still available through Stanford's site: http://ilpubs.stanford.edu:8090/422/. Today, PageRank is much more complex than those papers describe; what was published there has grown and become more advanced in the 15 years since it was first written. With that said, when you start with what we know about PageRank and combine it with theory about how it is used and has evolved, much can be learned.
The most commonly misunderstood element of PageRank is what is usually referred to as 'page depth'. A page's depth is the minimum number of clicks from the homepage required to reach it. Developing a complete understanding of depth is one of the most significant foundations of SEO knowledge, yet it is widely underestimated and left by the wayside.
To demonstrate depth and its importance, let's investigate the following table: a hypothetical site with 1,000,000 units of inbound PageRank flowing to the homepage, where each page carries 50 links, a small number when considering the complexity of most sites today:
| Depth | Links per Page | PR Passed (raw) | PR Passed (15% decay) |
|-------|----------------|-----------------|-----------------------|
| 1     | 50             | 20,000          | 17,000                |
| 2     | 50             | 400             | 289                   |
| 3     | 50             | 8               | ~4.9                  |
| 4     | 50             | 0.16            | ~0.08                 |
The table shows both a raw 'PR passed' value and a value adjusted by a decay factor, typically described as a 10-15% loss of PageRank through each link. Notice that by the time you reach pages 4 clicks from the homepage (depth 4), the PageRank has dwindled to almost nothing; it's become massively depleted. This is significant because most sites have pages 4 or more clicks from the homepage, and in many cases, such as with ecommerce sites, your most important landing pages may sit 4 or more layers deep.
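The decay described above can be reproduced with a short calculation. This is a sketch using the article's hypothetical numbers (1,000,000 PR into the homepage, 50 links per page, a 15% loss per link); real PageRank is far more complex, but the compounding effect is the point:

```python
def pr_at_depth(homepage_pr, links_per_page, depth, decay=0.85):
    """PR reaching a single page `depth` clicks from the homepage.

    At each hop, the available PR is reduced by the decay factor and
    split evenly across all links on the page.
    """
    pr = homepage_pr
    for _ in range(depth):
        pr = pr * decay / links_per_page
    return pr

for depth in range(1, 5):
    print(depth, pr_at_depth(1_000_000, 50, depth))
# depth 1: 17000.0, depth 2: 289.0, depth 3: ~4.91, depth 4: ~0.08
```

Because each hop multiplies by `decay / links_per_page` (here 0.85 / 50 = 0.017), every additional click costs roughly 98% of the remaining PR, which is why depth 4 is effectively zero.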
If page depth so strongly determines the PageRank a page receives, how can we reduce depth and maximize our internal PageRank for the best results? There are two concepts I'd like to discuss here, because they cover the most common scenarios. I'll be writing from the perspective of an ecommerce site with categories and product pages, but these principles are universally applicable. Think of category pages as those that serve as a funnel to hundreds of additional pages, or products. Typically, categories are your depth-1 or depth-2 pages, receiving links either from the homepage or from an 'All categories' type page in the site navigation. These concepts apply to any site whose indexes are too large for simple fixes like linking directly to your most important landing pages from the homepage.
Pagination – Dated or alphabetized paginated pages are one of the most common ways that category pages link to product pages: a category page might organize products by type or by date. Unless it is handled appropriately, pagination also creates massive depth that dilutes the value that could otherwise be focused on the product pages.
The problem with Pagination
Pagination is like PageRank kryptonite. Here’s why:
The table above shows a category page with 4 products and paginated links at the bottom. A total of 10 links on the page means that each link receives 10 PR. Each of the 4 products receiving 10 PR is not bad, but what about the products on pages 2, 3, 4, 5 and so on? With page 2 receiving only 10 PR, each product on that page receives just 1 PR. And once you reach pages like page 4, which receives no link from page 1, the PageRank drops sharply: each of its product pages receives only 0.1 PR.
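The pagination math above can be sketched in a few lines. This toy model uses the example's round numbers (a category page holding 100 PR, 10 links per page, no decay factor) rather than anything Google actually computes:

```python
# Toy model of PR flow through a paginated category, assuming a category
# page holding 100 PR with 10 links on it: 4 products plus pagination links.
category_pr = 100
links_on_page = 10

pr_per_link = category_pr / links_on_page        # every link gets 10 PR
product_pr_page1 = pr_per_link                   # products on page 1: 10 PR each

page2_pr = pr_per_link                           # page 2 is one of the 10 links
product_pr_page2 = page2_pr / links_on_page      # products on page 2: 1 PR each

page4_pr = page2_pr / links_on_page              # page 4 is only linked from deeper pages
product_pr_page4 = page4_pr / links_on_page      # products on page 4: 0.1 PR each

print(product_pr_page1, product_pr_page2, product_pr_page4)
# prints 10.0 1.0 0.1
```

Every extra pagination hop divides product PR by another factor of 10, which is exactly the "kryptonite" effect: products only two paginated pages away are already down to 1% of the PR that page-1 products receive.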
As a baseline, you probably want all of your pages to have similar PR, but if you have really hot products that you're desperate to get more traction for, feature them on your homepage; this will ensure they get the most visibility possible. Another option is to create a 'Hot Deals' page where you feature your best products. Add Hot Deals to your navigation, and your most important product pages will gain more PR with each new link you create to them.
But how do you create an even baseline for all of your products? What if you have 500 products with images and content, enough to slow the page load beyond what users will tolerate? The best option is to use AJAX. Yep, an SEO just told you to use AJAX!
The first place to start learning how to correctly implement AJAX for SEO is the following Google post: https://developers.google.com/webmasters/ajax-crawling/docs/getting-started
What AJAX allows you to do is offer a pleasing user experience: either vertical lazy-load scrolling, where products appear on the page as the user scrolls down, or the more common 'View 10/20/50' options that alter the page length on demand. This ensures fast page loads as well as a terrific, familiar feel to the site. In the HTML snapshot you generate for Google, you then expose the maximum number of products. From an SEO perspective, this enables you to place 100 products or more on a page, optimizing the PR passed to those products while retaining the great user experience you need.
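To see why the single long snapshot page pays off, compare it to the paginated case using the same toy numbers as before (a category page holding a hypothetical 100 PR, no decay factor). With all 100 products exposed on one page, every product sits at the same depth and shares the PR evenly:

```python
# Toy comparison, assuming a category page holding 100 PR and 100 products.
category_pr = 100
products = 100

# Single snapshot page: every product link shares the category's PR evenly,
# and every product is exactly one click below the category page.
single_page_pr = category_pr / products
print(single_page_pr)  # prints 1.0

# Under pagination (see the earlier example), page-1 products got 10 PR but
# deep products got as little as 0.1 PR, two orders of magnitude apart.
```

The trade-off is deliberate: no single product gets the 10 PR that page-1 products enjoyed, but no product falls to 0.1 PR either, and none sits 4+ clicks deep.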
Another terrific and simpler option for pagination, which can work well when the paginated pages are relatively few (10 or fewer), is to use the rel="next" and rel="prev" tags to identify the pages as a paginated series. Essentially, Google takes the contents of the paginated series and treats them as a single consolidated page, for a presumably similar effect to the AJAX solution above. The complexity of this approach is much lower, so take a look at it as an option as well: http://googlewebmastercentral.blogspot.com/2011/09/pagination-with-relnext-and-relprev.html
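As a rough illustration of what those tags look like in practice, here is a small helper that emits the `<link>` elements for one page of a series. The `?page=N` URL pattern is made up for the example; use whatever URL structure your paginated pages actually have:

```python
def pagination_links(base_url, page, total_pages):
    """Return the rel="prev"/rel="next" <link> tags for one paginated page.

    Page 1 gets only a "next" tag, the last page only a "prev" tag, and
    every page in between gets both. The ?page=N pattern is hypothetical.
    """
    tags = []
    if page > 1:
        tags.append(f'<link rel="prev" href="{base_url}?page={page - 1}">')
    if page < total_pages:
        tags.append(f'<link rel="next" href="{base_url}?page={page + 1}">')
    return tags

for tag in pagination_links("https://example.com/widgets", 2, 5):
    print(tag)
# prints:
# <link rel="prev" href="https://example.com/widgets?page=1">
# <link rel="next" href="https://example.com/widgets?page=3">
```

These tags belong in the `<head>` of each paginated page; the key detail is that the first and last pages of the series omit the tag that would point outside the series.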