Controlling internal link juice
As discussed in Link Metrics, link juice is the amount of positive ranking factor that a link passes from one page to the next. A very loose way of understanding it is this: Google determines how authoritative your website is based on a number of factors, neatly summed up as E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness).
One of the more important of these factors is a page’s link-based metrics. Many factors go into link metrics. In the SEO industry, we often refer to an all-encompassing aggregator of these metrics called ‘Domain Rating’ (DR), ‘Domain Authority’ (DA) or ‘Page Authority’ (PA). It is worth noting that Domain Authority is not an official metric used by Google, but a best guess made by the SEO industry. It’s based on the total amount of positive link metrics associated with a website as a whole.
Domain Rating is governed by the combined PageRank of all of the individual pages on a website. Every web page has its own PageRank – or, perhaps more correctly, ranking capacity – governed by the link metrics associated with that particular page. It is the link metrics of any given page that determine how much link juice it can pass to another. Read more about this in Link Metrics.
So, if a web page from, say, BBC News links to you, that page will probably have a high page authority. This is because millions of websites link to BBC News when they cite its articles. These millions of links pass a huge amount of link juice to BBC News, which then distributes it across its website via its internal linking structure. Because of that, each page on BBC News should have a decent page authority, and when any one of those pages links to you, it will pass a decent amount of link juice to your website. Websites with lower authority than BBC News – say, a local blog – will still pass link juice to you, just not so much. So, as a very generic rule, the ranking factor passed via a link from BBC News is far greater than that passed through a link from a local blog.
Link juice should be spread as evenly throughout a website as possible. Each important web page should have a good number of links pointing its way. Many web publishers make the mistake of concentrating all of their link juice on their home page, with only a trickle reaching pages deeper in the site. Obtaining deeper links has always been a good idea, but since the dawn of Google Panda it has become an absolute must.
So, here are the Go Up tips on link juice distribution best practice:
1. Don’t concentrate your link metrics on your home page
This is a very common mistake, made by in-house SEO teams and external SEO agencies alike. In their link building campaigns, when people request links to their website from other websites, they often request links only to their home page. This can make the home page rank highly in the search results, but it prevents other pages from ranking as well. It can also result in an unnatural, artificial-looking link profile – a pattern often associated with black hat SEO – which can look a bit spammy. In some cases, websites are actively penalised for having too high a concentration of links pointing to their home page.
Make an effort to weave incoming links throughout your web pages, not concentrating too much on any particular one. This is called ‘deep linking’, and is a positive ranking factor. Remember: if a website has a balanced link juice distribution, every page on the website will have a good chance of ranking well. This is by far the most effective way to target both your short tail and long tail keyword phrases.
In your link building campaign, make a table of how many links you have requested to different pages. Below are two examples: one of poor inbound link distribution and one of good inbound link distribution. Each table assumes that the web publisher has a website with 10 pages and is requesting 30 links. The pages are listed on a sliding scale of importance, so the home page is the most important web page and page j is the least.
Poor inbound link distribution
- Home page, 20 links
- Page b, 6 links
- Page c, 1 link
- Page d, 1 link
- Page e, 0 links
- Page f, 2 links
- Page g, 0 links
- Page h, 0 links
- Page i, 0 links
- Page j, 0 links
Good inbound link distribution
- Home page, 6 links
- Page b, 4 links
- Page c, 3 links
- Page d, 3 links
- Page e, 2 links
- Page f, 2 links
- Page g, 3 links
- Page h, 2 links
- Page i, 3 links
- Page j, 2 links
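A quick way to sanity-check a link-request plan like the ones above is to tally each page’s share of the total. The snippet below is a minimal sketch; the page names and counts simply mirror the two hypothetical tables:

```python
# Hypothetical link-request plans, mirroring the tables above.
poor = {"home": 20, "b": 6, "c": 1, "d": 1, "e": 0,
        "f": 2, "g": 0, "h": 0, "i": 0, "j": 0}
good = {"home": 6, "b": 4, "c": 3, "d": 3, "e": 2,
        "f": 2, "g": 3, "h": 2, "i": 3, "j": 2}

def home_share(plan):
    """Return the home page's share of all requested links."""
    return plan["home"] / sum(plan.values())

print(f"poor: {home_share(poor):.0%} of links point at the home page")  # 67%
print(f"good: {home_share(good):.0%} of links point at the home page")  # 20%
```

If two thirds of your requested links point at one page, the distribution is lopsided; in the good plan the home page still receives the largest share, but only just.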
2. Pass link juice from high performing pages to low performing pages
Link juice can be passed internally. If there’s a web page that’s performing particularly well (lots of visitors, a high browse time, well designed and lots of high quality inbound links), it’s able to pass link juice to a web page that’s performing poorly.
To pass link juice between two pages on your website, simply place a link between them.
There are exceptions, though. If the target is a low-priority page such as a ‘contact’ page, it’s best not to waste good link juice on it. Only pass link juice to a page if doing so makes search sense: a contact page is never going to rank for any meaningful transactional or informational keywords, so don’t try to make it.
3. Identify low-value pages on your website
As mentioned above, some pages are exceptions to the ‘spread link juice evenly’ rule. If a web page is going to provide you with no benefit in the Search Engine Results Pages (SERPs), it is better to tell the search spiders not to visit it or pass it any link juice. Such pages eat up valuable link juice and crawl allocation: search bots will only crawl a certain number of pages per website per visit, a limit that is not pre-defined and varies from website to website. If the bots spend their visits crawling low-value pages, you have wasted an opportunity to have your optimised pages crawled.
Usually, these low value web pages include:
- Contact page
- About page
- Pricing page
- Meet the Team page
- Terms and Conditions
- Thin content pages
- Any other low quality pages, or pages that violate Google’s Search Essentials Policies or Spam Policies.
4. Use Robots Tags, Robots.txt and NoFollow Tags
Using robots tags and robots.txt is a great way to prevent search spiders from crawling and indexing low-value pages. A Disallow rule in robots.txt tells the search engines not to crawl a page at all, while a robots meta tag placed in the HTML of the page lets them crawl it but tells them not to index it. You can also use the nofollow attribute on links to instruct the search engines not to pass any link juice onto the page from the linking pages. Nofollow is not as effective as it once was, though, as the search engines now treat it as a hint and often follow the link anyway.
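For illustration, here is what these directives look like in practice (the paths and page names are hypothetical):

```html
<!-- In the <head> of a low-value page: don't index it, but do follow its links -->
<meta name="robots" content="noindex, follow">

<!-- A link that asks search engines not to pass link juice through it -->
<a href="/terms" rel="nofollow">Terms and Conditions</a>
```

And the equivalent crawl-blocking rules in a robots.txt file at the site root:

```text
# Ask all crawlers not to crawl these paths at all
User-agent: *
Disallow: /terms
Disallow: /thin-content-archive/
```

Note that a page blocked in robots.txt cannot be crawled, so a noindex tag on that same page will never be seen; pick one mechanism per page.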
5. Make sure that no page is more than three clicks from the home page
If your site is under 800 pages, no single web page should be more than three clicks away from the home page. If it is over 800 pages, no page should be more than four clicks away from the home page. As the home page is often considered the most authoritative page on a website, search engines often interpret distance from the home page as a signal of page importance – the further away, the less importance. This affects search visibility.
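Click depth can be checked programmatically. The sketch below runs a breadth-first search over a small, hypothetical internal-link graph and reports each page’s distance in clicks from the home page:

```python
from collections import deque

# Hypothetical internal-link graph: page -> pages it links to.
links = {
    "/": ["/services", "/blog"],
    "/services": ["/services/seo"],
    "/services/seo": ["/services/seo/audits"],
    "/blog": [],
    "/services/seo/audits": [],
}

def click_depths(graph, home="/"):
    """Breadth-first search: minimum number of clicks from the home page."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in graph.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

depths = click_depths(links)
too_deep = [page for page, d in depths.items() if d > 3]
print(depths)    # depth in clicks for every reachable page
print(too_deep)  # pages more than three clicks from home (none here)
```

Pages missing from the result entirely are orphans – unreachable by internal links – which is an even bigger problem than excessive depth.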
6. Include a site map on your home page
A site map is a great way to show the search engines all of the pages of your website, one click from the home page.
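An HTML site map is simply a page of internal links, itself one click from the home page. A minimal sketch, with hypothetical page names:

```html
<!-- /sitemap.html, linked from the home page footer -->
<ul>
  <li><a href="/services">Services</a></li>
  <li><a href="/services/seo">SEO</a></li>
  <li><a href="/blog">Blog</a></li>
</ul>
```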
7. Consider using a silo structure
To help Google understand the hierarchy of your content – and thus allow it to reward the most important pages of your site – and to give users the best possible experience, we recommend considering a silo structure for your site content. This also helps ensure that your lower-priority pages are actively passing link juice to your higher-priority pages.
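A silo groups related pages under a shared section of the site, with each page linking up to its section landing page so that link juice flows toward it. A hypothetical URL sketch:

```text
/services/                        <- silo landing page (highest priority)
├── /services/seo/                <- links up to /services/
│   ├── /services/seo/audits/     <- links up to /services/seo/
│   └── /services/seo/link-building/
└── /services/ppc/
```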