
    Website Navigation

    31 January 2023

    Website navigation is one of the most important factors for both SEO and user experience (UX). A well-organised, well-designed website with straightforward navigation will not only improve your site’s UX, but will also help the search engine bots crawl a website by helping them find all the site’s content and making more efficient use of their crawl budgets.

    Factors to consider when assessing and improving website navigation include:

    • Site depth
    • Site architecture
    • URL structure
    • Breadcrumb navigation
    • Internal search
    • Including a sitemap and using robots.txt
    • Design/UX

    The basis of a well-organised website is providing the optimal UX and creating a logical, hierarchical structure for content.

    Site depth

    There are two basic types of structure: deep and shallow. On deep-structured websites, content is far from the homepage and users need to click multiple links to get from the home page to the page they’re searching for.

    This can make it confusing for users to reach pages and revisit them. Deep content is also less likely to be found and indexed during a search engine’s crawl. We know that the number of clicks a web page is from the home page is a ranking factor, so it makes sense to have your most important content relatively few clicks away from your home page!

    A shallow site structure allows users access to the majority of the website’s content within two or three clicks. This makes the website succinct and easy to use, and also makes the job of search engine spiders easier, as they are not hunting through every nook and cranny for elusive pages.

    Even if your site hosts a great deal of content, there is almost always a simple method of laying it out. Amazon’s a great example of this. Millions of products, all of which can be filtered and found within a few simple clicks! However, thought should be given to the most efficient way to structure your website. Shallower is usually better.
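As a rough illustration, click depth can be audited with a simple breadth-first search over a site’s internal link graph. The pages and links below are made up for the example; in practice the graph would come from a crawl of your own site.

```python
from collections import deque

def click_depths(links, home):
    """Breadth-first search over an internal link graph, returning the
    minimum number of clicks from the homepage to each reachable page."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:  # first visit = shortest click path
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical site: every page here sits two clicks or fewer from home.
site = {
    "/": ["/services", "/blog"],
    "/services": ["/services/seo", "/services/ppc"],
    "/blog": ["/blog/website-navigation"],
}
print(click_depths(site, "/"))
```

Any page whose depth comes out high (or which never appears in the result at all) is a candidate for better internal linking.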

    Site architecture

    Site architecture refers to the way a website’s content is organised. There’s a fair amount of evidence that Google prefers sites where content is arranged by topic area. This is sometimes called content siloing, and it provides a more logical system for users (and search engine spiders) to explore the site. Google is quick to reward topical authority, so the more relevant content you can include under the same silo, the better.

    This could take the form of, for instance, adding new FAQ pages within a silo on a related topic rather than creating a separate FAQ section on the site where all FAQs across different topics are stored. This is particularly beneficial if your site’s content is so broad that a deep site navigation system is all but unavoidable.

    URL structure

    It’s also best practice to ensure that all of your URLs are logically named. Many content management systems generate web addresses automatically, and some do so in a numerical format based on how many pages came before. Making your URL relevant to the content of the page will improve its rankings, and improve UX by making it obvious to users what they will be reading when they click the link.

    At Go Up, we’ve seen examples where the only change made to a page, or set of pages, has been to optimise their URLs, and those pages have shot up the rankings as a result. Be cautious, though: while optimised URLs can bring benefits, Google has also warned that changing existing URLs carries risk, as redirects take time to process and rankings can fluctuate. So make sure to get advice from your SEO agency on this!

    Optimising your URLs, and putting some keywords in there, can also help users better understand the navigation of the site, improve UX and help with click-through rates. URLs feature prominently in the SERPs, so, if you have a good-looking URL, that page is more likely to get clicked on.
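As a sketch of the idea, a readable slug can be derived from a page title in a few lines. The `slugify` helper below is illustrative, not a feature of any particular CMS:

```python
import re

def slugify(title):
    """Turn a page title into a short, readable, lowercase URL slug."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse non-alphanumerics to hyphens
    return slug.strip("-")

# A descriptive slug beats an auto-generated numeric URL like /page?id=4821
print(slugify("Website Navigation: SEO & UX"))  # website-navigation-seo-ux
```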

    Breadcrumb navigation

    Alongside URL structure, breadcrumb navigation is one of the most effective ways to help the user know where they are on a site, breaking down the path taken to reach the current page. Breadcrumbs can be organised either by following the URL path, or by the architecture of your site, as outlined above. For example, if a user has reached our ‘SEO Pricing’ page, the breadcrumb would look as follows:

    Homepage – Web Marketing – SEO – SEO Pricing
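Breadcrumbs can also be exposed to search engines as BreadcrumbList structured data, which Google supports for rich results. A minimal sketch of that markup, with placeholder example.com URLs standing in for real ones:

```python
import json

def breadcrumb_jsonld(trail):
    """Build schema.org BreadcrumbList structured data from (name, url) pairs."""
    return {
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {"@type": "ListItem", "position": i, "name": name, "item": url}
            for i, (name, url) in enumerate(trail, start=1)
        ],
    }

# Hypothetical trail mirroring the breadcrumb above; URLs are placeholders.
trail = [
    ("Homepage", "https://example.com/"),
    ("Web Marketing", "https://example.com/web-marketing/"),
    ("SEO", "https://example.com/web-marketing/seo/"),
    ("SEO Pricing", "https://example.com/web-marketing/seo/pricing/"),
]
print(json.dumps(breadcrumb_jsonld(trail), indent=2))
```

The resulting JSON-LD would normally be embedded in the page inside a `<script type="application/ld+json">` tag.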

    Internal search

    Many websites use specialist internal search forms, which index all of the pages of a specific website and give the user the option to search for any page within it. They can be extremely useful for content-rich websites, where users want to quickly find the answer or page they’re after without having to work out which navigation path leads to it.

    Google specifically tells web publishers not to have pages that are only accessible through an internal search engine, so make sure that all pages are reachable through the site’s navigational structure. Google also recommends preventing these internal search result pages from being crawled at all, using a robots.txt disallow rule to keep Googlebot away from them. In its guidance on what to avoid, Google says:


    • “Avoid letting your internal search result pages be crawled by Google. Users dislike clicking a search engine result only to land on another search result page on your site.”

    We’ve seen a great many websites get heavily penalised, either manually or algorithmically, because they allowed Google to crawl and index their internal search pages!
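To check that a robots.txt rule actually blocks internal search pages, Python’s standard-library `urllib.robotparser` can be used. The `/search/` path below is an assumption; substitute whatever URL pattern your own site’s internal search uses.

```python
from urllib.robotparser import RobotFileParser

# A minimal robots.txt that keeps all crawlers out of internal search results.
# The /search/ path is a placeholder for your site's search URL pattern.
robots_txt = """\
User-agent: *
Disallow: /search/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("Googlebot", "https://example.com/search/?q=seo"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/blog/"))          # True
```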

    Include a sitemap and robots.txt

    A sitemap is a list of all of the pages on your website, usually in either .html or .xml format. The former acts as a comprehensive navigation aid for users, though if your site is shallow it’s not as essential as an .xml sitemap. XML sitemaps are submitted to Google via Search Console and form the starting point for a search engine spider’s crawl of your site. Your sitemap should omit pages which you don’t want spiders to crawl.

    Robots.txt is a file uploaded to a website which tells crawlers which parts of the site they may crawl. It highlights the location of the sitemap, and can also instruct crawlers to avoid certain pages or resources, such as images or videos, backing up the exclusions from your sitemap.
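As an illustration, a minimal .xml sitemap can be generated with the standard library. The URLs are placeholders, and pages you don’t want crawled are simply left out of the list:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Generate a minimal XML sitemap (sitemaps.org protocol) for the given pages."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical pages; anything excluded here won't be offered to the crawler.
pages = ["https://example.com/", "https://example.com/services/seo/"]
print(build_sitemap(pages))
```

The output would be saved as sitemap.xml at the site root and submitted via Search Console, with its location also referenced from robots.txt.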


    Design/UX

    This may sound obvious, but UX is a major ranking factor. You want your main site navigation to be intuitive for the user, with good, clean design, and the code behind it shouldn’t be glitchy. If the navigation bar appears on hover, for instance, make sure it doesn’t disappear when you hover over certain parts of it.

    A good, clean design will not only help Google understand your navigation structure, but will also give the user a more pleasant experience and help push them towards the bottom of the conversion funnel, improving your site’s conversion rate. On the other hand, if your navigation design is clunky, it can decrease your engagement rate, which, in turn, could see your rankings suffer.