NAP (Name, Address, Phone Number)
NAP stands for name, address and phone number. For a search engine, these details together form a unique marker that distinguishes you from anyone else. The more you can solidify your identity, the better. By listing your NAP details consistently across the web, you demonstrate your presence in a certain area and, most importantly, your eligibility to rank for location-based searches performed there.
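NAP details are commonly published in structured data so search engines can read them unambiguously. Below is a minimal sketch using schema.org's LocalBusiness vocabulary in JSON-LD; the business name, address and phone number are invented for illustration.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Bakery",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "12 High Street",
    "addressLocality": "Manchester",
    "postalCode": "M1 1AA",
    "addressCountry": "GB"
  },
  "telephone": "+44 161 000 0000"
}
</script>
```

Keeping this block identical everywhere it appears (site, directories, social profiles) is the practical meaning of "consistent NAP".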
Cocitation
SEO cocitation refers to the symbiosis that occurs when websites discuss interrelated themes and concepts and mention each other. An important aspect of cocitation is the significance (for Google) of the words that surround links. Cocitation first prompted discussion years ago, when Google's anti-spam updates prevented low-quality blog networks from influencing search rankings.
Vertical Engines
Vertical engines are search engines that specialise in different types of search. In the generic Google model, examples of verticals would be Google's image search, location or map search, news search, and web search. In many respects, these should be considered separate search engines, all under the Google umbrella.
Schema Markup
Announced by Google in 2011, Schema markup is a form of microdata established collaboratively by the search giants Google, Bing, Yahoo! and Yandex. The aim has been to develop a specific vocabulary of tags that enables webmasters to communicate the meaning of web pages to the computer programs that read them, including search engines.
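In its original microdata form, the vocabulary is added as attributes on ordinary HTML elements. A minimal sketch, with an invented book title and author, might look like this:

```html
<!-- itemscope/itemtype declare the entity; itemprop labels its properties -->
<div itemscope itemtype="https://schema.org/Book">
  <span itemprop="name">An Example Title</span>
  by <span itemprop="author">A. N. Author</span>
</div>
```

A human reader sees only "An Example Title by A. N. Author", while a search engine can now tell that this is a book with a named author.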
Google Panda
Early in 2011, Google launched Panda, a search algorithm update that filtered out websites with thin, low-quality content. This was the start of a series of major quality-control checks. Google Panda stripped search engine results pages (SERPs) of poorly constructed, spammy content, enabling higher-quality websites to rise to the top.
Syndication Source Tag
In 2010, Google introduced two new meta tags for news articles: syndication-source and original-source. These tags were designed to enable news curators to publish another site’s article on their own site, without risking a Google penalty for duplicate content or plagiarism. The canonical tag now performs this function.
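In practice these were ordinary meta tags placed on the republishing site's copy of the article. A sketch of both the retired tags and their modern replacement, with invented example.com URLs:

```html
<!-- The original (now retired) news source tags, on the syndicated copy: -->
<meta name="syndication-source" content="https://example.com/original-article">
<meta name="original-source" content="https://example.com/original-article">

<!-- The modern equivalent: a canonical link pointing back to the original -->
<link rel="canonical" href="https://example.com/original-article">
```

In each case the tag tells the search engine which URL should be treated as the authoritative version of the story.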
Keywords And Keyword Research
From the earliest stages of your SEO campaign, you’ll be aware of keywords. These are single words or phrases which you want search engines to associate with your website, so that when those keywords are entered in search by a user, your site is visible. These phrases should be central to your business, and highly relevant to the service and information provided by your website.
Rich Answers
Also known as 'rich results', rich answers are becoming increasingly important to SEO, as Google users are coming to rely on them both for quick answers to simple questions and for breakdowns of complex topics. Google is constantly shifting and evolving, its developers coming up with new and intuitive ways to display answers to users' search queries.
Bounce Rate
Google Analytics defines bounce rate as the percentage of sessions in which a user left your site from the page through which they entered it, without interacting with it. Not only does a high bounce rate indicate that a business has lost a number of conversion opportunities, but there is also evidence to suggest that high bounce rates can damage a website's search visibility.
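The calculation itself is simple. A minimal sketch in Python, approximating a "bounce" as a session that viewed only its landing page (the session data is invented):

```python
def bounce_rate(sessions):
    """Percentage of sessions that viewed only their landing page,
    approximating the definition above (no further interaction)."""
    if not sessions:
        return 0.0
    bounces = sum(1 for pages_viewed in sessions if pages_viewed == 1)
    return 100.0 * bounces / len(sessions)

# Each number is the count of pages viewed in one session.
sessions = [1, 3, 1, 5, 2, 1, 1, 4]  # 4 single-page sessions out of 8
print(bounce_rate(sessions))  # 50.0
```

Real analytics tools refine this (for example, counting on-page events as interactions), but the ratio is the same idea.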
User Reviews
User reviews are an extremely powerful aspect of SEO and digital marketing. Not only is Google more likely to reward websites with positive user reviews with better rankings, but reviews can also act as a powerful promotional tool. 92% of consumers will read a review before they buy a product or service.
Local Search
Local search is a constantly evolving field. For businesses that need to draw customers to a physical location, ranking well in local search can be a critical consideration in any decent marketing strategy, and it has swiftly become a top SEO priority. Local search results are those which are relevant to a user based on their current location, or on a location they type manually into search.
Spam
Spam refers to a broad range of unwanted pop-ups, links, data and emails that we face in our daily interactions on the web. Spam's namesake is the (now unpopular) luncheon meat that was often unwanted but ever-present. Spam can be simply unwanted, but it can also be harmful, misleading and problematic for your website in a number of ways.
URL Parameters
URL parameters are used to indicate how search engines should handle parts of your site based on your URLs, so they can crawl your site more efficiently. This typically refers to query strings appended to a URL, i.e. yoursite.com/products?sort=price or yoursite.com/products?colour=red, where the parameterised versions may duplicate the content of the main page, or where certain versions of a page should not be showing up in search results at all.
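One common way to express this is a robots.txt file. A sketch, with invented paths (note that the `*` wildcard is an extension honoured by Google rather than part of the original robots.txt standard, and that Disallow blocks crawling rather than guaranteeing removal from the index):

```txt
User-agent: *
# Block crawling of parameterised duplicates of the same listing
Disallow: /*?sort=
Disallow: /*?sessionid=
# Block crawling of a section that should not surface in search
Disallow: /folder-two/
```

Google Search Console has also historically offered a dedicated URL Parameters tool for the same purpose.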
Hummingbird
Hummingbird is a search algorithm used by Google. It was first introduced in August 2013, succeeding the earlier Caffeine update, and affects about 90% of Google searches. Hummingbird is a brand new engine, but one that continues to use some parts of the old, such as Panda and Penguin. In terms of what Google is trying to do with this new engine, very little has changed: the focus is still on quality.
Branded Keywords
A branded keyword, or a branded search, is any query via a search engine that includes the name of your company, business or brand. Whether you're a new company or a big fish, you'll want to rank number one for your branded search. But it isn't always easy to be the top listing, even for your own brand.
Rel=Prev and Rel=Next
These attributes were introduced by Google in September 2011 to help tackle the problem of duplicate content. The rel="prev" and rel="next" link elements are added to the HTML code of a website to let search engines know that a certain collection of consecutive pages forms a single paginated series, and should be treated as a sequence rather than as duplicates.
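The elements live in the `<head>` of each page in the series. A sketch for the middle page of a paginated article, with invented URLs:

```html
<!-- On page 2 of a three-page article series -->
<link rel="prev" href="https://example.com/article?page=1">
<link rel="next" href="https://example.com/article?page=3">
```

The first page carries only a rel="next" element and the last page only a rel="prev", which is how the search engine recognises the boundaries of the series.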
Crawler
A crawler is the name given to a program used by search engines that traverses the internet in order to collect and index data. A crawler will visit a site via a hyperlink. The crawler then reads the site's content and embedded links before following the links away from the site. The crawler continues this process, following links from page to page, until it has visited and indexed every page it can reach. It essentially crawls the web, hence the name.
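The core loop (visit a page, record its data, queue any unseen links) can be sketched without touching the network by crawling a toy in-memory "web" of invented URLs:

```python
from collections import deque

# A toy "web": each page maps to the links embedded in it (invented URLs).
WEB = {
    "https://a.example/": ["https://b.example/", "https://c.example/"],
    "https://b.example/": ["https://c.example/"],
    "https://c.example/": ["https://a.example/", "https://d.example/"],
    "https://d.example/": [],
}

def crawl(start):
    """Breadth-first traversal: visit a page, read its links,
    then follow any links not yet seen - the core crawler loop."""
    seen, queue, index = set(), deque([start]), []
    while queue:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        index.append(url)               # "index" the page's data here
        queue.extend(WEB.get(url, []))  # queue the embedded links
    return index

print(crawl("https://a.example/"))
```

A real crawler replaces the dictionary lookup with an HTTP fetch and HTML parse, and adds politeness rules (robots.txt, crawl delays), but the traversal logic is the same.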
Meta Robots Tags
Robots.txt and meta robots tags are used by webmasters and search engine optimisation agencies in order to give instructions to crawlers traversing and indexing a website. They tell the search spider what to do with a specific web page; this may include requesting that the spider does not crawl the page at all, or that it crawls the page but does not include it in Google's index.
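A meta robots tag sits in the `<head>` of the page it applies to. A minimal sketch of the two most common combinations:

```html
<!-- Keep this page out of the index and do not follow its links -->
<meta name="robots" content="noindex, nofollow">

<!-- Index the page and follow its links (the default behaviour) -->
<meta name="robots" content="index, follow">
```

Because the spider must fetch the page to see the tag, "noindex" requires the page to be crawlable; blocking it in robots.txt at the same time would hide the instruction.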
Redirects
Whether your site needs to be moved to another server or your company is rebranding and renaming itself, your onsite content needs to be precisely redirected to its new home. If your redirects are not properly completed, your new site will be riddled with 404 File Not Found errors, which diminish user experience, leading to an increased bounce rate and, in turn, decreased rankings.
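On an Apache server, permanent (301) redirects are often mapped in an .htaccess file. A sketch, with invented paths and domains:

```txt
# .htaccess on the old site: map each old URL to its new home
Redirect 301 /old-page https://www.new-domain.com/new-page
Redirect 301 /about    https://www.new-domain.com/about-us
```

A 301 tells both browsers and search engines that the move is permanent, so the new URLs inherit the old pages' ranking signals; other servers (e.g. nginx) express the same mapping in their own configuration syntax.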
First Click Free
First Click Free is a Google tool that allows Google bots to crawl and index content held behind forms, predominantly on subscription or registration-only sites (i.e. those with paywalls) such as The Times or The New York Times. Introduced in 2008, it allows Google to gain access to behind-form content and allows these pages to appear in search engine results for relevant queries.
White Hat vs Black Hat SEO
The terms ‘white hat’ and ‘black hat’ are commonplace in SEO. Put simply, white hat refers to good SEO practices that search engines recommend, while black hat refers to bad SEO practices that break search engine rules. Though black hat techniques may appear to lead to quick and easy boosts in search rankings, they never pay off in the long run.
Accelerated Mobile Pages (AMP)
Google’s AMP is an open source project designed to improve the loading speed and readability of content for mobile users. Think of it as an upgrade to mobile-friendly pages. According to Google, AMP aims to improve mobile experience by getting information to load instantaneously, including rich content like video, animations and graphics.
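An AMP page is a restricted form of HTML with its own runtime and components. A stripped-down sketch (URLs invented; the mandatory AMP boilerplate CSS block is noted rather than reproduced in full):

```html
<!doctype html>
<html ⚡ lang="en">
<head>
  <meta charset="utf-8">
  <meta name="viewport" content="width=device-width,minimum-scale=1">
  <link rel="canonical" href="https://example.com/article.html">
  <script async src="https://cdn.ampproject.org/v0.js"></script>
  <!-- AMP also requires its standard boilerplate <style> block here -->
</head>
<body>
  <h1>Article headline</h1>
  <!-- Images use the AMP component instead of a plain <img> tag -->
  <amp-img src="photo.jpg" width="600" height="400" layout="responsive"></amp-img>
</body>
</html>
```

The canonical link ties the AMP version back to the standard page, so the two are treated as one piece of content.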
Directories
Directories were once an SEO staple. Conceived as the web's answer to the Yellow Pages, online directories played host to huge lists of links which would send link juice to the sites they listed, boosting the Google rankings of the pages they linked to. Google did not like this.
Google Penguin
Google Penguin is an algorithm update that was first announced in April 2012. The update aimed to reduce web spam, penalising websites that violated Google's Webmaster Guidelines by using black hat techniques to obtain links and manipulate search engine rankings. It also rewarded websites with high quality link profiles.
Knowledge Graph
The Knowledge Graph is Google's system for organising information about millions of well-known entities (people, places and organisations) to build a map of how information is interconnected. It's a knowledge base used by Google to enhance its search results with human language technology and the semantic web.
Anchor Text
Anchor text is the clickable, visible text of a link, readable both by the user and by search engines. How a link is described in its anchor text is widely considered a significant ranking signal, and it remains an integral part of any content marketing or SEO campaign.
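The difference is easiest to see side by side. A sketch, with an invented URL:

```html
<!-- Descriptive anchor text tells users and search engines what to expect -->
<a href="https://example.com/guide-to-local-seo">guide to local SEO</a>

<!-- Generic anchor text carries far less relevance -->
<a href="https://example.com/guide-to-local-seo">click here</a>
```

Both links point to the same page, but only the first tells a search engine what that page is about.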
Cache
In a computing context, cache refers to the temporary storage of data, usually to speed up retrieval on subsequent loads. In search specifically, "cache" refers to a web cache: usually HTML pages and images that are stored either by the browser or by the search engine to reduce bandwidth.
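Whether and for how long a browser caches a page is controlled by HTTP response headers. A sketch of a typical cacheable response (values illustrative):

```http
HTTP/1.1 200 OK
Content-Type: text/html
Cache-Control: public, max-age=86400
ETag: "abc123"
```

Here `max-age=86400` lets the browser reuse its stored copy for a day, and the ETag lets it cheaply revalidate the copy afterwards instead of re-downloading the whole page.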