QUOTE: “You always need textual content on-page, regardless of what other kinds of content you might have. If you’re a video-hosting site, you still need things like titles, headings, text, links, etc. The same goes for audio-hosting sites. Make it easy for search engines to understand your content & how it’s relevant to users, and they’ll be able to send you relevant traffic. If you make it hard for search engines to figure out what your pages are about, it would be normal for them to struggle to figure out how your site is relevant for users.” John Mueller, Google 2019
Your site’s URL structure can be important both from a tracking perspective (you can more easily segment data in reports using a segmented, logical URL structure), and a shareability standpoint (shorter, descriptive URLs are easier to copy and paste and tend to get mistakenly cut off less frequently). Again: don’t work to cram in as many keywords as possible; create a short, descriptive URL.
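For illustration, here is the kind of contrast I mean (the URLs below are hypothetical examples, not recommendations for any particular site):

```text
Good:  https://example.com/blog/title-tag-length
Poor:  https://example.com/blog/2020/03/15/seo-title-tags-best-title-tags-seo-optimised-titles-google
```

The shorter URL is easier to read in reports, easier to copy and paste, and still tells both users and search engines what the page is about.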
I’ve got by with the view that external links to other sites should probably sit on single pages deeper in your site architecture, with those pages receiving all your Google Juice once it’s been “soaked up” by the higher pages in your site structure (the home page, your category pages). This tactic is old school, but I still follow it. That said, I don’t think you need to worry about it too much in 2020.
Your website can show the same content on various URLs, which might confuse Google: this is called a duplicate content issue. Yoast SEO solves this by letting you indicate one URL as the original one – what techies like to call a canonical link. That way, Yoast SEO makes sure that your content is always found under the URL you want it to be found under. Simple.
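To make the idea concrete, a canonical link is simply one line in the page’s HTML head pointing at the preferred URL. A minimal sketch (the URL is a hypothetical example; Yoast SEO outputs the equivalent tag for you automatically):

```html
<!-- In the <head> of every duplicate or variant URL, point at the preferred version -->
<link rel="canonical" href="https://example.com/original-article/" />
```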
Search engines use complex mathematical algorithms to interpret which websites a user seeks. Programs sometimes called spiders examine which sites link to which other sites. Websites that receive more inbound links, or stronger links, are presumed to be more important and more likely to be what the user is searching for. Imagine a small set of websites: if website B receives numerous inbound links, it ranks more highly in a web search. Link value also “carries through”, so website C, even though it has only one inbound link, benefits because that link comes from a highly popular site (B), while site E does not.
It’s important to note that Google is responsible for the majority of search engine traffic in the world. This may vary from one industry to another, but Google is likely the dominant player in the search results your business or website wants to show up in. That said, the best practices outlined in this guide will help you position your site and its content to rank in other search engines as well.
Errors in technical SEO are often not obvious, which makes them among the most common. Mistakes in robots.txt and 404 pages, pagination and canonical URLs, hreflang tags and 301 redirects, http vs https and www vs non-www versions: each of these can seriously undermine all your efforts to promote the site. One thorough technical SEO audit is usually enough to find and fix the main problems in this area.
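As one example of how easy these are to get wrong, hreflang annotations need to be self-referencing and reciprocal on every language version of a page. A minimal sketch for a page with an English and a German version (the URLs are hypothetical):

```html
<!-- Each language version lists itself and every alternate; each alternate links back -->
<link rel="alternate" hreflang="en" href="https://example.com/en/page/" />
<link rel="alternate" hreflang="de" href="https://example.com/de/seite/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/en/page/" />
```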
Experience will teach you to recognise when a page is high-quality and yet receives no traffic. If the page is thin, but is not manipulative, is indeed ‘unique’ and delivers on a purpose with little obvious detectable reason to mark it down, then you can say it is a high-quality page – just one with very little search demand for it. Ignored content is not the same as ‘toxic’ content.

QUOTE: “Search engine optimization (SEO) is often about making small modifications to parts of your website. When viewed individually, these changes might seem like incremental improvements, but when combined with other optimizations, they could have a noticeable impact on your site’s user experience and performance in organic search results.” Google Starter Guide, 2020
QUOTE: “Keep in mind that there are high E-A-T pages and websites of all types, even gossip websites, fashion websites, humor websites, forum and Q&A pages, etc. In fact, some types of information are found almost exclusively on forums and discussions, where a community of experts can provide valuable perspectives on specific topics. ● High E-A-T medical advice should be written or produced by people or organizations with appropriate medical expertise or accreditation. High E-A-T medical advice or information should be written or produced in a professional style and should be edited, reviewed, and updated on a regular basis. ● High E-A-T news articles should be produced with journalistic professionalism—they should contain factually accurate content presented in a way that helps users achieve a better understanding of events. High E-A-T news sources typically have published established editorial policies and robust review processes (example 1, example 2). ● High E-A-T information pages on scientific topics should be produced by people or organizations with appropriate scientific expertise and represent well-established scientific consensus on issues where such consensus exists. ● High E-A-T financial advice, legal advice, tax advice, etc., should come from trustworthy sources and be maintained and updated regularly. ● High E-A-T advice pages on topics such as home remodeling (which can cost thousands of dollars and impact your living situation) or advice on parenting issues (which can impact the future happiness of a family) should also come from “expert” or experienced sources that users can trust. ● High E-A-T pages on hobbies, such as photography or learning to play a guitar, also require expertise. Some topics require less formal expertise. Many people write extremely detailed, helpful reviews of products or restaurants. Many people share tips and life experiences on forums, blogs, etc. These ordinary people may be considered experts in topics where they have life experience. If it seems as if the person creating the content has the type and amount of life experience to make him or her an “expert” on the topic, we will value this “everyday expertise” and not penalize the person/webpage/website for not having “formal” education or training in the field.” Google Search Quality Evaluator Guidelines 2019

Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[10][dubious – discuss] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[11] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as Altavista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[12]


Some search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences, webchats, and seminars. Major search engines provide information and guidelines to help with website optimization.[18][19] Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website and also provides data on Google traffic to the website.[20] Bing Webmaster Tools provides a way for webmasters to submit a sitemap and web feeds, allows users to determine the "crawl rate", and track the web pages' index status.
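The sitemaps that both Google Search Console and Bing Webmaster Tools accept follow the standard sitemap protocol. A minimal sketch of a valid XML sitemap (the URLs and dates are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2020-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/services/</loc>
    <lastmod>2020-02-03</lastmod>
  </url>
</urlset>
```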
The above information does not need to feature on every page; rather, it should sit on a clearly accessible page. However – with Google Quality Raters rating web pages on quality based on Expertise, Authority and Trust (see my recent making high-quality websites post) – ANY signal you can send to an algorithm or human reviewer’s eyes that you are a legitimate business is probably a sensible move at this time (if you have nothing to hide, of course).
Internal linking is extremely important because it helps search engines to understand your site structure. But adding links by hand is slow, tedious and downright annoying. With the internal linking feature in Yoast SEO Premium, that’s a thing of the past. This feature analyzes your text and suggests links to related posts – as you write. Taking all the hassle out of internal linking. Yeah, we’re pretty happy about it too.
You can probably include up to 12 words that will be counted as part of a page title, and consider using your important keywords in the first 8 words. The rest of your page title historically counted as normal text on the page, but recent tests (2019) would indicate this is no longer the case. Words beyond the 12th word may be completely ignored by Google.
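As a hypothetical illustration of keeping the important keywords in the first few words and the whole title within roughly twelve (the brand and wording below are invented for the example):

```html
<!-- Important keywords early; the title stays concise so nothing trails past the cut-off -->
<title>SEO Title Tag Length: Best Practice for 2020 | Example Brand</title>
```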

QUOTE: “For instance, we would see a lot of low-quality posts in a forum. We would index those low-quality pages. And we’d also see a lot of really high-quality posts, with good discussions, good information on those pages. And our algorithms would be kind of stuck in a situation with, well, there’s a lot of low-quality content here, but there’s also a lot of high-quality content here. So how should we evaluate the site overall? And usually, what happens is, our algorithms kind of find some middle ground……. what you’d need to do to, kind of, move a step forward, is really try to find a way to analyze the quality of your content, and to make sure that the high-quality content is indexed and that the lower-quality content doesn’t get indexed by default.” John Mueller, Google 2014
When optimising a title, you are looking to rank for as many terms as possible, without keyword stuffing your title. Often, the best bet is to optimise for a particular phrase (or phrases) – and take a more long-tail approach. Note that too many page titles and not enough actual page text per page could lead to doorway page type situations. A highly relevant unique page title is no longer enough to float a page with thin content. Google cares WAY too much about the page text content these days to let a good title hold up a thin page on most sites.
Creating high quality content takes a significant amount of at least one of the following: time, effort, expertise, and talent/skill. Content should be factually accurate, clearly written, and comprehensive. So, for example, if you describe your page as a recipe, provide a complete recipe that is easy to follow, rather than just a set of ingredients or a basic description of the dish.
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[47]
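To make the crawling side of this concrete, here is a minimal robots.txt sketch (the paths are hypothetical). Remember that robots.txt controls crawling, while the robots meta tag quoted above controls indexing:

```text
# robots.txt in the site root – keep internal search results and the cart out of the crawl
User-agent: *
Disallow: /search/
Disallow: /cart/
```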
Every page on your website should have a title, a subtitle and so on. When search engines scan your website, they’ll understand your content better if you explain the text hierarchy to them. The most relevant part is the title of your page and you should define it as H1 (in the Text Editor). The H1 should describe the page’s content and you shouldn’t have more than one H1 per page. Choose carefully and don’t forget to include your keywords. Following your H1 come H2, H3 and so on. The clearer your text structure is, the easier search engines will digest your site’s content.
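A minimal sketch of that hierarchy in HTML (the heading text is hypothetical; the indentation is only for readability):

```html
<h1>Beginner's Guide to SEO</h1>        <!-- one H1, describing the whole page -->
  <h2>How Search Engines Work</h2>      <!-- main sections of the page -->
    <h3>Crawling and Indexing</h3>      <!-- sub-points within a section -->
  <h2>On-Page Optimisation</h2>
```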
Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.
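A common cause of this problem is a broad robots.txt rule that hides whole asset directories. A hedged sketch of what to check for, assuming a hypothetical /assets/ directory (Googlebot honours the more specific Allow rule over a shorter Disallow):

```text
# Avoid rules like these, which hide the files Google needs to render the page:
#   Disallow: /css/
#   Disallow: /js/

# If a broad Disallow is unavoidable, explicitly re-allow the render-critical resources:
User-agent: Googlebot
Allow: /assets/css/
Allow: /assets/js/
Disallow: /assets/
```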
Linking to a page with actual key-phrases in the link helps a great deal in all search engines when you want to feature for specific key terms. For example, “SEO Scotland” as opposed to https://www.hobo-web.co.uk or “click here”. Saying that – in 2020, Google is punishing manipulative anchor text very aggressively, so be sensible – and stick to brand mentions and plain URL links that build authority with less risk. I rarely ever optimise for grammatically incorrect terms these days (especially with links).
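For instance, the three link styles mentioned look like this in HTML (the URL is the author’s own site; the anchor text choices are illustrative):

```html
<!-- Descriptive anchor text: helps the target feature for that term – use sparingly -->
<a href="https://www.hobo-web.co.uk/">SEO Scotland</a>

<!-- Lower-risk alternatives: a brand mention or a plain URL link -->
<a href="https://www.hobo-web.co.uk/">Hobo Web</a>
<a href="https://www.hobo-web.co.uk/">https://www.hobo-web.co.uk/</a>
```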
Important: The Lowest rating is appropriate if all or almost all of the MC on the page is copied with little or no time, effort, expertise, manual curation, or added value for users. Such pages should be rated Lowest, even if the page assigns credit for the content to another source.
People are searching for any manner of things directly related to your business. Beyond that, your prospects are also searching for all kinds of things that are only loosely related to your business. These represent even more opportunities to connect with those folks and help answer their questions, solve their problems, and become a trusted resource for them.
For me, when SEO is more important than branding, the company name goes at the end of the tag, and I use a variety of dividers to separate as no one way performs best. If you have a recognisable brand – then there is an argument for putting this at the front of titles – although Google often will change your title dynamically – sometimes putting your brand at the front of your snippet link title itself. I often leave out branding. There is no one size fits all approach as the strategy will depend on the type of page you are working with.
The ranking of your website is partly decided by on-page factors. On-page SEO factors are all those things you can influence from within your actual website. These factors include technical aspects (e.g. the quality of your code and site speed) and content-related aspects, like the structure of your website or the quality of the copy on your website. These are all crucial on-page SEO factors.