When would this be useful? If your site has a blog with public commenting turned on, links within those comments could pass your reputation to pages that you may not be comfortable vouching for. Blog comment areas on pages are highly susceptible to comment spam. Nofollowing these user-added links ensures that you're not giving your page's hard-earned reputation to a spammy site.
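To make that concrete, here is a minimal Python sketch (my own illustration, not any particular CMS's implementation) that adds rel="nofollow" to links found in user-submitted comment HTML. The function name and regex are invented for the example:

```python
import re

def nofollow_comment_links(comment_html: str) -> str:
    """Add rel="nofollow" to every <a> tag in user-submitted HTML."""
    def add_rel(match):
        tag = match.group(0)
        if "rel=" in tag:
            return tag  # leave tags that already declare a rel attribute
        return tag[:-1] + ' rel="nofollow">'
    return re.sub(r"<a\b[^>]*>", add_rel, comment_html)

print(nofollow_comment_links(
    'Nice post! <a href="http://spam.example">cheap pills</a>'))
# Nice post! <a href="http://spam.example" rel="nofollow">cheap pills</a>
```

A real comment system would run links through a proper HTML sanitiser rather than a regex, but the principle is the same: links you did not write get nofollowed by default.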
QUOTE: “Each piece of duplication in your on-page SEO strategy is *at best* wasted opportunity. Worse yet, if you are aggressive with aligning your on page heading, your page title, and your internal + external link anchor text the page becomes more likely to get filtered out of the search results (which is quite common in some aggressive spaces).” Aaron Wall, 2009
QUOTE: “For instance, we would see a lot of low-quality posts in a forum. We would index those low-quality pages. And we’d also see a lot of really high-quality posts, with good discussions, good information on those pages. And our algorithms would be kind of stuck in a situation with, well, there’s a lot of low-quality content here, but there’s also a lot of high-quality content here. So how should we evaluate the site overall? And usually, what happens is, our algorithms kind of find some middle ground… what you’d need to do to, kind of, move a step forward, is really try to find a way to analyze the quality of your content, and to make sure that the high-quality content is indexed and that the lower-quality content doesn’t get indexed by default.” John Mueller, Google 2014
QUOTE: “Expertise, Authoritativeness, Trustworthiness: This is an important quality characteristic. … Remember that the first step of PQ rating is to understand the true purpose of the page. Websites or pages without some sort of beneficial purpose, including pages that are created with no attempt to help users, or pages that potentially spread hate, cause harm, or misinform or deceive users, should receive the Lowest rating. For all other pages that have a beneficial purpose, the amount of expertise, authoritativeness, and trustworthiness (E-A-T) is very important.” Google Search Quality Evaluator Guidelines 2019

The leading search engines, such as Google, Bing and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search-engine-indexed pages do not need to be submitted because they are found automatically. The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review.[40] Google offers Google Search Console, through which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links,[41] in addition to its URL submission console.[42] Yahoo! formerly operated a paid submission service that guaranteed crawling for a cost per click;[43] however, this practice was discontinued in 2009.
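For illustration, a sitemap is just an XML file listing your URLs. A minimal sketch of generating one in Python (the URLs are placeholders) might look like this:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls, path="sitemap.xml"):
    """Write a minimal XML Sitemap listing the given URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

build_sitemap(["https://www.example.com/", "https://www.example.com/about/"])
```

Real sitemaps can also carry optional elements like last-modified dates, but a plain list of URLs like this is valid and enough for crawlers to discover pages that links alone might miss.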
Yes, you need to build links to your site to acquire more PageRank, or Google ‘juice’ – or what we now call domain authority or trust. Google is a link-based search engine – it does not quite understand ‘good’ or ‘quality’ content – but it does understand ‘popular’ content. It can also usually identify poor, or THIN CONTENT – and it penalises your site for that – or – at least – it takes away the traffic you once had with an algorithm change. Google doesn’t like calling the actions it takes a ‘penalty’ – it doesn’t look good. They blame your ranking drops on their engineers getting better at identifying quality content or links, or the inverse – low-quality content and unnatural links. If they do take action against your site for paid links – they call this a ‘Manual Action’ and you will get notified about it in Google Search Console if you sign up.
Google expects pages to “be edited, reviewed, and updated on a regular basis”, especially if they cover important issues like medical information. It states that not all pages are held to such standards, but one can expect that Google wants information updated within a reasonable timescale. How reasonable that is depends on the TOPIC and the PURPOSE of the web page RELATIVE to competing pages on the web.

A poor 404 page, and user interaction with it, can only lead to a ‘poor user experience’ signal at Google’s end, for a number of reasons. I will highlight a poor 404 page in my audits and actually programmatically look for signs of this issue when I scan a site. I don’t know if Google looks at your site that way to rate it, e.g. algorithmically determines whether you have a good 404 page – or if it is a UX factor, something to be taken into consideration further down the line – or purely to get you thinking about 404 pages (in general) to help prevent Google wasting resources indexing crud pages and presenting poor results to searchers. I think, rather, that any rating would be a second-order score including data from user activity on the SERPs – stuff we as SEOs can’t see.
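The simplest programmatic sign of trouble is a ‘soft 404’ – an error page served with a 200 status code. A rough Python sketch of the kind of check I mean (the probe URL and hostname are invented):

```python
import urllib.request
import urllib.error

def returns_real_404(site: str) -> bool:
    """Request a URL that should not exist and check the status code.

    A 200 response here suggests a 'soft 404': the server shows an
    error page to users but tells crawlers the page is fine.
    """
    probe = site.rstrip("/") + "/this-page-should-not-exist-12345"
    try:
        urllib.request.urlopen(probe)
    except urllib.error.HTTPError as err:
        return err.code == 404
    return False  # 200 OK for a nonsense URL: likely a soft 404

print(returns_real_404("https://www.example.com"))
```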
Every page on your website should have a title, a subtitle and so on. When search engines scan your website, they’ll understand your content better if you explain the text hierarchy to them. The most relevant part is the title of your page, and you should define it as an H1 (in the text editor). The H1 should be descriptive of the page’s content, and you shouldn’t have more than one H1 per page. Choose carefully and don’t forget to include your keywords. Following your H1 are H2, H3 and so on. The clearer your text structure is, the more easily search engines will digest your site’s content.
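As an illustration of how such a structural check can be automated (my own sketch, not tied to any particular editor), you can count the H1 tags on a page and flag anything other than exactly one:

```python
from html.parser import HTMLParser

class HeadingCounter(HTMLParser):
    """Collect heading tags (h1-h6) in document order."""
    def __init__(self):
        super().__init__()
        self.headings = []

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self.headings.append(tag)

parser = HeadingCounter()
parser.feed("<h1>SEO Guide</h1><h2>Titles</h2><h1>Oops</h1>")
h1_count = parser.headings.count("h1")
if h1_count != 1:
    print(f"Warning: found {h1_count} H1 tags; pages should have exactly one.")
```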
Another reason is that if you're using an image as a link, the alt text for that image will be treated similarly to the anchor text of a text link. However, we don't recommend using too many images for links in your site's navigation when text links could serve the same purpose. Lastly, optimizing your image filenames and alt text makes it easier for image search projects like Google Image Search to better understand your images.
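In the same spirit, a quick regex-based sketch that flags <img> tags with no alt attribute (illustrative only; a real audit tool would parse the HTML properly):

```python
import re

def images_missing_alt(html: str) -> list:
    """Return the <img> tags in `html` that have no alt attribute."""
    return [tag for tag in re.findall(r"<img\b[^>]*>", html)
            if not re.search(r"\balt\s*=", tag)]

sample = '<img src="logo.png" alt="Acme logo"><img src="banner.png">'
print(images_missing_alt(sample))  # ['<img src="banner.png">']
```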
QUOTE: “high quality content is something I’d focus on. I see lots and lots of SEO blogs talking about user experience, which I think is a great thing to focus on as well. Because that essentially kind of focuses on what we are trying to look at as well. We want to rank content that is useful for (Google users) and if your content is really useful for them, then we want to rank it.” John Mueller, Google 2015
The SEO starter guide describes much of what your SEO will do for you. Although you don't need to know this guide well yourself if you're hiring a professional to do the work for you, it is useful to be familiar with these techniques, so that you can be aware if an SEO wants to use a technique that is not recommended or, worse, strongly discouraged.
Make it as easy as possible for users to go from general content to the more specific content they want on your site. Add navigation pages when it makes sense and effectively work these into your internal link structure. Make sure all of the pages on your site are reachable through links, and that they don't require an internal "search" functionality to be found. Link to related pages, where appropriate, to allow users to discover similar content.
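One way to verify that every page really is reachable through links is to crawl outwards from the homepage, the way a search engine would. A rough sketch, with a small dictionary standing in for a site’s internal link graph:

```python
from collections import deque

def reachable_pages(link_graph: dict, start: str) -> set:
    """Breadth-first walk of internal links, the way a crawler would."""
    seen = {start}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for linked in link_graph.get(page, []):
            if linked not in seen:
                seen.add(linked)
                queue.append(linked)
    return seen

site = {
    "/": ["/products/", "/blog/"],
    "/products/": ["/products/widget/"],
    "/blog/": [],
    "/orphan/": [],  # no page links here
}
orphans = set(site) - reachable_pages(site, "/")
print(orphans)  # {'/orphan/'}
```

Any page that only shows up via internal search – an ‘orphan’ in the sketch above – is a page crawlers may never find.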
You can confer some of your site's reputation to another site when your site links to it. Sometimes users can take advantage of this by adding links to their own site in your comment sections or message boards. Or sometimes you might mention a site in a negative way and don't want to confer any of your reputation upon it. For example, imagine that you're writing a blog post on the topic of comment spamming and you want to call out a site that recently comment spammed your blog. You want to warn others of the site, so you include the link to it in your content; however, you certainly don't want to give the site some of your reputation from your link. This would be a good time to use nofollow.

In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[22] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web, and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random web surfer.
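The ‘random surfer’ idea translates directly into a short computation. A simplified sketch of PageRank by power iteration (using the commonly cited 0.85 damping factor; real implementations handle dangling pages and scale very differently):

```python
def pagerank(links: dict, damping: float = 0.85, iterations: int = 50) -> dict:
    """Simplified PageRank: the stationary probability that a surfer
    who mostly follows links (and occasionally jumps to a random page)
    lands on each page."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue  # dangling pages ignored in this sketch
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

web = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
print(pagerank(web))  # C accumulates the most rank: two pages link to it
```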
I think the anchor text in internal navigation links is still valuable – but keep it natural. Google needs links to find and help categorise your pages. Don’t underestimate the value of a clever keyword-rich internal link architecture and be sure to understand, for instance, how many words Google counts in a link, but don’t overdo it. Too many links on a page could be seen as a poor user experience. Avoid lots of hidden links in your template navigation.

Linking to a page with actual key-phrases in the link helps a great deal in all search engines when you want to feature for specific key terms. For example, “SEO Scotland” as opposed to https://www.hobo-web.co.uk or “click here“. That said – in 2020, Google is punishing manipulative anchor text very aggressively, so be sensible – and stick to brand mentions and plain URL links that build authority with less risk. I rarely ever optimise for grammatically incorrect terms these days (especially with links).
Think that, one day, your website will have to pass a manual review by ‘Google’ – the better rankings you get, or the more traffic you get, the more likely you are to be reviewed. Know that Google, at least according to leaked documents, classes even useful sites as spammy. If you want a site to rank high in Google – it had better ‘do’ something other than exist only to link to another site for a paid commission. Know that to succeed, your website needs to be USEFUL to a visitor that Google will send you – and a useful website is not just a website with a sole commercial intent of sending a visitor from Google to another site – or a ‘thin affiliate’ as Google CLASSIFIES it.
QUOTE: “Cleaning up these kinds of link issue can take considerable time to be reflected by our algorithms (we don’t have a specific time in mind, but the mentioned 6-12 months is probably on the safe side). In general, you won’t see a jump up in rankings afterwards because our algorithms attempt to ignore the links already, but it makes it easier for us to trust the site later on.” John Mueller, Google, 2018
Advertising with Google won't have any effect on your site's presence in our search results. Google never accepts money to include or rank sites in our search results, and it costs nothing to appear in our organic search results. Free resources such as Search Console, the official Webmaster Central blog, and our discussion forum can provide you with a great deal of information about how to optimize your site for organic search.
We all have to delete pages at some point, but when that old URL gets visitors, they bump into a 404 Not Found error. Aaaargh! To avoid this, you can redirect them to a new page with relevant information. It’s important to do this systematically to keep your website healthy. The Redirect manager allows you to do just that: after deleting a post or page, the plugin will ask you what to do with the old URL. You can also go to the menu ‘Redirects’ to see and update all your redirected pages. And you can even set ‘REGEX redirects’ to indicate that all URLs containing a certain word or expression should redirect to the same page.
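The plugin itself is a WordPress (PHP) product, but the regex-redirect idea is easy to sketch in Python. The patterns and destinations below are invented for illustration; the first matching rule wins:

```python
import re

# Each rule maps a URL pattern to its destination.
REDIRECTS = [
    (re.compile(r"^/old-blog/(.+)$"), r"/blog/\1"),   # moved section
    (re.compile(r".*holiday-sale.*"), "/sales/"),     # 'contains' rule
]

def resolve_redirect(path: str):
    """Return the new location for `path`, or None if no rule matches."""
    for pattern, destination in REDIRECTS:
        if pattern.search(path):
            return pattern.sub(destination, path)
    return None

print(resolve_redirect("/old-blog/my-post/"))       # /blog/my-post/
print(resolve_redirect("/2019-holiday-sale.html"))  # /sales/
print(resolve_redirect("/about/"))                  # None
```

In production the resolved location would be served as a 301 (permanent) redirect so search engines transfer the old URL’s signals to the new one.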
OBSERVATION – You can have the content and the links – but if your site falls short on even a single user satisfaction signal (even if it is picked up by the algorithm, and not a human reviewer) then your rankings for particular terms could collapse – OR – rankings can be held back – IF Google thinks your organisation, with its resources or ‘reputation’, should be delivering a better user experience to users.
SEO is also about making your search engine result relevant to the user's search query so more people click the result when it is shown in search. In this process, snippets of text and meta data are optimized to ensure your snippet of information is appealing in the context of the search query to obtain a high CTR (click through rate) from search results.
QUOTE: “The Knowledge Graph enables you to search for things, people or places that Google knows about—landmarks, celebrities, cities, sports teams, buildings, geographical features, movies, celestial objects, works of art and more—and instantly get information that’s relevant to your query. This is a critical first step towards building the next generation of search, which taps into the collective intelligence of the web and understands the world a bit more like people do.” Amit Singhal, Google 2012
QUOTE: “Ultimately, you just want to have a really great site people love. I know it sounds like a cliché, but almost [all of] what we are looking for is surely what users are looking for. A site with content that users love – let’s say they interact with content in some way – that will help you in ranking in general, not with Panda. Pruning is not a good idea because with Panda, I don’t think it will ever help mainly because you are very likely to get Panda penalized – Pandalized – because of low-quality content…content that’s actually ranking shouldn’t perhaps rank that well. Let’s say you figure out if you put 10,000 times the word “pony” on your page, you rank better for all queries. What Panda does is disregard the advantage you figure out, so you fall back where you started. I don’t think you are removing content from the site with potential to rank – you have the potential to go further down if you remove that content. I would spend resources on improving content, or, if you don’t have the means to save that content, just leave it there. Ultimately people want good sites. They don’t want empty pages and crappy content. Ultimately that’s your goal – it’s created for your users.” Gary Illyes, Google 2017
Write a description that would both inform and interest users if they saw your description meta tag as a snippet in a search result. While there's no minimum or maximum length for the text in a description meta tag, we recommend making sure that it's long enough to be fully shown in Search (note that users may see different-sized snippets depending on how and where they search), and contains all the relevant information users would need to determine whether the page will be useful and relevant to them.
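As a rule-of-thumb check (the ~160-character ceiling below is a common industry guideline, not a Google-published limit), a sketch of auditing a page’s description meta tag might look like this:

```python
import re

def check_meta_description(html: str, max_chars: int = 160):
    """Flag a missing, empty, or likely-truncated meta description."""
    # Assumes name= comes before content=, which is fine for a sketch.
    match = re.search(
        r'<meta\s+name=["\']description["\']\s+content=["\'](.*?)["\']',
        html, re.IGNORECASE | re.DOTALL)
    if not match or not match.group(1).strip():
        return "Missing or empty meta description."
    text = match.group(1).strip()
    if len(text) > max_chars:
        return f"Description is {len(text)} chars; it may be cut off in snippets."
    return "Looks reasonable."

page = '<meta name="description" content="Plain-English SEO tips for beginners.">'
print(check_meta_description(page))  # Looks reasonable.
```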

QUOTE: “One of the difficulties of running a great website that focuses on UGC is keeping the overall quality upright. Without some level of policing and evaluating the content, most sites are overrun by spam and low-quality content. This is less of a technical issue than a general quality one, and in my opinion, not something that’s limited to Google’s algorithms. If you want to create a fantastic experience for everyone who visits, if you focus on content created by users, then you generally need to provide some guidance towards what you consider to be important (and sometimes, strict control when it comes to those who abuse your house rules).”
Your website is the “hub” of your online brand – so it’s important to have regular checkups to ensure everything is in order. It’s also important to note that your website is a living digital property; it’s typically not stagnant for long periods of time. In any given year, content is added and/or removed from your site. It is for this reason that audits should occur on a regular basis. We recommend that websites be audited at a minimum of once per year. That allows your teams to fix critical issues as they arise.