QUOTE: “One of the difficulties of running a great website that focuses on UGC is keeping the overall quality upright. Without some level of policing and evaluating the content, most sites are overrun by spam and low-quality content. This is less of a technical issue than a general quality one, and in my opinion, not something that’s limited to Google’s algorithms. If you want to create a fantastic experience for everyone who visits, if you focus on content created by users, then you generally need to provide some guidance towards what you consider to be important (and sometimes, strict control when it comes to those who abuse your house rules).”
Use the Lowest rating for websites that have extremely negative or malicious reputations, and for violations of the Google Webmaster Quality Guidelines. Finally, Lowest+ may be used both for pages with many low-quality characteristics and for pages whose lack of a single Page Quality characteristic makes you question the true purpose of the page. Important: negative reputation is sufficient reason to give a page a Low quality rating, and evidence of truly malicious or fraudulent behaviour warrants the Lowest rating.

Love how you just dive into the details for this Site Audit guide. Excellent stuff! Yours is much, much easier to understand than other guides online and I feel like I could integrate this into how I audit my websites and actually cut down the time it takes to make my reports. I only need to do more research on how to remove “zombie pages”. If you could do a step-by-step guide to it, that would be awesome! Thanks!
For instance, in a recent test (2019), if a page title was longer than 12 words, all the keywords beyond the twelfth word evaporated from the page copy. This is a change from the way Google used to work, when the extra words were counted as part of the page copy, not just part of the title. So, if you have a 15-word title, the last 3 words will not count towards ranking, if that test result were to be replicated.

QUOTE: “The preferred domain is the one that you would like used to index your site’s pages (sometimes this is referred to as the canonical domain). Links may point to your site using both the www and non-www versions of the URL (for instance, http://www.example.com and http://example.com). The preferred domain is the version that you want used for your site in the search results.” Google, 2018

QUOTE: “An infinite number of niches are waiting for someone to claim them. I’d ask yourself where you want to be, and see if you can find a path from a tiny specific niche to a slightly bigger niche and so on, all the way to your desired goal. Sometimes it’s easier to take a series of smaller steps instead of jumping to your final goal in one leap.” Matt Cutts, Google 2006
QUOTE: “high quality content is something I’d focus on. I see lots and lots of SEO blogs talking about user experience, which I think is a great thing to focus on as well. Because that essentially kind of focuses on what we are trying to look at as well. We want to rank content that is useful for (Google users) and if your content is really useful for them, then we want to rank it.” John Mueller, Google 2015
SEO is also about making your search engine result relevant to the user's search query so more people click the result when it is shown in search. In this process, snippets of text and meta data are optimized to ensure your snippet of information is appealing in the context of the search query to obtain a high CTR (click through rate) from search results.
You can probably include up to 12 words that will be counted as part of a page title, and consider using your important keywords in the first 8 words. The rest of your page title historically counted as normal text on the page, but recent tests (2019) would indicate this is no longer the case. Words beyond the 12th word may be completely ignored by Google.
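The title-length observations above can be sketched as a small audit helper. Note this is purely illustrative: the 12-word cut-off and 8-word priority window are informal test results from the text, not an official Google limit, and the function name is invented.

```python
# Hypothetical helper based on the (informal, 2019) observation that words
# beyond the twelfth in a page title may be ignored by Google.
WORD_LIMIT = 12          # observed cut-off from the test, not an official limit
PRIORITY_WINDOW = 8      # the text suggests key terms go in the first 8 words

def audit_title(title: str) -> dict:
    """Split a title into counted, priority, and potentially-ignored words."""
    words = title.split()
    return {
        "counted": words[:WORD_LIMIT],
        "priority_words": words[:PRIORITY_WINDOW],
        "possibly_ignored": words[WORD_LIMIT:],
    }

report = audit_title(
    "Best Budget Hiking Boots for Wet Weather Trails in the UK Reviewed and Compared 2019"
)
print(report["possibly_ignored"])  # the 3 trailing words at risk of being ignored
```

Running a check like this across a crawl export would quickly surface titles whose important keywords sit past the window.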
The above information does not need to feature on every page, but rather on a clearly accessible page. However – with Google Quality Raters rating web pages on quality based on Expertise, Authority and Trust (see my recent making high-quality websites post) – ANY signal you can send to an algorithm or human reviewer’s eyes that you are a legitimate business is probably a sensible move at this time (if you have nothing to hide, of course).
This broken-link checker makes it easy for a publisher or editor to make corrections before a page is live. Think about a site like Wikipedia, for example. The Wikipedia page for the term "marketing" contains a whopping 711 links. Not only was Check My Links able to detect this number in a matter of seconds, but it also found (and highlighted) seven broken links.
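What a tool like Check My Links does can be sketched in a few lines of Python: extract every absolute link from a page, then request each URL and flag anything that errors or cannot be reached. The function names are illustrative, and a real checker would add rate limiting, retries, and relative-URL resolution.

```python
# Minimal sketch of a broken-link checker: parse anchors, then HEAD each URL.
from html.parser import HTMLParser
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

class LinkExtractor(HTMLParser):
    """Collects absolute href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href and href.startswith("http"):
                self.links.append(href)

def extract_links(html: str) -> list:
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

def is_broken(url: str, timeout: float = 5.0) -> bool:
    """True if the URL returns a 4xx/5xx status or cannot be reached."""
    try:
        req = Request(url, method="HEAD", headers={"User-Agent": "link-check"})
        return urlopen(req, timeout=timeout).status >= 400
    except (HTTPError, URLError, OSError):
        return True
```

On a page the size of Wikipedia's "marketing" article, the slow part is the network round trips, which is why browser extensions fire the requests concurrently.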
To prevent some users from linking to one version of a URL and others linking to a different version (this could split the reputation of that content between the URLs), focus on using and referring to one URL in the structure and internal linking of your pages. If you do find that people are accessing the same content through multiple URLs, setting up a 301 redirect from non-preferred URLs to the dominant URL is a good solution. You may also use the rel="canonical" link element if you cannot redirect.
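The normalisation a 301-redirect rule enforces can be illustrated with a small sketch: collapse the common variants (www vs non-www, http vs https) onto one preferred form. The preferred host and function name here are assumptions for the example, not anything from the source.

```python
# Illustrative sketch: map URL variants onto one preferred canonical form,
# the mapping a server-side 301 redirect rule would enforce.
from urllib.parse import urlsplit, urlunsplit

PREFERRED_HOST = "www.example.com"   # assumed preferred (canonical) domain
PREFERRED_SCHEME = "https"

def canonical_url(url: str) -> str:
    parts = urlsplit(url)
    host = parts.netloc.lower()
    # collapse the non-www variant onto the preferred www host
    if "www." + host == PREFERRED_HOST:
        host = PREFERRED_HOST
    path = parts.path or "/"
    return urlunsplit((PREFERRED_SCHEME, host, path, parts.query, ""))

# All of these collapse to https://www.example.com/page:
for variant in ("http://example.com/page",
                "https://www.example.com/page",
                "http://WWW.example.com/page"):
    print(canonical_url(variant))
```

Internal links, sitemaps, and rel="canonical" tags should all point at the same form the redirect produces, so reputation consolidates on one URL.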
QUOTE: “Ultimately, you just want to have a really great site people love. I know it sounds like a cliché, but almost [all of] what we are looking for is surely what users are looking for. A site with content that users love – let’s say they interact with content in some way – that will help you in ranking in general, not with Panda. Pruning is not a good idea because with Panda, I don’t think it will ever help mainly because you are very likely to get Panda penalized – Pandalized – because of low-quality content…content that’s actually ranking shouldn’t perhaps rank that well. Let’s say you figure out if you put 10,000 times the word “pony” on your page, you rank better for all queries. What Panda does is disregard the advantage you figure out, so you fall back where you started. I don’t think you are removing content from the site with potential to rank – you have the potential to go further down if you remove that content. I would spend resources on improving content, or, if you don’t have the means to save that content, just leave it there. Ultimately people want good sites. They don’t want empty pages and crappy content. Ultimately that’s your goal – it’s created for your users.” Gary Illyes, Google 2017
NOTE: in 2020, the HTML title element you choose for your page may not be what Google chooses to include in your SERP snippet. The search snippet title and description are very much QUERY & DEVICE dependent these days. Google often chooses what it thinks is the most relevant title for your search snippet, and it can use information from your page, or in links to that page, to create a very different SERP snippet title.
QUOTE: “Over time, we’ve seen sites try to maximize their “search footprint” without adding clear, unique value. These doorway campaigns manifest themselves as pages on a site, as a number of domains, or a combination thereof. To improve the quality of search results for our users, we’ll soon launch a ranking adjustment to better address these types of pages. Sites with large and well-established doorway campaigns might see a broad impact from this change.” Google, 2015

By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation. In June 2007, The New York Times' Saul Hansell stated Google ranks sites using more than 200 different signals.[26] The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages. Some SEO practitioners have studied different approaches to search engine optimization, and have shared their personal opinions.[27] Patents related to search engines can provide information to better understand search engines.[28] In 2005, Google began personalizing search results for each user. Depending on their history of previous searches, Google crafted results for logged-in users.[29]
QUOTE: “Keyword Stuffed” Main Content Pages may be created to lure search engines and users by repeating keywords over and over again, sometimes in unnatural and unhelpful ways. Such pages are created using words likely to be contained in queries issued by users. Keyword stuffing can range from mildly annoying to users, to complete gibberish. Pages created with the intent of luring search engines and users, rather than providing meaningful MC to help users, should be rated Lowest.” Google Search Quality Evaluator Guidelines 2017

QUOTE: “7.4.3 Automatically ­Generated Main Content Entire websites may be created by designing a basic template from which hundreds or thousands of pages are created, sometimes using content from freely available sources (such as an RSS feed or API). These pages are created with no or very little time, effort, or expertise, and also have no editing or manual curation. Pages and websites made up of auto­generated content with no editing or manual curation, and no original content or value added for users, should be rated Lowest.” Google Search Quality Evaluator Guidelines, 2017

When optimising a title, you are looking to rank for as many terms as possible, without keyword stuffing your title. Often, the best bet is to optimise for a particular phrase (or phrases) – and take a more long-tail approach. Note that too many page titles and not enough actual page text per page could lead to doorway page type situations. A highly relevant unique page title is no longer enough to float a page with thin content. Google cares WAY too much about the page text content these days to let a good title hold up a thin page on most sites.
QUOTE: “common with forums is low-quality user-generated content. If you have ways of recognizing this kind of content, and blocking it from indexing, it can make it much easier for algorithms to review the overall quality of your website. The same methods can be used to block forum spam from being indexed for your forum. Depending on the forum, there might be different ways of recognizing that automatically, but it’s generally worth finding automated ways to help you keep things clean & high-quality, especially when a site consists of mostly user-generated content.” John Mueller, Google
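The idea in the quote above can be sketched as a tiny moderation helper: flag thin or link-heavy user posts and emit a robots "noindex" tag for them. The thresholds and function names here are invented placeholders for illustration, not anything Google or the quote prescribes.

```python
# Hypothetical heuristics for keeping low-quality UGC out of the index.
def should_noindex(post_text: str, min_words: int = 25) -> bool:
    """Flag posts that are very short or stuffed with links."""
    words = post_text.split()
    too_short = len(words) < min_words
    link_heavy = post_text.count("http") > 3
    return too_short or link_heavy

def robots_meta(post_text: str) -> str:
    """Return the robots meta tag to render on the post's page."""
    value = "noindex, follow" if should_noindex(post_text) else "index, follow"
    return f'<meta name="robots" content="{value}">'
```

A forum platform would run a check like this at render time, so spammy threads never dilute the site's overall quality signal while legitimate discussion stays indexable.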
Google asks quality raters to investigate your reputation, giving the example [“ibm.com” reviews –site:ibm.com]: “A search on Google for reviews of ‘ibm.com’ which excludes pages on ibm.com.” So I would do that search yourself and judge for yourself what your reputation is. Very low ratings on independent websites could play a factor in where you rank in the future, with Google stating clearly that it considers “very low ratings on the BBB site to be evidence for a negative reputation”. Other sites mentioned for reviewing your business include Yelp and Amazon. Often – using rich snippets containing schema.org information – you can get Google to display user ratings in the actual SERPs. I noted you could get ‘stars in SERPs’ within two days of adding the code (March 2014).
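The schema.org markup behind those ‘stars in SERPs’ can be emitted as JSON-LD. This is a minimal sketch with placeholder values, not real review data, generated here in Python only for illustration.

```python
# Sketch of schema.org AggregateRating markup as JSON-LD (placeholder values).
import json

rating_markup = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Product",        # placeholder product name
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.4",         # placeholder average rating
        "reviewCount": "89",          # placeholder review count
    },
}

# Embed the output in the page inside <script type="application/ld+json"> … </script>
print(json.dumps(rating_markup, indent=2))
```

The ratings marked up must reflect genuine user reviews visible on the page; marking up invented scores is exactly the kind of abuse rich-result guidelines penalise.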
In short, nobody is going to advise you to create a poor UX, on purpose, in light of Google’s algorithms and human quality raters who are showing an obvious interest in this stuff. Google is rating mobile sites on what it classes as frustrating UX – although on certain levels what Google classes as ‘UX’ might be quite far apart from what a UX professional is familiar with, in the same way as Google’s mobile rating tools differ from, for instance, W3C mobile testing tools.

Optimizing a website may involve editing its content, adding content, and modifying HTML and associated coding to both increase its relevance to specific keywords and remove barriers to the indexing activities of search engines like Google, Yahoo, etc.[citation needed] Promoting a site to increase the number of backlinks, or inbound links, is another SEO tactic. By May 2015, mobile search had surpassed desktop search.[3]
