QUOTE: “Some Low-quality pages have adequate MC (main content on the page) present, but it is difficult to use the MC due to disruptive, highly distracting, or misleading Ads/SC. Misleading titles can result in a very poor user experience when users click a link only to find that the page does not match their expectations.” Google Search Quality Evaluator Guidelines 2015
QUOTE: “common with forums is low-quality user-generated content. If you have ways of recognizing this kind of content, and blocking it from indexing, it can make it much easier for algorithms to review the overall quality of your website. The same methods can be used to block forum spam from being indexed for your forum. Depending on the forum, there might be different ways of recognizing that automatically, but it’s generally worth finding automated ways to help you keep things clean & high-quality, especially when a site consists of mostly user-generated content.” John Mueller, Google
Comparing your Google Analytics data side by side with the dates of official algorithm updates is useful in diagnosing a site health issue or traffic drop. In the above example, a new client thought a switch to HTTPS and server downtime had caused the drop, when it was actually the May 6, 2015, Google Quality Algorithm update (originally called Phantom 2 in some circles) that caused the sudden fall in organic traffic – and the problem was probably compounded by unnatural linking practices. (This client did eventually receive a penalty for unnatural links after ignoring our advice to clean the link profile up.)
QUOTE: “Consider where user-generated content might appear on your site or app, and what risks to your site or app’s reputation might occur from malicious user-generated content. Ensure that you mitigate those risks before enabling user-generated content to appear. Set aside some time to regularly review your top pages with user-generated content. Make sure that what you see complies with all our programme policies.” Google AdSense Policies, 2018
QUOTE: “Google Webmaster Tools notice of detected doorway pages on xxxxxxxx – Dear site owner or webmaster of xxxxxxxx, We’ve detected that some of your site’s pages may be using techniques that are outside Google’s Webmaster Guidelines. Specifically, your site may have what we consider to be doorway pages – groups of “cookie cutter” or low-quality pages. Such pages are often of low value to users and are often optimized for single words or phrases in order to channel users to a single location. We believe that doorway pages typically create a frustrating user experience, and we encourage you to correct or remove any pages that violate our quality guidelines. Once you’ve made these changes, please submit your site for reconsideration in Google’s search results. If you have any questions about how to resolve this issue, please see our Webmaster Help Forum for support.” Google Search Quality Team, 2011
Write a description that would both inform and interest users if they saw your description meta tag as a snippet in a search result. While there's no minimal or maximal length for the text in a description meta tag, we recommend making sure that it's long enough to be fully shown in Search (note that users may see different sized snippets depending on how and where they search), and contains all the relevant information users would need to determine whether the page will be useful and relevant to them.
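As a rough sanity check on that advice, you can extract a page’s description meta tag and flag obvious problems. Note that the 50–160 character range used below is a common rule of thumb among SEOs, not a documented Google limit, and the example page is fictional:

```python
from html.parser import HTMLParser

class MetaDescriptionParser(HTMLParser):
    """Collects the content of <meta name="description"> tags."""
    def __init__(self):
        super().__init__()
        self.descriptions = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "description":
            self.descriptions.append(attrs.get("content", ""))

def check_description(html):
    """Return (description, warnings) for a page's meta description."""
    parser = MetaDescriptionParser()
    parser.feed(html)
    if not parser.descriptions:
        return None, ["no meta description found"]
    desc = parser.descriptions[0]
    warnings = []
    # 50-160 characters is a widely used rule of thumb, not an official limit.
    if len(desc) < 50:
        warnings.append("description may be too short to be informative")
    elif len(desc) > 160:
        warnings.append("description may be truncated in some result snippets")
    return desc, warnings

# Hypothetical page markup for illustration only.
page = ('<html><head><meta name="description" content="Hand-made oak dining '
        'tables, built to order in our Glasgow workshop and delivered across '
        'the UK."></head></html>')
desc, warns = check_description(page)
```

A check like this only catches mechanical issues; whether the description actually informs and interests users is still an editorial judgement.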
Google is a link-based search engine. Google doesn’t need content to rank pages, but it needs content to give to users. Google needs to find content, and it finds content by following links, just as you do when clicking on one. So you first need to make sure you tell the world about your site so that other sites link to yours. Don’t worry about reciprocating to more powerful sites or even real sites – I think this adds to your domain authority, which is better to have than ranking for just a few narrow key terms.
Google asks quality raters to investigate your reputation by searching for it, giving the example: “[“ibm.com” reviews –site:ibm.com]: A search on Google for reviews of “ibm.com” which excludes pages on ibm.com.” So I would run that search yourself and judge for yourself what your reputation is. Very low ratings on independent websites could play a factor in where you rank in the future, with Google stating clearly that it considers “very low ratings on the BBB site to be evidence for a negative reputation“. Other sites mentioned as places to review your business include Yelp and Amazon. Often – using rich snippets containing schema.org information – you can get Google to display user ratings in the actual SERPs. I noted you can get ‘stars in SERPs’ within two days of adding the code (March 2014).
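The schema.org markup behind those star ratings is usually embedded as a JSON-LD script block. A minimal sketch, using a fictional product and made-up rating figures, might be generated like this:

```python
import json

# Hypothetical example values for a fictional product page.
product_jsonld = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "128",
    },
}

# Embed the structured data in the page head as a JSON-LD script block,
# the format Google's structured data documentation recommends.
prefix = '<script type="application/ld+json">'
script_tag = prefix + json.dumps(product_jsonld) + "</script>"
```

Whether Google actually shows the stars is at its discretion; the markup only makes your ratings eligible for rich results.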
Ensure redirected domains redirect through a canonical redirect and that any chains are minimised, although BE SURE to audit the backlink profile of anything you redirect at a page, as with reward comes punishment if those backlinks are toxic (another example of Google opening up the war that is technical SEO on a front that isn’t about building backlinks to your site, and is in fact the converse of it).
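Minimising chains means every legacy URL should point straight at its final destination rather than hopping through intermediates. A minimal sketch of flattening a redirect map, assuming your rules can be expressed as a hypothetical {source: target} dict, looks like this:

```python
def flatten_redirects(redirects):
    """Collapse redirect chains so every source URL maps directly to its
    final destination, i.e. each request involves at most one hop.

    `redirects` is a {source_url: target_url} dict - a hypothetical
    stand-in for your server's redirect rules.
    """
    flat = {}
    for source in redirects:
        target = redirects[source]
        seen = {source}
        # Follow the chain until we reach a URL that redirects no further.
        while target in redirects:
            if target in seen:  # guard against redirect loops
                raise ValueError(f"redirect loop involving {target}")
            seen.add(target)
            target = redirects[target]
        flat[source] = target
    return flat

# A two-hop chain: /old -> /interim -> /new (example URLs only).
rules = {
    "http://example.com/old": "http://example.com/interim",
    "http://example.com/interim": "https://example.com/new",
}
flat = flatten_redirects(rules)
# After flattening, both old URLs point straight at the canonical destination.
```

You would then rewrite the flattened map back into your server configuration, so no visitor (or crawler) pays for more than one hop.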
QUOTE: “Supplementary Content contributes to a good user experience on the page, but does not directly help the page achieve its purpose. SC is controlled by webmasters and is an important part of the user experience. One common type of SC is navigation links that allow users to visit other parts of the website. Note that in some cases, content behind tabs may be considered part of the SC of the page. Sometimes the easiest way to identify SC is to look for the parts of the page that are not MC or Ads.” Google Search Quality Evaluator Guidelines 2019
Think about the words that a user might search for to find a piece of your content. Users who know a lot about the topic might use different keywords in their search queries than someone who is new to the topic. For example, a long-time football fan might search for [fifa], an acronym for the Fédération Internationale de Football Association, while a new fan might use a more general query like [football playoffs]. Anticipating these differences in search behavior and accounting for them while writing your content (using a good mix of keyword phrases) could produce positive results. Google Ads provides a handy Keyword Planner that helps you discover new keyword variations and see the approximate search volume for each keyword. Also, Google Search Console provides you with the top search queries your site appears for and the ones that led the most users to your site in the Performance Report.
Think about how Google can algorithmically and manually determine the commercial intent of your website – think about the signals that differentiate a real small business website from a website created JUST to send visitors to another website. Affiliate links on every page, for instance, or adverts above the fold on your site, can be a clear indicator of a webmaster’s particular commercial intent – hence why Google has a Top Heavy Algorithm.
Brian, I’m going through Step 3, which refers to serving one version of the website. I found a very good free tool (https://varvy.com/tools/redirects/) to recommend. It checks the redirect and gives you a visual count of hops – more hops mean more delay. For example, if I use your manual method to check https://uprenew.com, all looks good. However, if I use the tool, I realize there is an unnecessary extra hop/delay, which I can then fix. Hope this helps. : )
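The hop counting such tools do can be sketched in a few lines. In real use the fetch step would issue an HTTP HEAD request and read the Location header; here it is injected as a callable so the logic runs without a network, against a simulated site (the domain below is a made-up stand-in, not the commenter's real one):

```python
def count_redirect_hops(url, fetch, max_hops=10):
    """Follow a chain of HTTP redirects and count the hops.

    `fetch` is any callable returning (status_code, location_header);
    in production it would perform a real request, here it is injected
    so the logic is testable offline.
    """
    hops = 0
    while hops < max_hops:
        status, location = fetch(url)
        if status in (301, 302, 307, 308) and location:
            url = location
            hops += 1
        else:
            return hops, url
    raise RuntimeError("too many redirects")

# Simulated responses for a hypothetical site: the http URL takes one
# (avoidable) hop to the https version, which is the final page.
responses = {
    "http://uprenew.example/": (301, "https://uprenew.example/"),
    "https://uprenew.example/": (200, None),
}
hops, final = count_redirect_hops("http://uprenew.example/",
                                  lambda u: responses[u])
```

A count of zero is the ideal: the URL you publish is the URL that answers.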