QUOTE: “common with forums is low-quality user-generated content. If you have ways of recognizing this kind of content, and blocking it from indexing, it can make it much easier for algorithms to review the overall quality of your website. The same methods can be used to block forum spam from being indexed for your forum. Depending on the forum, there might be different ways of recognizing that automatically, but it’s generally worth finding automated ways to help you keep things clean & high-quality, especially when a site consists of mostly user-generated content.” John Mueller, Google
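If you can score user-generated content programmatically, you can automate the "blocking it from indexing" part Mueller describes. A minimal sketch of the idea in Python, assuming an entirely made-up spam heuristic – the signals, thresholds and function name here are illustrative, not anything Google publishes:

def robots_meta_for_thread(post_count, author_reputation, link_count):
    # Hypothetical heuristic: link-heavy threads from brand-new,
    # low-reputation authors are blocked from indexing until reviewed.
    looks_spammy = author_reputation < 10 and link_count > 5 and post_count < 2
    if looks_spammy:
        return '<meta name="robots" content="noindex, nofollow">'
    return '<meta name="robots" content="index, follow">'

print(robots_meta_for_thread(post_count=1, author_reputation=0, link_count=12))

The same decision could be delivered as an X-Robots-Tag HTTP header instead of a meta element, if the page template is awkward to modify.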
While Google never sells better ranking in our search results, several other search engines combine pay-per-click or pay-for-inclusion results with their regular web search results. Some SEOs will promise to rank you highly in search engines, but place you in the advertising section rather than in the search results. A few SEOs will even change their bid prices in real time to create the illusion that they "control" other search engines and can place themselves in the slot of their choice. This scam doesn't work with Google because our advertising is clearly labeled and separated from our search results, but be sure to ask any SEO you're considering which fees go toward permanent inclusion and which apply toward temporary advertising.
Redirecting is the act of sending a user to a different URL than the one initially requested. There are many good reasons to redirect from one URL to another, for example, when a website moves to a new address. However, some redirects are designed to deceive search engines and users. These are a very poor user experience, and users may feel tricked or confused. We will call these “sneaky redirects.” Sneaky redirects are deceptive and should be rated Lowest.
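For contrast, a legitimate redirect is simple and transparent: the old URL answers with a permanent 301 status and a Location header pointing at the new address. A minimal standard-library Python sketch, with hypothetical example paths:

from http.server import BaseHTTPRequestHandler, HTTPServer

MOVED = {"/old-page": "/new-page"}  # hypothetical old path -> new path

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in MOVED:
            # 301: the move is permanent, intentional and visible to everyone.
            self.send_response(301)
            self.send_header("Location", MOVED[self.path])
        else:
            self.send_response(404)
        self.end_headers()

HTTPServer(("localhost", 8080), RedirectHandler).serve_forever()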
Ensure redirected domains redirect through a single canonical redirect, and that this too has any chains minimised. BE SURE to audit the backlink profile of any domain you redirect at a page: with reward comes punishment if those backlinks are toxic (another example of Google opening up the war that is technical SEO on a front that is the converse of building backlinks to your site).
Brian, I’m going through Step 3, which refers to using one version of the website. I found a very good free tool (https://varvy.com/tools/redirects/) to recommend. It checks the redirect and gives you a visual count of hops. More hops mean more delay. For example, if I use your manual method to check https://uprenew.com, all looks good. However, if I use the tool and check, I realize there is an unnecessary extra hop/delay, which I can then fix. Hope this helps. : )
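If the tool ever disappears, the same hop check is easy to script. A rough equivalent in Python, assuming the third-party requests library is installed – response.history holds one entry per intermediate redirect, so its length is the hop count:

import requests  # third-party: pip install requests

def count_hops(url):
    response = requests.get(url, allow_redirects=True, timeout=10)
    # response.history holds one Response per intermediate redirect.
    for hop in response.history:
        print(hop.status_code, hop.url)
    print(f"Final ({response.status_code}): {response.url}")
    print(f"Hops: {len(response.history)}")

count_hops("https://uprenew.com")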
However, we do expect websites of large companies and organizations to put a great deal of effort into creating a good user experience on their website, including having helpful SC. For large websites, SC may be one of the primary ways that users explore the website and find MC, and a lack of helpful SC on large websites with a lot of content may be a reason for a Low rating.

Flash is a proprietary plug-in created by Macromedia to deliver (admittedly) fantastically rich media for your websites. The W3C advises you avoid the use of such proprietary technology to construct an entire site. Instead, build your site with CSS and HTML, ensuring everyone, including search engine robots, can sample your website content. Then, if required, you can embed media files such as Flash in the HTML of your website.
QUOTE: “Starting April 21 (2015), we will be expanding our use of mobile-friendliness as a ranking signal. This change will affect mobile searches in all languages worldwide and will have a significant impact in our search results. Consequently, users will find it easier to get relevant, high-quality search results that are optimized for their devices”. GOOGLE
Experience can educate you when a page is high-quality and yet receives no traffic. If the page is thin, but is not manipulative, is indeed ‘unique’ and delivers on a purpose with little obvious detectable reason to mark it down, then you can say it is a high-quality page – just with very little search demand for it. Ignored content is not the same as ‘toxic’ content.
If you are just starting out, don’t think you can fool Google about everything all the time. Google has VERY probably seen your tactics before. So, it’s best to keep your plan simple. GET RELEVANT. GET REPUTABLE. Aim for a healthy, satisfying visitor experience. If you are just starting out – you may as well learn how to do it within Google’s Webmaster Guidelines first. Make a decision, early, if you are going to follow Google’s guidelines, or not, and stick to it. Don’t be caught in the middle with an important project. Do not always follow the herd.
The reality in 2020 is that if Google classifies your duplicate content as THIN content, or MANIPULATIVE BOILER-PLATE or NEAR-DUPLICATE ‘SPUN’ content, then you probably DO have a severe problem that violates Google’s website performance recommendations, and this ‘violation’ will need to be ‘cleaned up’ – if, of course, you intend to rank high in Google.
Search engines find and catalog web pages through spidering (also known as webcrawling) software. Spidering software "crawls" through the internet and grabs information from websites which is used to build search engine indexes. Unfortunately, not all search engine spidering software works the same way, so what gives a page a high ranking on one search engine may not necessarily give it a high ranking on another. Note that rather than waiting for a search engine to discover a newly created page, web designers can submit the page directly to search engines for cataloging.
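A toy illustration of the idea, using only the Python standard library: fetch a page, extract its links, and queue them for fetching in turn. Real spiders also respect robots.txt, rate limits and canonical signals; this sketch skips all of that:

from html.parser import HTMLParser
from urllib.request import urlopen
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, max_pages=5):
    queue, seen = [start_url], set()
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except OSError:
            continue  # unreachable pages are simply skipped
        parser = LinkExtractor()
        parser.feed(html)
        print(f"{url}: {len(parser.links)} links found")
        queue.extend(urljoin(url, link) for link in parser.links)

crawl("https://example.com")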
QUOTE: “If an off-domain link is made by an anonymous or unauthenticated user, I’d use nofollow on that link. Once a user has done a certain number of posts/edits, or has been around for long enough to build up trust, then those nofollows could be removed and the links could be trusted. Anytime you have a user that you’d trust, there’s no need to use nofollow links.” Matt Cutts, Google 2006
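A minimal sketch of that policy in Python – off-domain links from unproven users get rel="nofollow", established users’ links do not. The trust threshold and function name are illustrative assumptions:

from html import escape

TRUST_THRESHOLD = 50  # hypothetical: posts/edits before a user's links are trusted

def render_user_link(url, text, user_post_count):
    # Unproven users' off-domain links carry rel="nofollow".
    rel = "" if user_post_count >= TRUST_THRESHOLD else ' rel="nofollow"'
    return f'<a href="{escape(url, quote=True)}"{rel}>{escape(text)}</a>'

print(render_user_link("https://example.com", "my site", user_post_count=3))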

QUOTE: “We do use it for ranking, but it’s not the most critical part of a page. So it’s not worthwhile filling it with keywords to hope that it works that way. In general, we try to recognise when a title tag is stuffed with keywords because that’s also a bad user experience for users in the search results. If they’re looking to understand what these pages are about and they just see a jumble of keywords, then that doesn’t really help.” John Mueller, Google 2016
QUOTE: “So it’s not something where we’d say, if your website was previously affected, then it will always be affected. Or if it wasn’t previously affected, it will never be affected.… sometimes we do change the criteria…. category pages…. (I) wouldn’t see that as something where Panda would say, this looks bad.… Ask them the questions from the Panda blog post….. usability, you need to work on.“ John Mueller, Google.
The Java program is fairly intuitive, with easy-to-navigate tabs. Additionally, you can export any or all of the data into Excel for further analysis. So say you're using Optify, Moz, or RavenSEO to monitor your links or rankings for specific keywords -- you could simply create a .csv file from your spreadsheet, make a few adjustments for the proper formatting, and upload it to those tools.
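The exact columns each tool expects vary, so check its import documentation, but the CSV preparation itself is trivial to script. A small Python sketch with placeholder URL/keyword/position data:

import csv

rows = [
    {"url": "https://example.com/page-1", "keyword": "blue widgets", "position": 4},
    {"url": "https://example.com/page-2", "keyword": "red widgets", "position": 11},
]

with open("rankings.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["url", "keyword", "position"])
    writer.writeheader()  # header row most import tools expect
    writer.writerows(rows)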
Ever heard of Maslow's hierarchy of needs? It's a theory of psychology that prioritizes the most fundamental human needs (like air, water, and physical safety) over more advanced needs (like esteem and social belonging). The theory is that you can't achieve the needs at the top without ensuring the more fundamental needs are met first. Love doesn't matter if you don't have food.

Provide full functionality on all devices. Mobile users expect the same functionality - such as commenting and check-out - and content on mobile as well as on all other devices that your website supports. In addition to textual content, make sure that all important images and videos are embedded and accessible on mobile devices. For search engines, provide all structured data and other metadata - such as titles, descriptions, link-elements, and other meta-tags - on all versions of the pages.
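One way to keep structured data identical across desktop and mobile templates is to generate it once, server-side. A minimal sketch in Python that emits a schema.org JSON-LD block – the property values are placeholder examples:

import json

def json_ld_script(headline, description):
    data = {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "description": description,
    }
    # Embed the same block in every template - mobile and desktop alike.
    return f'<script type="application/ld+json">{json.dumps(data)}</script>'

print(json_ld_script("Example headline", "Example description."))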
Many think that Google won’t allow new websites to rank well for competitive terms until the web address “ages” and acquires “trust” in Google – I think this depends on the quality of the incoming links. Sometimes your site will rank high for a while then disappears for months. A “honeymoon period” to give you a taste of Google traffic, perhaps, or a period to better gauge your website quality from an actual user perspective.
Creating high quality content takes a significant amount of at least one of the following: time, effort, expertise, and talent/skill. Content should be factually accurate, clearly written, and comprehensive. So, for example, if you describe your page as a recipe, provide a complete recipe that is easy to follow, rather than just a set of ingredients or a basic description of the dish.

Expertise and authoritativeness of a site increases its quality. Be sure that content on your site is created or edited by people with expertise in the topic. For example, providing expert or experienced sources can help users understand articles’ expertise. Representing well-established consensus in pages on scientific topics is a good practice if such consensus exists.
QUOTE: “You always need textual content on-page, regardless of what other kinds of content you might have. If you’re a video-hosting site, you still need things like titles, headings, text, links, etc. The same goes for audio-hosting sites. Make it easy for search engines to understand your content & how it’s relevant to users, and they’ll be able to send you relevant traffic. If you make it hard for search engines to figure out what your pages are about, it would be normal for them to struggle to figure out how your site is relevant for users.” John Mueller, Google 2019

QUOTE: “The Knowledge Graph enables you to search for things, people or places that Google knows about—landmarks, celebrities, cities, sports teams, buildings, geographical features, movies, celestial objects, works of art and more—and instantly get information that’s relevant to your query. This is a critical first step towards building the next generation of search, which taps into the collective intelligence of the web and understands the world a bit more like people do.” Amit Singhal, Google 2012

While the title tag is effectively your search listing’s headline, the meta description (another meta HTML element that can be updated in your site’s code, but isn’t seen on your actual page) is effectively your site’s additional ad copy. Google takes some liberties with what they display in search results, so your meta description may not always show, but if you have a compelling description of your page that would make folks searching likely to click, you can greatly increase traffic. (Remember: showing up in search results is just the first step! You still need to get searchers to come to your site, and then actually take the action you want.)
NOTE, in 2020, the HTML title element you choose for your page, may not be what Google chooses to include in your SERP snippet. The search snippet title and description is very much QUERY & DEVICE dependent these days. Google often chooses what it thinks is the most relevant title for your search snippet, and it can use information from your page, or in links to that page, to create a very different SERP snippet title.
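It is worth auditing what you actually serve before worrying about what Google rewrites. A small standard-library Python sketch that fetches a page and reports the title element and meta description as delivered, so you can compare them against your live snippet:

from html.parser import HTMLParser
from urllib.request import urlopen

class SnippetSourceParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta_description = ""

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.meta_description = attrs.get("content") or ""

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

html = urlopen("https://example.com", timeout=10).read().decode("utf-8", "replace")
parser = SnippetSourceParser()
parser.feed(html)
print("Title:", parser.title.strip())
print("Meta description:", parser.meta_description)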
The transparency you provide on your website in text and links about who you are, what you do, and how you’re rated on the web or as a business is one way that Google could use (algorithmically and manually) to ‘rate’ your website. Note that Google has a HUGE army of quality raters and at some point they will be on your site if you get a lot of traffic from Google.
The biggest advantage any one provider has over another is experience and resources. The knowledge of what doesn’t work and what will hurt your site is often more valuable than knowing what will give you a short-lived boost. Getting to the top of Google is a relatively simple process – one that is constantly changing. Professional SEO is more a collection of skills, methods and techniques. It is more a way of doing things than a one-size-fits-all magic trick.
QUOTE: “Some Low-quality pages have adequate MC (main content on the page) present, but it is difficult to use the MC due to disruptive, highly distracting, or misleading Ads/SC. Misleading titles can result in a very poor user experience when users click a link only to find that the page does not match their expectations.” Google Search Quality Evaluator Guidelines 2015
Google will select the best title it wants for your search snippet – and it will take that information from multiple sources, NOT just your page title element. A small title is often appended with more information about the domain. Sometimes, if Google is confident in the BRAND name, it will replace it with that (often adding it to the beginning of your title with a colon, or sometimes appending the end of your snippet title with the actual domain address the page belongs to).
QUOTE: “Keyword Stuffed” Main Content Pages may be created to lure search engines and users by repeating keywords over and over again, sometimes in unnatural and unhelpful ways. Such pages are created using words likely to be contained in queries issued by users. Keyword stuffing can range from mildly annoying to users, to complete gibberish. Pages created with the intent of luring search engines and users, rather than providing meaningful MC to help users, should be rated Lowest.” Google Search Quality Evaluator Guidelines 2017
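A naive way to make the idea concrete: measure how often a keyword appears relative to total word count. To be clear, this is NOT Google’s algorithm – just an illustrative frequency check in Python with an arbitrary 5% threshold:

import re

def keyword_density(text, keyword):
    # Share of total words that are exactly the keyword.
    words = re.findall(r"[a-z']+", text.lower())
    return words.count(keyword.lower()) / len(words) if words else 0.0

text = "cheap widgets cheap widgets buy cheap widgets today cheap"
density = keyword_density(text, "cheap")
print(f"Density: {density:.1%}")
if density > 0.05:  # arbitrary illustrative threshold
    print("Possible keyword stuffing - review the copy.")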
We all have to delete pages at some point, but when that old URL gets visitors, they bump into a 404 Not Found error. Aaaargh! To avoid this, you can redirect them to a new page with relevant information. It’s important to do this systematically to keep your website healthy. The Redirect manager allows you to do just that: after deleting a post or page, the plugin will ask you what to do with the old URL. You can also go to the menu ‘Redirects’ to see and update all your redirected pages. And you can even set ‘REGEX redirects’ to indicate that all URLs containing a certain word or expression should redirect to the same page.
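Conceptually, a REGEX redirect is just an ordered list of patterns checked against the requested path. A minimal Python sketch of the idea (the patterns and destinations are illustrative, and this is not the plugin’s actual code):

import re

# Ordered rules: any path containing a word, or matching a prefix,
# maps to a single destination. Patterns and paths are illustrative.
REGEX_REDIRECTS = [
    (re.compile(r".*discontinued.*"), "/shop/current-range/"),
    (re.compile(r"^/blog/2012/.*"), "/blog/archive/"),
]

def resolve_redirect(path):
    for pattern, destination in REGEX_REDIRECTS:
        if pattern.match(path):
            return destination
    return None  # no rule matched; serve the page (or a 404) as normal

print(resolve_redirect("/products/discontinued-widget"))  # /shop/current-range/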
QUOTE: “If you have a manual action against your site for unnatural links to your site, or if you think you’re about to get such a manual action (because of paid links or other link schemes that violate our quality guidelines), you should try to remove those links from the other site. If you can’t get these links removed, then you should disavow those links to your website.“ Google Webmaster Guidelines 2020
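The disavow file itself is plain text: comment lines start with ‘#’, whole domains are listed as ‘domain:example.com’, and individual URLs are listed in full. A small Python sketch that writes one from a list of placeholder toxic domains:

toxic_domains = ["spammy-links.example", "paid-links.example"]  # placeholders

with open("disavow.txt", "w") as f:
    f.write("# Links we asked to have removed; no response received.\n")
    for domain in toxic_domains:
        f.write(f"domain:{domain}\n")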
Yes, you need to build links to your site to acquire more PageRank, or Google ‘juice’ – or what we now call domain authority or trust. Google is a link-based search engine – it does not quite understand ‘good’ or ‘quality’ content – but it does understand ‘popular’ content. It can also usually identify poor, or THIN CONTENT – and it penalises your site for that – or – at least – it takes away the traffic you once had with an algorithm change. Google doesn’t like calling the actions it takes a ‘penalty’ – it doesn’t look good. They blame your ranking drops on their engineers getting better at identifying quality content or links, or the inverse – low-quality content and unnatural links. If they do take action against your site for paid links – they call this a ‘Manual Action’ and you will be notified about it in Google Search Console if you sign up.
Understand and accept why Google ranks your competition above you – they are either more relevant and more popular, more relevant and more reputable, delivering a better user experience, or manipulating backlinks better than you. Understand that everyone at the top of Google falls into those categories and formulate your own strategy to compete – relying on Google to take action on your behalf is VERY probably not going to happen.
For centuries, the myth of the starving artist has dominated our culture, seeping into the minds of creative people and stifling their pursuits. But the truth is that the world’s most successful artists did not starve. In fact, they capitalized on the power of their creative strength. In Real Artists Don’t Starve, Jeff Goins debunks the myth of the starving artist by unveiling the ideas that created it and replacing them with fourteen rules for artists to thrive.
Link building is not JUST a numbers game, though. One link from a “trusted authority” site in Google could be all you need to rank high in your niche. Of course, the more “trusted” links you attract, the more Google will trust your site. It is evident you need MULTIPLE trusted links from MULTIPLE trusted websites to get the most from Google in 2020.

QUOTE: “You heard right: there is no duplicate content penalty… I would avoid talking about duplicate content penalties for spam purposes, because then it’s not about duplicate content; it’s about generating, very often in an automated way, content that is not so much duplicated as kind of fast and scraped from multiple places, and then potentially monetized in some way or another. It just doesn’t serve any purpose other than to gain traffic, redirect traffic, maybe make some money for the person who created it. This is not about content that gets created for any kind of reason other than just to be there, so I don’t think about it as duplicate content. It’s the same as saying that gibberish we created is some sort of duplicate content of words that otherwise also exist. I would really separate those two issues, because the things that Ammon led in with – URLs generated on WordPress – and the things they were going to discuss today, maybe about e-commerce sites and so on – I wouldn’t want people to confuse those two.” Andrey Lipattsev, Google 2016


Make it as easy as possible for users to go from general content to the more specific content they want on your site. Add navigation pages when it makes sense and effectively work these into your internal link structure. Make sure all of the pages on your site are reachable through links, and that they don't require an internal "search" functionality to be found. Link to related pages, where appropriate, to allow users to discover similar content.
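One way to verify that every page really is reachable through links is to compare your sitemap against what a link-following crawl actually discovers. A Python sketch, assuming a standard sitemap.xml and reusing output like the spider sketch earlier in this article:

import xml.etree.ElementTree as ET
from urllib.request import urlopen

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(sitemap_url):
    # The <loc> elements of a standard sitemap hold one URL each.
    tree = ET.parse(urlopen(sitemap_url, timeout=10))
    return {loc.text.strip() for loc in tree.iter(SITEMAP_NS + "loc")}

# `crawled` would come from a link-following crawler (see the spider
# sketch above); these two URLs are placeholders.
crawled = {"https://example.com/", "https://example.com/about/"}

orphans = sitemap_urls("https://example.com/sitemap.xml") - crawled
for url in sorted(orphans):
    print("Orphan page (in sitemap, not linked):", url)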
Good news for web designers, content managers and search engine optimisers! Google clearly states: “If the website feels inadequately updated and inadequately maintained for its purpose, the Low rating is probably warranted.” Although it does stipulate, again, that it’s horses for courses; if everybody else is crap, then you’ll still fly – though there aren’t many of those SERPs about these days.
The actual content of your page itself is, of course, very important. Different types of pages will have different “jobs” – your cornerstone content asset that you want lots of folks to link to needs to be very different than your support content that you want to make sure your users find and get an answer from quickly. That said, Google has been increasingly favoring certain types of content, and as you build out any of the pages on your site, there are a few things to keep in mind:
“Sharability” – Not every single piece of content on your site will be linked to and shared hundreds of times. But in the same way you want to be careful of not rolling out large quantities of pages that have thin content, you want to consider who would be likely to share and link to new pages you’re creating on your site before you roll them out. Having large quantities of pages that aren’t likely to be shared or linked to doesn’t position those pages to rank well in search results, and doesn’t help to create a good picture of your site as a whole for search engines, either.

An SEO technique is considered white hat if it conforms to the search engines' guidelines and involves no deception. As the search engine guidelines[18][19][52] are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines but is about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see. White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the online "spider" algorithms, rather than attempting to trick the algorithm from its intended purpose. White hat SEO is in many ways similar to web development that promotes accessibility,[53] although the two are not identical.


Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[64] In markets outside the United States, Google's share is often larger, and Google remains the dominant search engine worldwide as of 2007.[65] As of 2006, Google had an 85–90% market share in Germany.[66] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[66] As of June 2008, the market share of Google in the UK was close to 90% according to Hitwise.[67] That market share is achieved in a number of countries.
Naturally, business owners want to rank for lots of keywords in organic listings with their website. The challenge for webmasters and SEO is that Google doesn’t want business owners to rank for lots of keywords using auto-generated content especially when that produces A LOT of pages on a website using (for instance) a list of keyword variations page-to-page.
When using the Keyword Explorer, Ahrefs will also produce the "parent topic" of the keyword you looked up, as you can see in the screenshot above, underneath the Keyword Difficulty meter. A keyword's parent topic is a broader keyword with higher search volume than your intended keyword, but it likely has the same audience and ranking potential -- giving you a more valuable SEO opportunity when optimizing a particular blog post or webpage.
Our Insights functionality lists the 5 words or word combinations that appear most often on your page. That way, you can check if the topics and keywords you’d like to rank for in Google, Bing and Yahoo are in line with what you’re actually writing about. This is a Premium feature. In the free version, you’ll have to do your own content analysis, we’re afraid. Currently, this feature is only available in selected languages.
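Roughly what such a feature does can be sketched in a few lines of Python – count the most frequent words in your copy. A real implementation would also strip stop words and count word combinations, which this toy version skips:

import re
from collections import Counter

def top_words(text, n=5):
    # Lower-case the copy, split into words, return the n most common.
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(words).most_common(n)

print(top_words("SEO is about content. Good content earns links, "
                "and links plus content earn rankings."))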