Big sites can rank for the most general terms. Smaller sites within a very specific niche can do the same. Of course, it’s also easier if you’re writing in a language that is not spoken all over the world. For most smaller sites that are writing in English, however, the general rule of thumb is this: start with a big set of long tail keywords which have little traffic, but which you can rank for more easily. Then, work your way up the rankings step-by-step. Once you’ve gained some SEO authority, start optimizing for more general keywords. And in the end, maybe you will even be able to rank for your head keywords!
Use the Lowest rating for websites that have extremely negative or malicious reputations. Also use the Lowest rating for violations of the Google Webmaster Quality Guidelines. Finally, the Lowest rating may be used both for pages with many low-quality characteristics and for pages whose lack of a single Page Quality characteristic makes you question the true purpose of the page. Important: Negative reputation is sufficient reason to give a page a Low quality rating. Evidence of truly malicious or fraudulent behavior warrants the Lowest rating.
Another reason is that if you're using an image as a link, the alt text for that image will be treated similarly to the anchor text of a text link. However, we don't recommend using too many images for links in your site's navigation when text links could serve the same purpose. Lastly, optimizing your image filenames and alt text makes it easier for image search projects like Google Image Search to better understand your images.
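As a sketch of the point above (the URL, filename and link text are illustrative, not from the original):

```html
<!-- An image used as a link: its alt text is treated much like anchor text -->
<a href="/guides/keyword-research/">
  <img src="/images/keyword-research-guide.png"
       alt="Beginner's guide to keyword research">
</a>

<!-- A plain text link serving the same navigational purpose -->
<a href="/guides/keyword-research/">Beginner's guide to keyword research</a>
```

Note the descriptive filename and alt text on the image; both give search engines context that a filename like `img0042.png` with empty alt text would not.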
I’ve got by by thinking that external links to other sites should probably be on single pages deeper in your site architecture, with those pages receiving all your Google Juice once it’s been “soaked up” by the higher pages in your site structure (the home page, your category pages). This tactic is old school but I still follow it. I don’t think you need to worry about that too much in 2020.

QUOTE: “high quality content is something I’d focus on. I see lots and lots of SEO blogs talking about user experience, which I think is a great thing to focus on as well. Because that essentially kind of focuses on what we are trying to look at as well. We want to rank content that is useful for (Google users) and if your content is really useful for them, then we want to rank it.” John Mueller, Google 2015
Google will INDEX perhaps 1000s of characters in a title… but I don’t think anyone knows exactly how many characters or words Google will count AS a TITLE TAG when determining RELEVANCE OF A DOCUMENT for ranking purposes. It is a very hard thing to try to isolate accurately with all the testing and obfuscation Google uses to hide its ‘secret sauce’. I have had ranking success with longer titles – much longer titles. Google certainly reads ALL the words in your page title (unless you are spamming it silly, of course).
You can go overboard and make a thorough analysis of all the competitors in your field, and that can certainly be worthwhile. But let’s stick to the basics for now. It’s actually quite easy to get a general idea of your SEO competition. Just Google some search terms you would like to rank for, see which companies show up, and compare them with where your site ranks. How big are the companies you are competing with for those top three rankings? Would your company fit within these results? This is all quite easy to determine using only Google search results.
As keywords define each page of your site, you can use them to organize your content and formulate a strategy. The most basic way to do this is to start a spreadsheet (your "content to keyword map") and identify your primary keyword for each article. You can then build your sheet to your own requirements, adding keyword search volume, organic traffic, page authority, and any other metrics that are important to your business.
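A minimal content-to-keyword map might start as simply as this (the URLs, keywords and numbers are purely illustrative):

```
URL                       | Primary keyword       | Search volume | Organic traffic
/blog/start-a-blog/       | how to start a blog   | 40,000        | 1,200
/blog/keyword-research/   | keyword research tips |  5,400        |   300
/blog/seo-basics/         | seo for beginners     |  9,900        |   850
```

Extra columns (page authority, ranking position, conversion rate) can be added as your strategy matures; the key discipline is one primary keyword per page, so pages don’t compete with each other.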
In short, nobody is going to advise you to create a poor UX, on purpose, in light of Google’s algorithms and human quality raters who are showing an obvious interest in this stuff. Google is rating mobile sites on what it classes as frustrating UX – although, on certain levels, what Google classes as ‘UX’ might be quite far apart from what a UX professional is familiar with, in the same way that Google’s mobile rating tools differ from, for instance, W3C mobile testing tools.
If you don't know the difference between head terms and long-tail keywords, let me explain. Head terms are keyword phrases that are generally shorter and more generic -- they're typically just one to three words in length, depending on who you talk to. Long-tail keywords, on the other hand, are longer keyword phrases usually containing three or more words.
If a PARTICULAR CANONICAL HEAD KEYWORD is IMPORTANT (even perhaps a SYNONYM or LONG TAIL VARIANT) and I think a particular 301 REDIRECT has some positive impact on how Google judges the quality or relevance of the page, I will make sure the CANONICAL HEAD KEYWORD and SYNONYMS are on the FINAL PAGE I redirect Google to (which is the one that will be rated and cached).
A satisfying UX can help your rankings, with second-order factors taken into consideration. A poor UX can seriously impact your human-reviewed rating, at least. Google’s punishing algorithms probably class pages as something akin to a poor UX if they meet certain detectable criteria e.g. lack of reputation or old-school SEO stuff like keyword stuffing a site.
All sites have a home or "root" page, which is usually the most frequented page on the site and the starting place of navigation for many visitors. Unless your site has only a handful of pages, you should think about how visitors will go from a general page (your root page) to a page containing more specific content. Do you have enough pages around a specific topic area that it would make sense to create a page describing these related pages (for example, root page -> related topic listing -> specific topic)? Do you have hundreds of different products that need to be classified under multiple category and subcategory pages?
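For example, the root → related topic listing → specific topic flow above might map onto URLs like this (the paths are illustrative):

```
example.com/                      (root page)
example.com/coffee/               (related topic listing: all coffee guides)
example.com/coffee/french-press/  (specific topic: brewing with a French press)
```

Each level answers a progressively narrower intent, and the listing pages give visitors (and crawlers) a natural path from the general to the specific.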
Google states, “News articles, Wikipedia articles, blog posts, magazine articles, forum discussions, and ratings from independent organizations can all be sources of reputation information”, but they also state that a site’s own claims – boasts about a lot of internet traffic, for example – should not influence the quality rating of a web page. What should influence the reputation of a page is WHO has shared it on social media etc. rather than just raw numbers of shares. CONSIDER CREATING A PAGE with nofollow links to good reviews on other websites as proof of excellence.
Important: The Lowest rating is appropriate if all or almost all of the MC on the page is copied with little or no time, effort, expertise, manual curation, or added value for users. Such pages should be rated Lowest, even if the page assigns credit for the content to another source.
Structured data is code that you can add to your site's pages to describe your content to search engines, so they can better understand what's on your pages. Search engines can use this understanding to display your content in useful (and eye-catching!) ways in search results. That, in turn, can help you attract just the right kind of customers for your business.
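For instance, a product page might describe itself with a JSON-LD block like this (the product details are purely illustrative):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Hand-roasted Coffee Beans, 250g",
  "description": "Single-origin beans, roasted to order.",
  "offers": {
    "@type": "Offer",
    "price": "8.99",
    "priceCurrency": "GBP",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```

Markup like this is what makes a page eligible for enhanced results such as price and stock status appearing directly in the search snippet.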
It makes sense, even in 2020, that this process of labeling is how Google builds a search engine results page out of the pages PageRank identifies: it can identify spam, identify monetisation trends, and promote content-first, user-friendly pages above others. You can also imagine that, over time, Google should get a lot better at working out quality SERPs for its users, as it identifies more and more NEGATIVE ranking signals, thereby floating higher-quality pages to the top as a second-order effect. An end result of this could be that Google gets an amazing SERP for its users.
The higher the search volume for a given keyword or keyword phrase, the more work is typically required to achieve higher rankings. This is often referred to as keyword difficulty and occasionally incorporates SERP features; for example, if many SERP features (like featured snippets, knowledge graph, carousels, etc) are clogging up a keyword’s result page, difficulty will increase. Big brands often take up the top 10 results for high-volume keywords, so if you’re just starting out on the web and going after the same keywords, the uphill battle for ranking can take years of effort.

Good news for web designers, content managers and search engine optimisers! Google clearly states, “If the website feels inadequately updated and inadequately maintained for its purpose, the Low rating is probably warranted”, although it does stipulate, again, that it’s horses for courses: if everybody else’s pages are poor, then you’ll still fly – though there aren’t many SERPs like that about these days.
Let's say, for example, you're researching the keyword "how to start a blog" for an article you want to create. "Blog" can mean a blog post or the blog website itself, and what a searcher's intent is behind that keyword will influence the direction of your article. Does the searcher want to learn how to start an individual blog post? Or do they want to know how to actually launch a website domain for the purposes of blogging? If your content strategy is only targeting people interested in the latter, you'll need to make sure of the keyword's intent before committing to it.
QUOTE: “… it also includes things like the comments, includes the things like the unique and original content that you’re putting out on your site that is being added through user-generated content, all of that as well. So while I don’t really know exactly what our algorithms are looking at specifically with regards to your website, it’s something where sometimes you go through the articles and say well there is some useful information in this article that you’re sharing here, but there’s just lots of other stuff happening on the bottom of these blog posts. When our algorithms look at these pages, in an aggregated way across the whole page, then that’s something where they might say well, this is a lot of content that is unique to this page, but it’s not really high quality content that we want to promote in a very visible way. That’s something where I could imagine that maybe there’s something you could do, otherwise it’s really tricky I guess to look at specific changes you can do when it comes to our quality algorithms.” John Mueller, Google 2016
Many search engine marketers think who you link out to (and who links to you) helps determine a topical community of sites in any field or a hub of authority. Quite simply, you want to be in that hub, at the centre if possible (however unlikely), but at least in it. I like to think of this one as a good thing to remember in the future as search engines get even better at determining topical relevancy of pages, but I have never actually seen any granular ranking benefit (for the page in question) from linking out.
An SEO meta description is a brief description of content found on a web page. The meta description is shown to users in search engine results pages to help them decide which search result to click on. Meta descriptions are not a direct ranking factor in the eyes of a search engine, but they can influence how many people click on a result -- and that click-through can, in turn, affect the result's organic performance.
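In the page's HTML, the meta description sits in the head alongside the title; the wording below is purely illustrative:

```html
<head>
  <title>How to Start a Blog in 2020 – Step-by-Step Guide</title>
  <meta name="description"
        content="Learn how to start a blog from scratch: picking a domain,
                 choosing a platform, and publishing your first post.">
</head>
```

Keeping the description an accurate, compelling summary of the page (rather than a keyword list) is what earns the click.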
If you are improving user experience by focusing primarily on the quality of the MC of your pages, and avoiding – even removing – old-school SEO techniques, those are certainly positive steps towards getting more traffic from Google in 2020 – and the type of content performance Google rewards is, in the end, largely about a satisfying user experience.
Repeat this exercise for as many topic buckets as you have. And remember, if you're having trouble coming up with relevant search terms, you can always head on over to your customer-facing colleagues -- those who are in Sales or Service -- and ask them what types of terms their prospects and customers use, or common questions they have. Those are often great starting points for keyword research.
The above information does not need to feature on every page – rather, on a clearly accessible page. However – with Google Quality Raters rating web pages on quality based on Expertise, Authority and Trust (see my recent making high-quality websites post) – ANY signal you can send to an algorithm or human reviewer’s eyes that you are a legitimate business is probably a sensible move at this time (if you have nothing to hide, of course).
QUOTE: “If an off-domain link is made by an anonymous or unauthenticated user, I’d use nofollow on that link. Once a user has done a certain number of posts/edits, or has been around for long enough to build up trust, then those nofollows could be removed and the links could be trusted. Anytime you have a user that you’d trust, there’s no need to use nofollow links.” Matt Cutts, Google 2006
After a while, Google will know about your pages, and keep the ones it deems ‘useful’ – pages with original content, or pages with a lot of links to them. The rest will be de-indexed. Be careful – too many low-quality pages on your site will impact your overall site performance in Google. Google is on record talking about good and bad ratios of quality content to low-quality content.
To get even more insight and data to help you make those decisions, sign up for a free trial of Alexa’s Advanced Plan. You’ll get access to tools that help you research competitor search and link building strategies, find keyword opportunities, review your site’s SEO, and learn about your target audience. These insights, paired with what you know about SEM and SEO, will help you uncover the best search marketing strategy for your unique brand and goals.
QUOTE: “The duration performance scores can be used in scoring resources and websites for search operations. The search operations may include scoring resources for search results, prioritizing the indexing of websites, suggesting resources or websites, protecting particular resources or websites from demotions, precluding particular resources or websites from promotions, or other appropriate search operations.” A Panda Patent on Website and Category Visit Durations

Understand and accept why Google ranks your competition above you – they are either: more relevant and more popular, more relevant and more reputable, offering a better user experience, or manipulating backlinks better than you. Understand that everyone at the top of Google falls into those categories and formulate your own strategy to compete – relying on Google to take action on your behalf is VERY probably not going to happen.
Hi Kai! Using synonyms now and then is a great idea. It definitely makes your copy more natural and pleasant to read. And Google will comprehend that you’re talking about the same topic. You might want to use the keyword you’d like to rank for a bit more often than just once or twice, but don’t be afraid to use synonyms now and then. We’re not able to detect them with our tool yet, though.
In the last year, Google and Bing have both indicated a shift to entity-based search results as part of their evolution. Google has underscored this point with rich snippets and Knowledge Graph, and Bing has now upped the ante on personal search results with Bing Snapshots. Find out how you can adopt strategies to stay ahead of the curve in the new world of semantic search results.
Google knows who links to you, the “quality” of those links, and whom you link to. These – and other factors – help ultimately determine where a page on your site ranks. To make it more confusing – the page that ranks on your site might not be the page you want to rank, or even the page that determines your rankings for this term. Once Google has worked out your domain authority – sometimes it seems that the most relevant page on your site Google HAS NO ISSUE with will rank.
The biggest advantage any one provider has over another is experience and resources. The knowledge of what doesn’t work and what will hurt your site is often more valuable than knowing what will give you a short-lived boost. Getting to the top of Google is a relatively simple process – one that is constantly changing. Professional SEO is more a collection of skills, methods and techniques; it is more a way of doing things than a one-size-fits-all magic trick.
QUOTE: “Anytime you do a bigger change on your website if you redirect a lot of URLs or if you go from one domain to another or if you change your site’s structure then all of that does take time for things to settle down so we can follow that pretty quickly we can definitely forward the signals there but that doesn’t mean that’ll happen from one day to next” John Mueller, Google 2016
Basically, SEO keyword research should be an ongoing and ever-evolving part of your job as a marketer. Old keywords need to be reevaluated periodically, and high-volume, competitive keywords (or “head” keywords, as opposed to long-tailed keywords) can often be usefully replaced or augmented with longer, more specific phrases designed not to bring in just any visitor but exactly the right visitors. (Who visits your site – particularly if they’re people who are actively looking for your services – is at least as important as how many people visit.)

I think anchor text links in internal navigation are still valuable – but keep it natural. Google needs links to find and help categorise your pages. Don’t underestimate the value of a clever, keyword-rich internal link architecture, and be sure to understand, for instance, how many words Google counts in a link – but don’t overdo it. Too many links on a page could be seen as a poor user experience. Avoid lots of hidden links in your template navigation.
You can confer some of your site's reputation to another site when your site links to it. Sometimes users can take advantage of this by adding links to their own site in your comment sections or message boards. Or sometimes you might mention a site in a negative way and don't want to confer any of your reputation upon it. For example, imagine that you're writing a blog post on the topic of comment spamming and you want to call out a site that recently comment spammed your blog. You want to warn others of the site, so you include the link to it in your content; however, you certainly don't want to give the site some of your reputation from your link. This would be a good time to use nofollow.
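The difference in markup is a single attribute; the URLs below are illustrative:

```html
<!-- A normal link confers some of your site's reputation on the target -->
<a href="https://example.com/great-review/">a glowing review of our product</a>

<!-- rel="nofollow" tells search engines not to confer reputation via this link -->
<a href="https://spammy-example.com/" rel="nofollow">the site that spammed our comments</a>
```

Many blog platforms add `rel="nofollow"` to comment links automatically for exactly the reason described above.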