QUOTE: “News and current events: news about important topics such as international events, business, politics, science, technology, etc. Keep in mind that not all news articles are necessarily considered YMYL (e.g., sports, entertainment, and everyday lifestyle topics are generally not YMYL). Please use your judgment and knowledge of your locale.
● Civics, government, and law: information important to maintaining an informed citizenry, such as information about voting, government agencies, public institutions, social services, and legal issues (e.g., divorce, child custody, adoption, creating a will, etc.).
● Finance: financial advice or information regarding investments, taxes, retirement planning, loans, banking, or insurance, particularly webpages that allow people to make purchases or transfer money online.
● Shopping: information about or services related to research or purchase of goods/services, particularly webpages that allow people to make purchases online.
● Health and safety: advice or information about medical issues, drugs, hospitals, emergency preparedness, how dangerous an activity is, etc.
● Groups of people: information about or claims related to groups of people, including but not limited to those grouped on the basis of race or ethnic origin, religion, disability, age, nationality, veteran status, sexual orientation, gender or gender identity.
● Other: there are many other topics related to big decisions or important aspects of people’s lives which thus may be considered YMYL, such as fitness and nutrition, housing information, choosing a college, finding a job, etc.” Google Search Quality Evaluator Guidelines 2019
QUOTE: “When our quality algorithms go to your website, and they see that there’s some good content here on this page, but there’s some really bad or kind of low quality content on the bottom part of the page, then we kind of have to make a judgment call on these pages themselves and say, well, some good, some bad. Is this overwhelmingly bad? Is this overwhelmingly good? Where do we draw the line?” John Mueller, Google 2016
Search engines use complex mathematical algorithms to interpret which websites a user seeks. Picture a diagram in which each bubble represents a website and arrows represent the links between them: programs sometimes called spiders examine which sites link to which other sites. Websites receiving more inbound links, or stronger links, are presumed to be more important and more likely to be what the user is searching for. In such an example, since website B receives numerous inbound links, it ranks more highly in a web search. And the links "carry through": website C, even though it has only one inbound link, benefits because that link comes from a highly popular site (B), while site E does not.
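The idea can be made concrete with a toy computation. Below is a minimal, illustrative PageRank-style iteration over a hypothetical five-site link graph; the site names and link structure are assumptions for illustration, and 0.85 is the damping factor from the original PageRank paper, not anything specific to this example:

```python
# Minimal, illustrative PageRank-style iteration over a toy link graph.
# Site names and links are hypothetical assumptions for illustration.

links = {  # site -> sites it links out to
    "A": ["B"],
    "C": ["B"],
    "D": ["B", "A"],
    "E": ["D"],
    "B": ["C"],
}
sites = list(links)
rank = {s: 1 / len(sites) for s in sites}  # start with equal scores
damping = 0.85  # damping factor from the original PageRank paper

for _ in range(50):  # iterate until scores stabilise
    new_rank = {}
    for site in sites:
        # Sum the share of rank passed on by every site linking here.
        inbound = sum(
            rank[src] / len(outs)
            for src, outs in links.items()
            if site in outs
        )
        new_rank[site] = (1 - damping) / len(sites) + damping * inbound
    rank = new_rank

for site, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{site}: {score:.2%}")  # B scores highest; C benefits from B's link
```

Running this shows B on top (three inbound links) and C close behind despite its single inbound link, because that link comes from B, which mirrors the "carry through" effect described above.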
Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, webmasters needed only to submit the address of a page, or URL, to the various engines, which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
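That crawl-extract-schedule loop can be sketched in a few lines. This is a deliberately simplified toy, not how any real search engine's pipeline works; word counts stand in for whatever weighting a real indexer applies:

```python
# A toy "spider" plus "indexer": download a page, pull out its links
# and words, and queue newly found links for a later crawl.
from collections import Counter, deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkAndTextParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links, self.words = [], []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

    def handle_data(self, data):
        self.words.extend(data.lower().split())


def crawl(seed_url, max_pages=5):
    queue, index = deque([seed_url]), {}
    while queue and len(index) < max_pages:
        url = queue.popleft()
        if url in index:
            continue
        parser = LinkAndTextParser()
        parser.feed(urlopen(url).read().decode("utf-8", errors="ignore"))
        # "Indexer" step: store word frequencies as a crude weight.
        index[url] = Counter(parser.words)
        # "Scheduler" step: queue outgoing links for a later crawl.
        queue.extend(urljoin(url, link) for link in parser.links)
    return index


if __name__ == "__main__":
    for url, words in crawl("https://example.com/").items():
        print(url, words.most_common(5))
```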
Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[64] In markets outside the United States, Google's share is often larger, and Google remains the dominant search engine worldwide as of 2007.[65] As of 2006, Google had an 85–90% market share in Germany.[66] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[66] As of June 2008, the market share of Google in the UK was close to 90% according to Hitwise.[67] Google holds similarly dominant market share in a number of other countries.

QUOTE: “To make our results more useful, we’ve begun experiments to make our index mobile-first. Although our search index will continue to be a single index of websites and apps, our algorithms will eventually primarily use the mobile version of a site’s content to rank pages from that site, to understand structured data, and to show snippets from those pages in our results. Of course, while our index will be built from mobile documents, we’re going to continue to build a great search experience for all users, whether they come from mobile or desktop devices. If you have a responsive site or a dynamic serving site where the primary content and markup is equivalent across mobile and desktop, you shouldn’t have to change anything.” Doantam Phan, Google 2017

I used to think it could take more to get a subfolder trusted than, say, an individual file, and I guess this swayed me towards using files on most websites I created (back in the day). Once subfolders are trusted, it’s six of one, half a dozen of the other as to the actual difference in terms of ranking in Google – usually, rankings in Google are determined more by how RELEVANT or REPUTABLE a page is to a query.
QUOTE: “The manual actions team… can look at the labels on the links a site gets. Basically, we have tons of link labels; for example, it’s a footer link, basically, that has a lot lower value than an in-content link. Then another label would be a Penguin real-time label. If they see that most of the links are Penguin real-time labelled, then they might actually take a deeper look and see what the content owner is trying to do.” Gary Illyes, Google 2016
If you take money online, in any way, you NEED to have an accessible and satisfying ‘customer service’ type page. Google says, “Contact information and customer service information are extremely important for websites that handle money, such as stores, banks, credit card companies, etc. Users need a way to ask questions or get help when a problem occurs. For shopping websites, we’ll ask you to do some special checks. Look for contact information—including the store’s policies on payment, exchanges, and returns.” Google urges quality raters to be a ‘detective’ in finding this information about you – so it must be important to them.

Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[10] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[11] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as Altavista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[12]
QUOTE: “…common with forums is low-quality user-generated content. If you have ways of recognizing this kind of content, and blocking it from indexing, it can make it much easier for algorithms to review the overall quality of your website. The same methods can be used to block forum spam from being indexed for your forum. Depending on the forum, there might be different ways of recognizing that automatically, but it’s generally worth finding automated ways to help you keep things clean & high-quality, especially when a site consists of mostly user-generated content.” John Mueller, Google
QUOTE: “What makes a page spammy?:
● Hidden text or links – may be exposed by selecting all page text and scrolling to the bottom (all text is highlighted), disabling CSS/Javascript, or viewing source code.
● Sneaky redirects – redirecting through several URLs, rotating destination domains, cloaking with JavaScript redirects and 100% frame.
● Keyword stuffing – no percentage or keyword density given; this is up to the rater.
● PPC ads that only serve to make money, not help users.
● Copied/scraped content and PPC ads.
● Feeds with PPC ads.
● Doorway pages – multiple landing pages that all direct the user to the same destination.
● Templates and other computer-generated pages mass-produced, marked by copied content and/or slight keyword variations.
● Copied message boards with no other page content.
● Fake search pages with PPC ads.
● Fake blogs with PPC ads, identified by copied/scraped or nonsensical spun content.
● Thin affiliate sites that only exist to make money, identified by checkout on a different domain, image properties showing origination at another URL, lack of original content, different WhoIs registrants of the two domains in question.
● Pure PPC pages with little to no content.
● Parked domains” Miranda Miller, SEW, 2011
Onsite, consider linking to your other pages from within your main content text. I usually only do this when it is relevant – often, I’ll link to relevant pages when the keyword is in the title elements of both pages. I don’t go in for auto-generating links at all. Google has penalised sites for using particular auto-link plugins, for instance, so I avoid them.

QUOTE: “While, as a whole, web usability has improved over these past several years, history repeats and designers make the same mistakes over and over again. Designers and marketers continuously need to walk a line between providing a good user experience and increasing advertising revenue. There is no “correct” answer or golden format for designers to use in order to flawlessly reach audiences; there will inevitably always be resistance to change and a desire for convention and predictability. That said, if, over the course of over ten years, users are still lamenting about the same problems, it’s time we start to take them seriously.”  Therese Fessenden, Nielsen Norman Group 2017


SEO is not an appropriate strategy for every website, and other Internet marketing strategies can be more effective, such as paid advertising through pay-per-click (PPC) campaigns, depending on the site operator's goals. Search engine marketing (SEM) is the practice of designing, running and optimizing search engine ad campaigns.[56] Its difference from SEO is most simply depicted as the difference between paid and unpaid priority ranking in search results. SEM focuses on prominence more than relevance; website developers should regard SEM as highly important for visibility, because most users navigate to the primary listings of their search results.[57] A successful Internet marketing campaign may also depend upon building high-quality web pages to engage and persuade, setting up analytics programs to enable site owners to measure results, and improving a site's conversion rate.[58] In November 2015, Google released a full 160-page version of its Search Quality Rating Guidelines to the public,[59] which revealed a shift in its focus towards "usefulness" and mobile search. In recent years the mobile market has exploded, overtaking the use of desktops, as shown by StatCounter, which in October 2016 analyzed 2.5 million websites and found that 51.3% of pages were loaded by a mobile device.[60] Google has been one of the companies taking advantage of the popularity of mobile usage by encouraging websites to use its Google Search Console and Mobile-Friendly Test, which allow companies to check how their website performs in search results and how user-friendly it is.
Many blogging software packages automatically nofollow user comments, but those that don't can most likely be manually edited to do this. This advice also goes for other areas of your site that may involve user-generated content, such as guest books, forums, shout-boards, referrer listings, etc. If you're willing to vouch for links added by third parties (for example, if a commenter is trusted on your site), then there's no need to use nofollow on links; however, linking to sites that Google considers spammy can affect the reputation of your own site. The Webmaster Help Center has more tips on avoiding comment spam, for example by using CAPTCHAs and turning on comment moderation.
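As a minimal sketch of what "nofollowing" a user-submitted link means in practice, a comment template might render untrusted links with the standard rel="nofollow" attribute. The function name and trust flag here are hypothetical; only the rel attribute itself comes from the guidance above:

```python
# Sketch: render a commenter's link with rel="nofollow" so search
# engines know the site is not vouching for the destination.
from html import escape


def render_comment_link(url: str, text: str, trusted: bool = False) -> str:
    """Return an anchor tag; untrusted (user-submitted) links get nofollow."""
    rel = "" if trusted else ' rel="nofollow"'
    return f'<a href="{escape(url, quote=True)}"{rel}>{escape(text)}</a>'


print(render_comment_link("https://example.com", "my site"))
# <a href="https://example.com" rel="nofollow">my site</a>
```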

This helpful tool scans your backlink profile and turns up a list of contact information for the links and domains you'll need to reach out to for removal. Alternatively, the tool also allows you to export the list if you wish to disavow the links using Google's disavow tool. (Essentially, this tells Google not to take these links into account when assessing your site.)
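The disavow file Google's tool accepts is just plain text: one "domain:" entry or full URL per line, with "#" lines as comments. A minimal sketch of turning an exported list into that format follows; the domain names and file name are hypothetical placeholders:

```python
# Sketch: write an exported list of bad linking domains/URLs into the
# plain-text format Google's disavow tool accepts. Names are placeholders.
bad_domains = ["spammy-links.example", "link-farm.example"]
bad_urls = ["https://blog.example/paid-links.html"]

with open("disavow.txt", "w") as f:
    f.write("# Links we asked to have removed without success\n")
    for domain in bad_domains:
        f.write(f"domain:{domain}\n")  # disavow a whole domain
    for url in bad_urls:
        f.write(url + "\n")  # disavow a single URL
```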

You might have heard about Google Panda and Google Penguin – two major updates to Google’s search algorithm. Google also makes minor changes to its algorithm 500-600 times a year. We update the Yoast SEO plugin every 2 weeks. That way you’re sure that your website is optimized for Google’s most recent version of its algorithm. Updates and upgrades on our Premium plugin are only available with a valid subscription.
I’ve got by by thinking external links to other sites should probably be on single pages deeper in your site architecture, with those pages receiving all your Google Juice once it’s been “soaked up” by the higher pages in your site structure (the home page, your category pages). This tactic is old school but I still follow it. I don’t think you need to worry about that too much in 2020.
Google is looking for a “website that is well cared for and maintained”, so you need to keep content management systems updated and check for broken image links and HTML links. If you create a frustrating user experience through sloppy website maintenance, expect that to be reflected in some way with a lower quality rating. The Google Panda update of October 2014 went after e-commerce pages that were optimised ‘the old way’ and are now classed as ‘thin content’.
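A minimal sketch of the kind of maintenance check this implies: request each URL on a list and flag broken responses. The URL list is a placeholder; in practice you would feed in your own sitemap or crawl output:

```python
# Minimal broken-link check: request each URL and report its status.
# The URL list is a placeholder for your own pages and linked resources.
from urllib.error import HTTPError, URLError
from urllib.request import Request, urlopen

urls = ["https://example.com/", "https://example.com/missing-page"]

for url in urls:
    try:
        status = urlopen(Request(url, method="HEAD"), timeout=10).status
    except HTTPError as err:
        status = err.code  # e.g. 404 for a broken link
    except URLError as err:
        status = f"unreachable ({err.reason})"
    print(status, url)
```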

QUOTE: “So sites that don’t have much content “above-the-fold” can be affected by this change. If you click on a website and the part of the website you see first either doesn’t have a lot of visible content above-the-fold or dedicates a large fraction of the site’s initial screen real estate to ads, that’s not a very good user experience.” Google 2012
QUOTE: “Think about whether or not this is a link that would be on your site if it weren’t for your actions…When it comes to guest blogging it’s a situation where you are placing links on other people’s sites together with this content, so that’s something I kind of shy away from purely from a link building point of view. It can make sense to guest blog on other people’s sites to drive some traffic to your site… but you should use a nofollow.” John Mueller, Google 2013
In the free version, you can only set 1 focus keyword or keyphrase. That means you’ll have to choose what exactly to optimize for. In Yoast SEO Premium, you have the option to add more related keywords. This means that when you want to optimize for pen, you can also optimize for paper in the same text. You will be able to reach a broader audience because of it. In both versions, Yoast SEO doesn’t only help you boost your ranking in the search results, but also your morale!
Google ranks websites (relevancy aside for a moment) by the number and quality of incoming links to a site from other websites (amongst hundreds of other metrics). Generally speaking, a link from one page to another is viewed in Google’s “eyes” as a vote for the page the link points to. The more votes a page gets, the more trusted a page can become, and the higher Google will rank it – in theory. Rankings are HUGELY affected by how much Google ultimately trusts the DOMAIN the page is on. BACKLINKS (links from other websites) trump every other signal.
Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (for example, searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn't feasible. In this case, you could automatically generate description meta tags based on each page's content.
Write a description that would both inform and interest users if they saw your description meta tag as a snippet in a search result. While there's no minimum or maximum length for the text in a description meta tag, we recommend making sure that it's long enough to be fully shown in Search (note that users may see different sized snippets depending on how and where they search), and contains all the relevant information users would need to determine whether the page will be useful and relevant to them.
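A minimal sketch of the automatic generation mentioned above, trimming each page's body text to a snippet-sized description. The 155-character cap is a rough, commonly cited assumption about how much of a snippet tends to be displayed, not an official Google limit:

```python
# Sketch: auto-generate a description meta tag from page content.
# The 155-character cap is an assumption, not an official limit.
import textwrap
from html import escape


def meta_description(body_text: str, limit: int = 155) -> str:
    summary = textwrap.shorten(" ".join(body_text.split()), width=limit,
                               placeholder="…")
    return f'<meta name="description" content="{escape(summary, quote=True)}">'


page = ("Our store sells handmade oak furniture, built to order and "
        "shipped nationwide. Browse dining tables, chairs, and more.")
print(meta_description(page))
```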
When using the Keyword Explorer, Ahrefs will also produce the "parent topic" of the keyword you looked up, shown underneath the Keyword Difficulty meter. A keyword's parent topic is a broader keyword with higher search volume than your intended keyword, but it likely has the same audience and ranking potential -- giving you a more valuable SEO opportunity when optimizing a particular blog post or webpage.
An SEO technique is considered white hat if it conforms to the search engines' guidelines and involves no deception. As the search engine guidelines[18][19][52] are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines but is about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see. White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the online "spider" algorithms, rather than attempting to trick the algorithm from its intended purpose. White hat SEO is in many ways similar to web development that promotes accessibility,[53] although the two are not identical.
Google, in many instances, would rather send long-tail search traffic, like users using mobile VOICE SEARCH, for instance, to high-quality pages ABOUT a concept/topic that explains relationships and connections between relevant sub-topics FIRST, rather than to only send that traffic to low-quality pages just because they have the exact phrase on the page.
Keyword analysis. From your nominations, further identify a targeted list of keywords and phrases. Review competitive lists and other pertinent industry sources. Use your preliminary list to determine an indicative number of recent search engine queries and how many websites are competing for each keyword. Prioritize keywords and phrases, plurals, singulars and misspellings. (If search users commonly misspell a keyword, you should identify and use it.) Please note that Google will try to correct the term when searching, so use this with care.
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[47]
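Python's standard library can replicate the crawler's side of that parse step. A minimal sketch, with the robots.txt rules and paths below being assumed examples:

```python
# Sketch: how a crawler interprets robots.txt, using the standard
# library parser. The rules and paths below are assumed examples.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /cart/
Disallow: /search
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

for path in ("/products/oak-table", "/cart/checkout", "/search?q=tables"):
    allowed = parser.can_fetch("*", f"https://example.com{path}")
    print(f"{path}: {'crawl' if allowed else 'do not crawl'}")
```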
QUOTE: “Anytime you do a bigger change on your website, if you redirect a lot of URLs or if you go from one domain to another or if you change your site’s structure, then all of that does take time for things to settle down. We can follow that pretty quickly, we can definitely forward the signals there, but that doesn’t mean that’ll happen from one day to the next.” John Mueller, Google 2016
Ask for a technical and search audit for your site to learn what they think needs to be done, why, and what the expected outcome should be. You'll probably have to pay for this. You will probably have to give them read-only access to your site on Search Console. (At this stage, don't grant them write access.) Your prospective SEO should be able to give you realistic estimates of improvement, and an estimate of the work involved. If they guarantee you that their changes will give you first place in search results, find someone else.
Page and Brin founded Google in 1998.[23] Google attracted a loyal following among the growing number of Internet users, who liked its simple design.[24] Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.[25]
An authority website is a site that is trusted by its users, the industry it operates in, other websites and search engines. Traditionally a link from an authority website is very valuable, as it’s seen as a vote of confidence. The more of these you have, and the higher quality content you produce, the more likely your own site will become an authority too.
Another reason is that if you're using an image as a link, the alt text for that image will be treated similarly to the anchor text of a text link. However, we don't recommend using too many images for links in your site's navigation when text links could serve the same purpose. Lastly, optimizing your image filenames and alt text makes it easier for image search projects like Google Image Search to better understand your images.
QUOTE: “Supplementary Content contributes to a good user experience on the page, but does not directly help the page achieve its purpose. SC is controlled by webmasters and is an important part of the user experience. One common type of SC is navigation links that allow users to visit other parts of the website. Note that in some cases, content behind tabs may be considered part of the SC of the page. Sometimes the easiest way to identify SC is to look for the parts of the page that are not MC or Ads.” Google Search Quality Evaluator Guidelines 2019
QUOTE:  “Tell visitors clearly that the page they’re looking for can’t be found. Use language that is friendly and inviting. Make sure your 404 page uses the same look and feel (including navigation) as the rest of your site. Consider adding links to your most popular articles or posts, as well as a link to your site’s home page. Think about providing a way for users to report a broken link. No matter how beautiful and useful your custom 404 page, you probably don’t want it to appear in Google search results. In order to prevent 404 pages from being indexed by Google and other search engines, make sure that your webserver returns an actual 404 HTTP status code when a missing page is requested.” Google, 2018
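A minimal sketch of the last point in that quote: serve a friendly "not found" page while still returning a genuine 404 status code, so search engines do not index the error page. This uses Python's standard http.server purely for illustration; the page content is a placeholder:

```python
# Sketch: serve a custom "not found" page while returning a real
# 404 HTTP status, so the error page is not indexed. Content is a placeholder.
from http.server import BaseHTTPRequestHandler, HTTPServer

PAGES = {"/": b"<h1>Home</h1>"}  # placeholder site content

NOT_FOUND = (b"<h1>Sorry, we can't find that page.</h1>"
             b'<p>Try our <a href="/">home page</a>.</p>')


class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = PAGES.get(self.path)
        self.send_response(200 if body else 404)  # genuine 404 status code
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body or NOT_FOUND)


if __name__ == "__main__":
    HTTPServer(("localhost", 8000), Handler).serve_forever()
```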
In Yoast SEO you can enter the keyword or keyphrase you’d like your post or page to rank for in the search results. We then run a check on your content to check whether you’re using the keywords often enough – but not too often – and in the right spots. In Yoast SEO Premium, we take this one major step further. We will not only check for the keyword or keyphrase you enter, but we will analyze your text on all variations of that word or phrase. If the keyphrase is “room decorating ideas” and the synonym is “apartment decorating ideas” all the possible word forms of each word are taken into account: ‘apartments’, ‘rooms’, ‘room’s’, ‘decorate’, ‘decorates’, ‘decorated’, ‘idea’, etc. This way, you are truly optimizing your text. With this addition in Yoast SEO Premium, we can guarantee that no other WordPress plugin comes closer to how Google will see your text*! * We are constantly working on perfecting this for all available languages