If you link out to irrelevant sites, Google may ignore the page, too – but again, it depends on the site in question. Who you link to, and HOW you link, REALLY DOES MATTER – I expect Google to use your linking practices as a potential means by which to classify your site. Affiliate sites, for example, don’t do well in Google these days without some good-quality backlinks and higher-quality pages.
QUOTE: “Returning a code other than 404 or 410 for a non-existent page (or redirecting users to another page, such as the homepage, instead of returning a 404) can be problematic. Firstly, it tells search engines that there’s a real page at that URL. As a result, that URL may be crawled and its content indexed. Because of the time Googlebot spends on non-existent pages, your unique URLs may not be discovered as quickly or visited as frequently and your site’s crawl coverage may be impacted (also, you probably don’t want your site to rank well for the search query […])” GOOGLE
QUOTE: “Many SEOs and other agencies and consultants provide useful services for website owners, including: Review of your site content or structure – Technical advice on website development: for example, hosting, redirects, error pages, use of JavaScript – Content development – Management of online business development campaigns – Keyword research – SEO training – Expertise in specific markets and geographies.” Google Webmaster Guidelines, 2020

Many think that Google won’t allow new websites to rank well for competitive terms until the web address “ages” and acquires “trust” in Google – I think this depends on the quality of the incoming links. Sometimes your site will rank high for a while and then disappear for months. A “honeymoon period” to give you a taste of Google traffic, perhaps, or a period to better gauge your website quality from an actual user perspective.
Today, however, SEM is used to refer exclusively to paid search. According to Search Engine Land, Search Engine Marketing is “the process of gaining website traffic by purchasing ads on search engines.” Search Engine Optimization, on the other hand, is defined as “the process of getting traffic from free, organic, editorial or natural search results.”
QUOTE: “What makes a page spammy?: “Hidden text or links – may be exposed by selecting all page text and scrolling to the bottom (all text is highlighted), disabling CSS/Javascript, or viewing source code. Sneaky redirects – redirecting through several URLs, rotating destination domains cloaking with JavaScript redirects and 100% frame. Keyword stuffing – no percentage or keyword density given; this is up to the rater. PPC ads that only serve to make money, not help users. Copied/scraped content and PPC ads. Feeds with PPC ads. Doorway pages – multiple landing pages that all direct user to the same destination. Templates and other computer-generated pages mass-produced, marked by copied content and/or slight keyword variations. Copied message boards with no other page content. Fake search pages with PPC ads. Fake blogs with PPC ads, identified by copied/scraped or nonsensical spun content. Thin affiliate sites that only exist to make money, identified by checkout on a different domain, image properties showing origination at another URL, lack of original content, different WhoIs registrants of the two domains in question. Pure PPC pages with little to no content. Parked domains” Miranda Miller, SEW, 2011
If a PARTICULAR CANONICAL HEAD KEYWORD is IMPORTANT (even perhaps a SYNONYM or LONG TAIL VARIANT) and I think a particular 301 REDIRECT has some positive impact on how Google judges the quality or relevance of the page, I will make sure the CANONICAL HEAD KEYWORD and SYNONYMS are on the FINAL PAGE I redirect Google to (which is the one that will be rated and cached).
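The idea above can be sketched as a simple server-side rule – a hypothetical Apache .htaccess example, with illustrative paths, assuming the final destination page already contains the head keyword and its synonyms:

```apacheconf
# Hypothetical example: a 301 (permanent) redirect from a retired URL to
# the final destination page - the page Google will actually rate and cache.
# Both paths are illustrative, not taken from the text above.
Redirect 301 /old-seo-tips/ https://www.example.com/seo-tutorial/
```

The 301 passes signals to the destination, so the destination page – not the redirecting URL – is where the keyword relevance needs to live.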
A poor 404 page, and user interaction with it, can only lead to a ‘poor user experience’ signal at Google’s end, for a number of reasons. I will highlight a poor 404 page in my audits and actually programmatically look for signs of this issue when I scan a site. I don’t know if Google looks at your site that way to rate it e.g. algorithmically determines if you have a good 404 page – or if it is a UX factor, something to be taken into consideration further down the line – or purely a prompt to get you thinking about 404 pages (in general) to help prevent Google wasting resources indexing crud pages and presenting poor results to searchers. I think, rather, that any rating would be a second-order scoring that includes data from user activity on the SERPs – stuff we as SEOs can’t see.
QUOTE: “One of the difficulties of running a great website that focuses on UGC is keeping the overall quality upright. Without some level of policing and evaluating the content, most sites are overrun by spam and low-quality content. This is less of a technical issue than a general quality one, and in my opinion, not something that’s limited to Google’s algorithms. If you want to create a fantastic experience for everyone who visits, if you focus on content created by users, then you generally need to provide some guidance towards what you consider to be important (and sometimes, strict control when it comes to those who abuse your house rules).”
Another reason is that if you're using an image as a link, the alt text for that image will be treated similarly to the anchor text of a text link. However, we don't recommend using too many images for links in your site's navigation when text links could serve the same purpose. Lastly, optimizing your image filenames and alt text makes it easier for image search projects like Google Image Search to better understand your images.
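As a minimal sketch of the point about image links (the URL and filename below are illustrative):

```html
<!-- An image used as a link: the alt text is treated much like
     the anchor text of an equivalent text link -->
<a href="/seo-tutorial/">
  <img src="/images/seo-tutorial-cover.jpg"
       alt="Beginner's SEO tutorial">
</a>
```

A descriptive filename (seo-tutorial-cover.jpg rather than IMG_0042.jpg) reinforces the same signal for image search.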
Google will select the best title it wants for your search snippet – and it will take that information from multiple sources, NOT just your page title element. A short title is often appended with more information about the domain. Sometimes, if Google is confident in the BRAND name, it will use that instead (often adding it to the beginning of your title with a colon, or sometimes appending the end of your snippet title with the actual domain address the page belongs to).
If you are using Responsive Web Design, use the meta name="viewport" tag to tell the browser how to adjust the content. If you use Dynamic Serving, use the Vary HTTP header to signal that your content changes depending on the user agent. If you are using separate URLs, signal the relationship between the two URLs with rel="canonical" and rel="alternate" link elements.
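The three configurations above can be sketched as follows – all URLs are illustrative:

```html
<!-- 1. Responsive Web Design: one URL, content adapts to the screen -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<!-- 2. Dynamic Serving: same URL, different HTML per device - send the
     HTTP response header:  Vary: User-Agent  -->

<!-- 3. Separate URLs - on the desktop page: -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page">
<!-- ...and on the mobile page: -->
<link rel="canonical" href="https://www.example.com/page">
```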
QUOTE: “There’s probably always gonna be a little bit of room for keyword research because you’re kind of providing those words to users. And even if search engines are trying to understand more than just those words, showing specific words to users can make it a little bit easier for them to understand what your pages are about and can sometimes drive a little bit of that conversion process.  So I don’t see these things going away completely but I’m sure search engines will get better over time to understand more than just the words on a page.” John Mueller, Google 2020
QUOTE:  “Tell visitors clearly that the page they’re looking for can’t be found. Use language that is friendly and inviting. Make sure your 404 page uses the same look and feel (including navigation) as the rest of your site. Consider adding links to your most popular articles or posts, as well as a link to your site’s home page. Think about providing a way for users to report a broken link. No matter how beautiful and useful your custom 404 page, you probably don’t want it to appear in Google search results. In order to prevent 404 pages from being indexed by Google and other search engines, make sure that your webserver returns an actual 404 HTTP status code when a missing page is requested.” Google, 2018
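In practical terms, a custom 404 page only helps if the server still returns a real 404 status code. A hypothetical Apache configuration sketch (the file path is illustrative):

```apacheconf
# Serve a friendly, branded error page, but keep the genuine 404 status
# so search engines do not index the missing URL as a 'soft 404'.
ErrorDocument 404 /404.html
```

You can verify the behaviour with something like `curl -I https://www.example.com/no-such-page` and checking that the first response line reads `404 Not Found` rather than `200 OK` or a `301`/`302` redirect to the homepage.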

QUOTE: “If you have a manual action against your site for unnatural links to your site, or if you think you’re about to get such a manual action (because of paid links or other link schemes that violate our quality guidelines), you should try to remove those links from the other site. If you can’t get these links removed, then you should disavow those links to your website.“ Google Webmaster Guidelines 2020


A satisfying UX can help your rankings, with second-order factors taken into consideration. A poor UX can seriously impact your human-reviewed rating, at least. Google’s punishing algorithms probably class pages as something akin to a poor UX if they meet certain detectable criteria e.g. a lack of reputation or old-school SEO tactics like keyword stuffing a site.
Being ‘relevant’ comes down to keywords & key phrases – in domain names, URLs, Title Elements, the number of times they are repeated in text on the page, text in image alt tags, rich markup and, importantly, in keyword links to the page in question. If you are relying on manipulating hidden elements on a page to do well in Google, you’ll probably trigger spam filters. If it is ‘hidden’ in on-page elements – beware of relying on it too much to improve your rankings.
QUOTE: “Expertise, Authoritativeness, Trustworthiness: This is an important quality characteristic. …. Remember that the first step of PQ rating is to understand the true purpose of the page. Websites or pages without some sort of beneficial purpose, including pages that are created with no attempt to help users, or pages that potentially spread hate, cause harm, or misinform or deceive users, should receive the Lowest rating. For all other pages that have a beneficial purpose, the amount of expertise, authoritativeness, and trustworthiness (E-A-T) is very important..” Google Search Quality Evaluator Guidelines 2019

I do not obsess about site architecture as much as I used to… but I always ensure the pages I want indexed are all available from a crawl from the home page – and I still emphasise important pages by linking to them where relevant. I always aim to get THE most important exact-match anchor text pointing to the page from internal links – but I avoid abusing internals and avoid overtly manipulative internal links that are not grammatically correct, for instance.
You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine's search results. If you do want to prevent search engines from crawling your pages, Google Search Console has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest this Webmaster Help Center guide on using robots.txt files.
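A minimal robots.txt sketch, with illustrative paths – remember it must live at the root of each hostname, so a subdomain such as m.example.com needs its own file:

```
# https://www.example.com/robots.txt (paths are illustrative)
User-agent: *
Disallow: /search/
Disallow: /tmp/

Sitemap: https://www.example.com/sitemap.xml
```

Note that robots.txt only controls crawling, not indexing – a disallowed URL can still appear in search results if it is linked to from elsewhere.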