In December 2009, Google announced it would be using the web search history of all its users to populate search results.[33] On June 8, 2010, a new web indexing system called Google Caffeine was announced. Designed to let users find news results, forum posts and other content much sooner after publication, Caffeine changed the way Google updated its index so that new content showed up on Google faster than before. According to Carrie Grimes, the software engineer who announced Caffeine for Google, "Caffeine provides 50 percent fresher results for web searches than our last index..."[34] Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically, site administrators have spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.[35]
QUOTE: “If you want to stop spam, the most straight forward way to do it is to deny people money because they care about the money and that should be their end goal. But if you really want to stop spam, it is a little bit mean, but what you want to do, is sort of break their spirits. There are lots of Google algorithms specifically designed to frustrate spammers. Some of the things we do is give people a hint their site will drop and then a week or two later, their site actually does drop. So they get a little bit more frustrated. So hopefully, and we’ve seen this happen, people step away from the dark side and say, you know what, that was so much pain and anguish and frustration, let’s just stay on the high road from now on.” Matt Cutts, Google 2013
QUOTE: “I’ve got a slide here where I show I think 8 different URLs you know every single one of these URLs could return completely different content in practice we as humans whenever we look at ‘www.example.com’ or just regular ‘example.com’ or example.com/index or example.com/home.asp we think of it as the same page and in practice it usually is the same page so technically it doesn’t have to be but almost always web servers will return the same content for like these 8 different versions of the URL so that can cause a lot of problems in search engines if rather than having your backlinks all go to one page instead it’s split between (the versions) and it’s a really big headache….how do people fix this well …. the canonical link element” Matt Cutts, Google 2009
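The duplicate-URL problem Cutts describes can be sketched in code. Below is a minimal, hypothetical normaliser, not Google's actual logic: the rules (force https, strip "www.", drop default documents) and the example.com URLs are assumptions for illustration only.

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical set of "default document" filenames treated as the site root.
DEFAULT_DOCS = {"index", "index.html", "index.htm", "home.asp", "default.asp"}

def canonical_url(url: str) -> str:
    """Map common URL variants of the same page to one canonical form."""
    scheme, netloc, path, query, _ = urlsplit(url)
    netloc = netloc.lower()
    if netloc.startswith("www."):
        netloc = netloc[4:]
    # Treat default documents (e.g. /index.html, /home.asp) as the root.
    last = path.rstrip("/").rsplit("/", 1)[-1].lower()
    if last in DEFAULT_DOCS:
        path = path[: len(path) - len(last)]
    path = path.rstrip("/") or "/"
    return urlunsplit(("https", netloc, path, query, ""))

# The kind of variant list from the quote: all collapse to one URL.
variants = [
    "http://www.example.com",
    "https://example.com/",
    "http://example.com/index.html",
    "https://www.example.com/home.asp",
]
assert len({canonical_url(u) for u in variants}) == 1
```

In practice you declare the preferred version with the canonical link element rather than normalising in code, but the sketch shows why the eight-URLs-one-page situation splits backlink equity if left alone.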
Ensure redirected domains redirect through a canonical redirect, and keep any redirect chains to a minimum. BE SURE, though, to audit the backlink profile of any redirects you point at a page: with reward comes punishment if those backlinks are toxic (another example of Google opening up a front in the technical SEO war that isn't about building backlinks to your site, and is in fact the converse).
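Auditing chain length is straightforward once you have a crawl of your redirects. A minimal sketch, assuming you already hold the crawl as a URL-to-target mapping (the function name, data and domains are hypothetical):

```python
def redirect_chain(url, redirects, max_hops=10):
    """Follow a URL -> target mapping and return the full redirect chain."""
    chain = [url]
    seen = {url}
    while chain[-1] in redirects and len(chain) <= max_hops:
        nxt = redirects[chain[-1]]
        if nxt in seen:  # guard against redirect loops
            break
        chain.append(nxt)
        seen.add(nxt)
    return chain

# Hypothetical crawl data: the old domain hops twice before the final page.
redirects = {
    "http://old-domain.com/page": "https://old-domain.com/page",
    "https://old-domain.com/page": "https://new-domain.com/page",
}
chain = redirect_chain("http://old-domain.com/page", redirects)
# Three entries means two hops - a chain worth collapsing to one redirect.
assert len(chain) == 3
```

Any chain longer than two entries (one hop) is a candidate for collapsing into a single canonical redirect.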
Link text is the visible text inside a link. This text tells users and Google something about the page you're linking to. Links on your page may be internal—pointing to other pages on your site—or external—leading to content on other sites. In either of these cases, the better your anchor text is, the easier it is for users to navigate and for Google to understand what the page you're linking to is about.
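A quick audit of anchor text and internal-versus-external links can be done with the standard library alone. This is a simple sketch (the class name and sample HTML are made up for illustration):

```python
from html.parser import HTMLParser
from urllib.parse import urlsplit

class AnchorAudit(HTMLParser):
    """Collect (anchor text, href, internal/external) for every link."""
    def __init__(self, site_host):
        super().__init__()
        self.site_host = site_host
        self.links = []
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href", "")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            host = urlsplit(self._href).netloc
            kind = "internal" if host in ("", self.site_host) else "external"
            self.links.append(("".join(self._text).strip(), self._href, kind))
            self._href = None

audit = AnchorAudit("example.com")
audit.feed('<a href="/pricing">See our pricing</a> '
           '<a href="https://other.site/">click here</a>')
```

Running this flags the vague "click here" anchor alongside the descriptive "See our pricing" one, which is exactly the distinction the paragraph above is about.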
For instance, in a recent test (2019), if a page title was longer than 12 words, all the keywords beyond the twelfth evaporated from the page copy. This is a change from the way Google used to work, when the extra words were still counted as part of the page copy, not just part of the title. So, if you have a 15-word title, the last 3 words will not count towards ranking, if that test result were to be replicated.
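The 12-word cut-off from that test is easy to model. A tiny sketch (the limit reflects that single 2019 test, not any documented Google rule, and the sample title is invented):

```python
def counted_title_words(title: str, limit: int = 12) -> list:
    """Return the title words that, per that 2019 test, would still
    count towards ranking; anything beyond `limit` words is dropped."""
    return title.split()[:limit]

# A hypothetical 15-word title: the last three words would not count.
title = ("Best Cheap Family Hotels Near The Beach With Free "
         "Parking Free Breakfast Pool And Wifi")
kept = counted_title_words(title)
assert len(kept) == 12
assert kept[-1] == "Breakfast"  # "Pool", "And", "Wifi" fall off the end
```

The practical takeaway is simply to front-load important keywords in long titles.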
Your website can show the same content on various URLs, which might confuse Google: this is called a duplicate content issue. Yoast SEO solves this by letting you indicate one URL as the original one – what techies like to call a canonical link. That way, Yoast SEO makes sure that your content is always found under the URL you want it to be found under. Simple.

If you are improving user experience by focusing primarily on the quality of the main content (MC) of your pages – and avoiding, or even removing, old-school SEO techniques – those are certainly positive steps towards getting more traffic from Google in 2020. In the end, the type of content performance Google rewards is largely about a satisfying user experience.



QUOTE: “The easiest thing is think about what people are gonna type to want to find your site so we had somebody come to a conference and say I want to show up for (Keyword) and we said well have you put the (Keyword) on your page anywhere at all and he hadn’t and sometimes just very simple stuff like that really helps think about what users are gonna type to find your page and then just make sure those keywords are on the page.” Matt Cutts, Google 2013
A page title that is highly relevant to the page it refers to will maximise usability, search engine ranking performance and user experience ratings, as Google measures these. It will probably be displayed in a web browser’s window title bar, in bookmarks and in the clickable search snippet links used by Google, Bing and other search engines. The title element is the “crown” of a web page, with the important keyword phrase featuring AT LEAST ONCE within it.
I used to think it could take more to get a subfolder trusted than, say, an individual file, and I guess this swayed me to use files on most websites I created (back in the day). Once subfolders are trusted, it’s six of one, half a dozen of the other what the actual difference is in terms of ranking in Google – usually, rankings in Google are determined more by how RELEVANT or REPUTABLE a page is to a query.
When you write a page title, you have a chance right at the beginning of the page to tell Google (and other search engines) if this is a spam site or a quality site – such as – have you repeated the keyword four times or only once? I think title tags, like everything else, should probably be as simple as possible, with the keyword once and perhaps a related term if possible.
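A simple whole-word count makes the once-versus-four-times check concrete. This is just an illustrative helper (the example titles and business names are made up):

```python
import re

def keyword_count(title: str, keyword: str) -> int:
    """Count whole-word, case-insensitive occurrences of a keyword."""
    return len(re.findall(rf"\b{re.escape(keyword)}\b", title, re.IGNORECASE))

# One mention reads naturally; heavy repetition is the pattern to avoid.
assert keyword_count("Plumber in Leeds | Smith & Sons", "plumber") == 1
assert keyword_count(
    "Plumber Leeds | Leeds Plumber | Best Plumber", "plumber") == 3
```

Note the word-boundary match means "Plumbers" would not be counted as "plumber"; loosen the pattern if you want stemmed variants included.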
If Google finds two identical pieces of content, whether on your own site or on another you’re not even aware of, it will only index one of those pages. You should be aware of scraper sites that steal your content automatically and republish it as their own. Here’s Graham Charlton’s thorough investigation on what to do if your content ends up working better for somebody else.
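One cheap way to spot identical copies across a crawl is to fingerprint each page's text. A minimal sketch, assuming you already have the extracted text (the normalisation rule here, collapsing case and whitespace, is a simplification of real duplicate detection):

```python
import hashlib

def content_fingerprint(text: str) -> str:
    """Hash normalised text so trivially reformatted copies still match."""
    normalised = " ".join(text.lower().split())
    return hashlib.sha256(normalised.encode("utf-8")).hexdigest()

original = "Ten tips for better page titles."
scraped  = "  Ten  tips for\nbetter page titles. "
# Identical content, different whitespace: same fingerprint.
assert content_fingerprint(original) == content_fingerprint(scraped)
```

Real scrapers usually alter the text slightly, so production systems use near-duplicate techniques (shingling, simhash) rather than exact hashes, but the exact-match version already catches verbatim republishing.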
In 2007, Google announced a campaign against paid links that transfer PageRank.[30] On June 15, 2009, Google disclosed that it had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollowed links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting.[31] As a result of this change, the use of nofollow led to the evaporation of PageRank. To avoid this, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting. Additionally, several solutions have been suggested that include the use of iframes, Flash and JavaScript.[32]
If you have original, quality content on a site, you also have a chance of generating inbound quality links (IBL). If your content is found on other websites, you will find it hard to get links, and it probably will not rank very well as Google favours diversity in its results. If you have original content of sufficient quality on your site, you can then let authority websites – those with online business authority – know about it, and they might link to you – this is called a quality backlink.
QUOTE: “What makes a page spammy?: “Hidden text or links – may be exposed by selecting all page text and scrolling to the bottom (all text is highlighted), disabling CSS/Javascript, or viewing source code. Sneaky redirects – redirecting through several URLs, rotating destination domains cloaking with JavaScript redirects and 100% frame. Keyword stuffing – no percentage or keyword density given; this is up to the rater. PPC ads that only serve to make money, not help users. Copied/scraped content and PPC ads. Feeds with PPC ads. Doorway pages – multiple landing pages that all direct user to the same destination. Templates and other computer-generated pages mass-produced, marked by copied content and/or slight keyword variations. Copied message boards with no other page content. Fake search pages with PPC ads. Fake blogs with PPC ads, identified by copied/scraped or nonsensical spun content. Thin affiliate sites that only exist to make money, identified by checkout on a different domain, image properties showing origination at another URL, lack of original content, different WhoIs registrants of the two domains in question. Pure PPC pages with little to no content. Parked domains” Miranda Miller, SEW, 2011

Check what your page will look like if you share it on Facebook or Twitter. Does the title fit, and does the image match? Will it stand out? If you want people to click, your social media shares have to look the part. If you don’t want to think about that for every single post, just set a template. The Yoast SEO WordPress plugin allows you to set these for all types of pages.
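Such a template boils down to emitting standard Open Graph and Twitter Card meta tags with sensible defaults. A hypothetical sketch (the property names are the standard keys, but the function, defaults and fallback image path are invented for illustration):

```python
def share_tags(title, image, description,
               fallback_image="/img/default-share.png"):
    """Render social share meta tags, falling back to a default image."""
    image = image or fallback_image
    return "\n".join([
        f'<meta property="og:title" content="{title}">',
        f'<meta property="og:description" content="{description}">',
        f'<meta property="og:image" content="{image}">',
        '<meta name="twitter:card" content="summary_large_image">',
    ])

# A post with no share image set picks up the site-wide fallback.
print(share_tags("My Post", None, "A short teaser."))
```

This mirrors what plugins like Yoast SEO do under the hood: per-post values where you set them, a site-wide template where you don't.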
There are other parts of SEO which you should pay attention to after your audit to make sure you stay competitive. After all, the technical foundation isn't the end of the road for SEO success. It's important to pay attention to your competition's SEO activity, keep an eye on the newest search engine best practices, and maintain local SEO best practices if your business depends on customers visiting a physical address. All of these are elements of a successful SEO strategy and should be corollary to your audit and ongoing SEO maintenance.