QUOTE: “So if you have different parts of your website and they’re on different subdomains, that’s perfectly fine; that’s totally up to you, and the way people link across these different subdomains is really up to you. I guess one of the tricky aspects there is that we try to figure out what belongs to a website and to treat that more as a single website, and sometimes things on separate subdomains are like a single website and sometimes they’re more like separate websites. For example, on Blogger all of the subdomains are essentially completely separate websites; they’re not related to each other. On the other hand, other websites might have different subdomains and they just use them for different parts of the same thing, so maybe for different country versions, maybe for different language versions. All of that is completely normal.” John Mueller, Google 2017
Write a description that would both inform and interest users if they saw your description meta tag as a snippet in a search result. While there's no minimum or maximum length for the text in a description meta tag, we recommend making sure that it's long enough to be fully shown in Search (note that users may see different sized snippets depending on how and where they search), and contains all the relevant information users would need to determine whether the page will be useful and relevant to them.
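There is no official character limit, but a simple length check can flag descriptions likely to be cut off or too thin to be useful. The sketch below is a minimal, unofficial linter: the 50 and 160 character bounds are assumptions based on commonly observed snippet truncation, not figures Google publishes.

```python
import re

# Rough, unofficial bounds: Google states no fixed limit, but snippets are
# commonly truncated around 150-160 characters depending on device and query.
MIN_LEN, MAX_LEN = 50, 160

def check_description(description: str) -> list:
    """Return a list of warnings for the text of a description meta tag."""
    warnings = []
    text = re.sub(r"\s+", " ", description).strip()  # collapse whitespace
    if len(text) < MIN_LEN:
        warnings.append(f"Only {len(text)} chars: may be too thin to inform users.")
    if len(text) > MAX_LEN:
        warnings.append(f"{len(text)} chars: likely to be truncated in the snippet.")
    return warnings

print(check_description("Too short."))
```

Running this over a crawl of your own pages is one way to spot boilerplate or missing descriptions at scale.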
Ask for explanations if something is unclear. If an SEO creates deceptive or misleading content on your behalf, such as doorway pages or "throwaway" domains, your site could be removed entirely from Google's index. Ultimately, you are responsible for the actions of any companies you hire, so it's best to be sure you know exactly how they intend to "help" you. If an SEO has FTP access to your server, they should be willing to explain all the changes they are making to your site.
You don’t have to be better than your competition at absolutely everything, so long as you identify enough points to build a keyword strategy around. For smaller companies, this means that you probably have to be better at the things the bigger fish haven’t thought of or aren’t actively looking to do. If you can’t think of anything at all then you have a much bigger problem than just coming up with keywords…
If you are just starting out, don’t think you can fool Google about everything all the time. Google has VERY probably seen your tactics before. So, it’s best to keep your plan simple. GET RELEVANT. GET REPUTABLE. Aim for a healthy, satisfying visitor experience. If you are just starting out – you may as well learn how to do it within Google’s Webmaster Guidelines first. Make a decision, early, if you are going to follow Google’s guidelines, or not, and stick to it. Don’t be caught in the middle with an important project. Do not always follow the herd.

Think that, one day, your website will have to pass a manual review by ‘Google’ – the better the rankings you get, or the more traffic you get, the more likely you are to be reviewed. Know that Google classes even some useful sites as spammy, according to leaked documents. If you want a site to rank high in Google, it had better ‘do’ something other than exist only to link to another site for a paid commission. Know that to succeed, your website needs to be USEFUL to a visitor that Google will send you – and a useful website is not just a website with the sole commercial intent of sending a visitor from Google to another site – or a ‘thin affiliate’, as Google CLASSIFIES it.

QUOTE – “So it’s not that our systems will look at your site and say, oh, this was submitted by a user, therefore the site owner has no control over what’s happening here. Rather, we look at it and say, well, this is your website; this is what you want to have indexed; you kind of stand for the content that you’re providing there. So if you’re providing low quality user-generated content for indexing, then we’ll think, well, this website is about low quality content. And spelling errors don’t necessarily mean that it’s low quality, but obviously it can go in all kinds of weird directions with user-generated content…” John Mueller, Google 2019
Ego and assumptions led me to choose the wrong keywords for my own site. How did I spend three years optimizing my site and building links to finally crack the top three for six critical keywords, only to find out that I had wasted all that time? However, in spite of targeting the wrong words, Seer grew the business. In this presentation, Will shows you the mistakes made and shares with you the approaches that can help you build content that gets you thanked.

Google is all about ‘user experience’ and ‘visitor satisfaction’ in 2020, so it’s worth remembering that usability studies have shown that a good page title length is about seven or eight words and fewer than 64 total characters. Longer titles are less scannable in bookmark lists and might not display correctly in many browsers (and, of course, will probably be truncated in SERPs).
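The seven-or-eight-word, under-64-character guideline above is easy to turn into an automated check. This is a sketch of that rule only; the thresholds come from the usability studies cited in the text, not from Google, and the sample title is invented.

```python
def title_ok(title: str, max_words: int = 8, max_chars: int = 64) -> bool:
    """Check a page title against the rough usability guideline above:
    about seven or eight words, and fewer than 64 characters in total."""
    title = title.strip()
    return len(title.split()) <= max_words and len(title) < max_chars

# Hypothetical title: 8 words and 47 characters, so it passes both checks.
print(title_ok("Hobo SEO Guide: Page Title Length Best Practice"))  # True
```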
I added one keyword to the page in plain text because adding the actual ‘keyword phrase’ itself would have made my text read a bit keyword stuffed for other variations of the main term. It gets interesting if you do that to a lot of pages, and a lot of keyword phrases. The important thing is keyword research – and knowing which unique keywords to add.

If you are using Responsive Web Design, use the meta name="viewport" tag to tell the browser how to adjust the content. If you use Dynamic Serving, use the Vary HTTP header to signal that your response changes depending on the user-agent. If you are using separate URLs, signal the relationship between the two URLs by adding link tags with rel="canonical" and rel="alternate" elements.
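Two of those three signals live in the page's HTML head, so they are easy to audit programmatically. Below is a minimal sketch using Python's standard-library HTML parser to collect the viewport meta tag and any rel="canonical"/"alternate" link tags; the sample markup and the m.example.com URL are invented for illustration.

```python
from html.parser import HTMLParser

class MobileSignalParser(HTMLParser):
    """Collect the mobile-configuration signals mentioned above:
    a viewport meta tag, and rel="canonical"/"alternate" link tags."""
    def __init__(self):
        super().__init__()
        self.signals = {}

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name") == "viewport":
            self.signals["viewport"] = attrs.get("content", "")
        if tag == "link" and attrs.get("rel") in ("canonical", "alternate"):
            self.signals[attrs["rel"]] = attrs.get("href", "")

# Hypothetical desktop page pointing at its separate mobile URL:
html = '''<head>
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <link rel="alternate" media="only screen and (max-width: 640px)"
        href="https://m.example.com/page">
</head>'''

parser = MobileSignalParser()
parser.feed(html)
print(parser.signals)
```

The Vary HTTP header, by contrast, has to be checked on the response itself rather than in the markup.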

I’ve always thought that if you are serious about ranking – do so with ORIGINAL COPY. It’s clear – search engines reward good content they haven’t found before. Google indexes it blisteringly fast, for a start (within a second, if your website isn’t penalised!). So – make sure each of your pages has enough text content that you have written specifically for that page – and you won’t need to jump through hoops to get it ranking.

Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.
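You can reproduce this kind of blocking locally with Python's standard-library robots.txt parser. The robots.txt below is a hypothetical example that blocks the CSS and JS directories, which is exactly the pattern that can hide a page's mobile-friendliness from Googlebot.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt blocking the directories that hold page resources.
robots_txt = """\
User-agent: *
Disallow: /assets/css/
Disallow: /assets/js/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# The HTML page is crawlable, but the stylesheet it depends on is not.
for url in ("https://www.example.com/page.html",
            "https://www.example.com/assets/css/site.css"):
    print(url, "->", rp.can_fetch("Googlebot", url))
```

Googlebot falls under the `User-agent: *` group here, so the stylesheet fetch is disallowed even though the page itself is crawlable, which is precisely the "incomplete picture" the paragraph above warns about.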


In the last year, Google and Bing have both indicated a shift to entity-based search results as part of their evolution. Google has underscored this point with rich snippets and the Knowledge Graph, and Bing has now upped the ante on personal search results with Bing Snapshots. Find out how you can adopt strategies to stay ahead of the curve in the new world of semantic search results.

Consider the length of your typical customer buying cycle. If your products and services have a short customer buying cycle, meaning your customers know what they want, search for it, and buy it, you may benefit from SEM ads that put your product right where customers will see it. Longer buying cycles, where customers research and compare for weeks or months, may not perform as well with SEM, as there isn’t an immediate buy after seeing one ad.

QUOTE: “For instance, we would see a lot of low-quality posts in a forum. We would index those low-quality pages. And we’d also see a lot of really high-quality posts, with good discussions, good information on those pages. And our algorithms would be kind of stuck in a situation with, well, there’s a lot of low-quality content here, but there’s also a lot of high-quality content here. So how should we evaluate the site overall? And usually, what happens is, our algorithms kind of find some middle ground… what you’d need to do to, kind of, move a step forward, is really try to find a way to analyze the quality of your content, and to make sure that the high-quality content is indexed and that the lower-quality content doesn’t get indexed by default.” John Mueller, Google 2014

Use common sense – Google is a search engine looking for pages to give searchers results, and 90% of its users are looking for information. Google itself WANTS the organic results full of information. Almost all websites will link to relevant informative content, so content-rich websites get a lot of links – especially quality links. Google ranks websites with a lot of links (especially quality links) at the top of its search results, so the obvious thing you need to do is ADD A LOT of INFORMATIVE CONTENT TO YOUR WEBSITE.
I prefer simple SEO techniques and ones that can be measured in some way. I have never just wanted to rank for competitive terms; I have always wanted to understand at least some of the reasons why a page ranked for these key phrases. I try to create a good user experience for humans AND search engines. If you make high-quality text content relevant and suitable for both these audiences, you’ll more than likely find success in organic listings and you might not ever need to get into the technical side of things, like redirects and search engine friendly URLs.

QUOTE:  “Tell visitors clearly that the page they’re looking for can’t be found. Use language that is friendly and inviting. Make sure your 404 page uses the same look and feel (including navigation) as the rest of your site. Consider adding links to your most popular articles or posts, as well as a link to your site’s home page. Think about providing a way for users to report a broken link. No matter how beautiful and useful your custom 404 page, you probably don’t want it to appear in Google search results. In order to prevent 404 pages from being indexed by Google and other search engines, make sure that your webserver returns an actual 404 HTTP status code when a missing page is requested.” Google, 2018
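The key technical point in that quote, serving a friendly custom error page while still returning a real 404 status code rather than a "soft 404" 200, can be demonstrated with Python's standard-library HTTP server. This is a minimal local sketch, not a production configuration; real sites set this in their web server or framework.

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class NotFoundHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Friendly, branded 404 page body, but with the real 404 status code,
        # so search engines know the page is genuinely missing.
        self.send_response(404)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<h1>Sorry, we can't find that page.</h1>")

    def log_message(self, *args):
        pass  # silence request logging for this demo

server = HTTPServer(("127.0.0.1", 0), NotFoundHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/missing-page")
resp = conn.getresponse()
body = resp.read()
print(resp.status)  # 404 -- not a "soft 404" 200
server.shutdown()
```

If the handler had called `send_response(200)` instead, the helpful page would look identical to users but could be indexed by search engines, which is exactly what the guidance above warns against.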
SEM is better for testing than SEO. Because you can immediately turn SEM paid ads off and on, it’s a great strategy for testing. You can quickly revise your ad copy, target new audiences, and change landing page content to test your new tactics. This flexibility allows you to see differences in your strategies immediately. You cannot accomplish this through SEO, as it would take too much time to make changes and monitor differences in results.

QUOTE: “They follow the forms: you gather data, you do so-and-so and so forth, but they don’t get any laws, they haven’t found out anything. They haven’t got anywhere yet. Maybe someday they will, but it’s not very well developed. But what happens is, on an even more mundane level, we get experts on everything that sound like they’re sort of scientific experts. They’re not scientific; they sit at a typewriter and they make up something.” Richard Feynman, Physicist 1981

In an evolving mobile-first web, we can utilize pre-emptive solutions to create winning value propositions, which are designed to attract and satisfy search engine crawlers and keep consumers happy. I'll outline a strategy and share tactics that help ensure increased organic reach, in addition to highlighting smart ways to view data, intent, consumer choice theory and crawl optimization.


If you own, manage, monetize, or promote online content via Google Search, this guide is meant for you. You might be the owner of a growing and thriving business, the webmaster of a dozen sites, the SEO specialist in a Web agency or a DIY SEO ninja passionate about the mechanics of Search: this guide is meant for you. If you're interested in having a complete overview of the basics of SEO according to our best practices, you are indeed in the right place. This guide won't provide any secrets that'll automatically rank your site first in Google (sorry!), but following the best practices outlined below will hopefully make it easier for search engines to crawl, index and understand your content.