After trying a lot of tools (10+ years of experience), SE Ranking stands out on top of the others because it combines everything we need for our clients. We don't only provide the client with rankings, but also with the potential traffic (and revenue) of those rankings when they hit the top 3 in Google. The tool lets us provide the client with in-depth analysis of the technical side and a marketing plan tool, so we can set goals and follow a checklist of monthly activities. And to top it all off, it’s fully white-label.
The main thing to take from articles like this is how much I, and most other newbies, focus on SEO link building. I have seen many bloggers spend time on different ways of building links instead of adding value to their content and promoting it socially. You may call that ignoring Google, but we all know that the Google bot doesn’t ignore anchored dofollow or nofollow backlinks when calculating your PageRank.
Some page titles do better with a call to action – a call to action which reflects exactly a searcher’s intent (e.g. to learn something, buy something, or hire someone). THINK CAREFULLY before auto-generating keyword phrase footprints across a site using boiler-plating and article-spinning techniques. Remember this is your hook in search engines, if Google chooses to use your page title in its search snippet, and there are a lot of competing pages out there in 2020.

As of October 2018 almost 4.2 billion people were active internet users and 3.4 billion were social media users (Statista). China, India and the United States rank ahead of all other countries in terms of internet users. This gives a marketer an unprecedented number of customers to reach with product and service offerings, available 24 hours a day, seven days a week. The interactive nature of the internet facilitates immediate communication between businesses and consumers, allowing businesses to respond quickly to the needs of consumers and changes in the marketplace. "Everyone has the internet in their pocket all the time, and this changes everything for small businesses" (Cory Capoccia, Forbes).
QUOTE: “Another problem we were having was an issue with quality and this was particularly bad (we think of it as around 2008 2009 to 2011) we were getting lots of complaints about low-quality content and they were right. We were seeing the same low-quality thing but our relevance metrics kept going up and that’s because the low-quality pages can be very relevant. This is basically the definition of a content farm in our in our vision of the world so we thought we were doing great our numbers were saying we were doing great and we were delivering a terrible user experience and turned out we weren’t measuring what we needed to so what we ended up doing was defining an explicit quality metric which got directly at the issue of quality it’s not the same as relevance …. and it enabled us to develop quality related signals separate from relevant signals and really improve them independently so when the metrics missed something what ranking engineers need to do is fix the rating guidelines… or develop new metrics.” Paul Haahr, Google 2016

Disclaimer: “Whilst I have made every effort to ensure that the information I have provided is correct, it is not advice; I cannot accept any responsibility or liability for any errors or omissions. The author does not vouch for third party sites or any third party service. Visit third party sites at your own risk. I am not directly partnered with Google or any other third party. This website uses cookies only for analytics and basic website functions. This article does not constitute legal advice. The author does not accept any liability that might arise from accessing the data presented on this site. Links to internal pages promote my own content and services.” Shaun Anderson, Hobo


The above information does not need to feature on every page, but it should feature on a clearly accessible page. However – with Google Quality Raters rating web pages on quality based on Expertise, Authority and Trust (see my recent making high-quality websites post) – ANY signal you can send to an algorithm or human reviewer’s eyes that you are a legitimate business is probably a sensible move at this time (if you have nothing to hide, of course).
QUOTE: “high quality content is something I’d focus on. I see lots and lots of SEO blogs talking about user experience, which I think is a great thing to focus on as well. Because that essentially kind of focuses on what we are trying to look at as well. We want to rank content that is useful for (Google users) and if your content is really useful for them, then we want to rank it.” John Mueller, Google 2015
Online reviews have become one of the most important components in purchasing decisions by consumers in North America. According to a survey conducted by Dimensional Research which included over 1000 participants, 90% of respondents said that positive online reviews influenced their buying decisions and 94% will use a business with at least four stars. Interestingly, negative reviews typically came from online review sites whereas Facebook was the main source of positive reviews. Forrester Research predicts that by 2020, 42% of in-store sales will be from customers who are influenced by web product research.
It's wonderful to deal with keywords that have 50,000 searches a month, or even 5,000 searches a month, but in reality, these popular search terms only make up a fraction of all searches performed on the web. In fact, keywords with very high search volumes may even indicate ambiguous intent, which, if you target these terms, could put you at risk of drawing visitors to your site whose goals don't match the content your page provides.
Many blogging software packages automatically nofollow user comments, but those that don't can most likely be manually edited to do this. This advice also goes for other areas of your site that may involve user-generated content, such as guest books, forums, shout-boards, referrer listings, etc. If you're willing to vouch for links added by third parties (for example, if a commenter is trusted on your site), then there's no need to use nofollow on links; however, linking to sites that Google considers spammy can affect the reputation of your own site. The Webmaster Help Center has more tips on avoiding comment spam, for example by using CAPTCHAs and turning on comment moderation.
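If your platform doesn't nofollow comments for you, the rewrite is simple to sketch. The function name and regex below are my own illustration (a real site should use a proper HTML sanitizer, and may prefer the newer rel="ugc" value, which Google also supports for user-generated links):

```python
import re

def nofollow_links(comment_html):
    """Add rel="nofollow" to every <a> tag in a user comment.
    Naive regex sketch for illustration; a production site should use an
    HTML sanitizer and skip anchors that already carry a rel attribute."""
    return re.sub(r'<a\s', '<a rel="nofollow" ', comment_html)

print(nofollow_links('<a href="https://spam.example">cheap pills</a>'))
# <a rel="nofollow" href="https://spam.example">cheap pills</a>
```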
You don’t have to be better than your competition at absolutely everything, so long as you identify enough points to build a keyword strategy around. For smaller companies, this means that you probably have to be better at the things the bigger fish haven’t thought of or aren’t actively looking to do. If you can’t think of anything at all then you have a much bigger problem than just coming up with keywords…

QUOTE: “Think about whether or not this is a link that would be on your site if it weren’t for your actions…When it comes to guest blogging it’s a situation where you are placing links on other people’s sites together with this content, so that’s something I kind of shy away from purely from a link building point of view. It can make sense to guest blog on other people’s sites to drive some traffic to your site… but you should use a nofollow.” John Mueller, Google 2013


Onsite, consider linking to your other pages by linking to pages within main content text. I usually only do this when it is relevant – often, I’ll link to relevant pages when the keyword is in the title elements of both pages. I don’t go in for auto-generating links at all. Google has penalised sites for using particular auto link plugins, for instance, so I avoid them.
QUOTE: “So if you have different parts of your website and they’re on different subdomains that’s that’s perfectly fine that’s totally up to you and the way people link across these different subdomains is really up to you I guess one of the tricky aspects there is that we try to figure out what belongs to a website and to treat that more as a single website and sometimes things on separate subdomains are like a single website and sometimes they’re more like separate websites for example on on blogger all of the subdomains are essentially completely separate websites they’re not related to each other on the other hand other websites might have different subdomains and they just use them for different parts of the same thing so maybe for different country versions maybe for different language versions all of that is completely normal.” John Mueller 2017
To prevent some users from linking to one version of a URL and others linking to a different version (this could split the reputation of that content between the URLs), focus on using and referring to one URL in the structure and internal linking of your pages. If you do find that people are accessing the same content through multiple URLs, setting up a 301 redirect from non-preferred URLs to the dominant URL is a good solution for this. You can also use the rel="canonical" link element if you cannot redirect.
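To make the "one URL" idea concrete, here is a minimal, hypothetical sketch that collapses some common variants of the same page to a single preferred form. The function name and the variant list are illustrative, not exhaustive, and in practice the variants should also be 301-redirected on the server:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url):
    """Collapse common URL variants (www prefix, index pages, missing
    trailing slash) to one preferred URL -- illustrative sketch only."""
    parts = urlsplit(url.lower())
    host = parts.netloc[4:] if parts.netloc.startswith("www.") else parts.netloc
    path = parts.path or "/"
    for suffix in ("/index.html", "/index.php", "/home.asp"):
        if path.endswith(suffix):
            path = path[:-len(suffix)] + "/"
    return urlunsplit((parts.scheme, host, path, parts.query, ""))

print(canonicalize("https://www.example.com/index.html"))  # https://example.com/
```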
QUOTE: “So sites that don’t have much content “above-the-fold” can be affected by this change. If you click on a website and the part of the website you see first either doesn’t have a lot of visible content above-the-fold or dedicates a large fraction of the site’s initial screen real estate to ads, that’s not a very good user experience.” Google 2012
A satisfying UX can help your rankings, with second-order factors taken into consideration. A poor UX can seriously impact your human-reviewed rating, at least. Google’s punishing algorithms probably class pages as something akin to a poor UX if they meet certain detectable criteria e.g. lack of reputation or old-school SEO stuff like keyword stuffing a site.

Google is a link-based search engine. Google doesn’t need content to rank pages but it needs content to give to users. Google needs to find content and it finds content by following links just like you do when clicking on a link. So you need first to make sure you tell the world about your site so other sites link to yours. Don’t worry about reciprocating to more powerful sites or even real sites – I think this adds to your domain authority – which is better to have than ranking for just a few narrow key terms.


QUOTE: “Each piece of duplication in your on-page SEO strategy is *at best* wasted opportunity. Worse yet, if you are aggressive with aligning your on page heading, your page title, and your internal + external link anchor text the page becomes more likely to get filtered out of the search results (which is quite common in some aggressive spaces).” Aaron Wall, 2009
There are some basic keyword usage rules you should follow to get started. Unique keywords should be employed on each page of your site in the areas that bots and humans normally look to reassure them that you have what they're after. This includes both the title tag and the body of your content, which leads to an important point: the pitfalls of clickbait. You may believe you're enticing more clicks by offering tantalizingly vague titles for your content, but by disguising what the page is actually about, you're opting out of some of the power of keywords.
QUOTE: “So there’s three things that you really want to do well if you want to be the world’s best search engine you want to crawl the web comprehensively and deeply you want to index those pages and then you want to rank or serve those pages and return the most relevant ones first….. we basically take PageRank as the primary determinant and the more PageRank you have that is the more people who link to you and the more reputable those people are the more likely it is we’re going to discover your page…. we use page rank as well as over 200 other factors in our rankings to try to say okay maybe this document is really authoritative it has a lot of reputation because it has a lot of PageRank … and that’s kind of the secret sauce trying to figure out a way to combine those 200 different ranking signals in order to find the most relevant document.” Matt Cutts, Google
Critics will point out the higher the cost of expert SEO, the more cost-effective Adwords becomes, but Adwords will only get more expensive, too. At some point, if you want to compete online, you’re going to HAVE to build a quality website, with a unique offering to satisfy returning visitors – the sooner you start, the sooner you’ll start to see results.
QUOTE: “Keyword Stuffed” Main Content Pages may be created to lure search engines and users by repeating keywords over and over again, sometimes in unnatural and unhelpful ways. Such pages are created using words likely to be contained in queries issued by users. Keyword stuffing can range from mildly annoying to users, to complete gibberish. Pages created with the intent of luring search engines and users, rather than providing meaningful MC to help users, should be rated Lowest.” Google Search Quality Evaluator Guidelines 2017
Google Ads (formerly Google Adwords) is the search provider most commonly used for this strategy. With this tactic, brands conduct keyword research and create campaigns that target the best keywords for their industry, products, or services. When users search for those keywords, they see the custom ads at the top or bottom of SERPs. The brand is charged each time a user clicks on the ad.

When would this be useful? If your site has a blog with public commenting turned on, links within those comments could pass your reputation to pages that you may not be comfortable vouching for. Blog comment areas on pages are highly susceptible to comment spam. Nofollowing these user-added links ensures that you're not giving your page's hard-earned reputation to a spammy site.

This helpful tool scans your backlink profile and turns up a list of contact information for the links and domains you'll need to reach out to for removal. Alternatively, the tool also allows you to export the list if you wish to disavow them using Google's tool. (Essentially, this tool tells Google not to take these links into account when crawling your site.)


The transparency you provide on your website in text and links about who you are, what you do, and how you’re rated on the web or as a business is one way that Google could use (algorithmically and manually) to ‘rate’ your website. Note that Google has a HUGE army of quality raters and at some point they will be on your site if you get a lot of traffic from Google.
Some pages are designed to manipulate users into clicking on certain types of links through visual design elements, such as page layout, organization, link placement, font color, images, etc. We will consider these kinds of pages to have deceptive page design. Use the Lowest rating if the page is deliberately designed to manipulate users to click on Ads, monetized links, or suspect download links with little or no effort to provide helpful MC.
QUOTE: “The manual actions team… can look at the labels on the links a site gets. Basically, we have tons of link labels; for example, it’s a footer link, basically, that has a lot lower value than an in-content link. Then another label would be a Penguin real-time label. If they see that most of the links are Penguin real-time labelled, then they might actually take a deeper look and see what the content owner is trying to do.” Gary Illyes, Google 2016
QUOTE: “common with forums is low-quality user-generated content. If you have ways of recognizing this kind of content, and blocking it from indexing, it can make it much easier for algorithms to review the overall quality of your website. The same methods can be used to block forum spam from being indexed for your forum. Depending on the forum, there might be different ways of recognizing that automatically, but it’s generally worth finding automated ways to help you keep things clean & high-quality, especially when a site consists of mostly user-generated content.” John Mueller, Google
Google will INDEX perhaps 1000s of characters in a title… but I don’t think anyone knows exactly how many characters or words Google will count AS a TITLE TAG when determining RELEVANCE OF A DOCUMENT for ranking purposes. It is a very hard thing to try to isolate accurately with all the testing and obfuscation Google uses to hide its ‘secret sauce’. I have had ranking success with longer titles – much longer titles. Google certainly reads ALL the words in your page title (unless you are spamming it silly, of course).

It is important you spread all that real ‘PageRank’ – or link equity – to your sales keyword / phrase rich sales pages, while leaving as much as possible for the rest of the site’s pages, so Google does not ‘demote’ pages into oblivion – or ‘supplemental results’ as we old-timers knew them back in the day. Again – this is slightly old school – but it gets me by, even today.

Both require knowing your audience. To succeed at both strategies, you must have a good understanding of your audience and how they act. By using buyer personas and psychographic segmentation, you can get to know your audience, discover what their needs are, and what they are searching for. Then you can create valuable content that shows up when they go looking for solutions related to your brand.
An SEO meta description is a brief description of content found on a web page. The meta description is shown to users in search engine results pages to help them decide which search result to click on. Meta descriptions are not a ranking factor in the eyes of a search engine, but they can influence how many people click on a result -- which directly affects the result's organic ranking.
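As a sketch of that trade-off, the helper below (my own invention) emits a description tag and trims long text at a word boundary. The ~155-character limit is a common rule of thumb for what fits in a snippet, not an official Google number:

```python
def meta_description_tag(text, limit=155):
    """Build a meta description element, trimming overly long text at a
    word boundary. limit=155 is a common guideline, not a fixed rule."""
    if len(text) > limit:
        text = text[:limit].rsplit(" ", 1)[0].rstrip(" ,;") + "…"
    return f'<meta name="description" content="{text}">'

print(meta_description_tag("A concise summary of what this page offers."))
# <meta name="description" content="A concise summary of what this page offers.">
```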
Google knows who links to you, the “quality” of those links, and whom you link to. These – and other factors – help ultimately determine where a page on your site ranks. To make it more confusing – the page that ranks on your site might not be the page you want to rank, or even the page that determines your rankings for this term. Once Google has worked out your domain authority – sometimes it seems that the most relevant page on your site Google HAS NO ISSUE with will rank.
Search engine optimization is a method for sustainably influencing search engine rankings. Google and other search engines calculate their search results for keywords using highly complex algorithms. The individual ranking factors and their weighting within the ranking calculation are well-guarded intellectual property that belongs to the search engines and is not publicly disclosed.
Note that Google is pretty good these days at removing any special characters you have in your page title – and I would be wary of trying to make your title or Meta Description STAND OUT using special characters. That is not what Google wants, evidently, and they do give you a further chance to make your search snippet stand out with RICH SNIPPETS and SCHEMA mark-up.
QUOTE: “The quality of the MC is an important consideration for PQ rating. We will consider content to be Low quality if it is created without adequate time, effort, expertise, or talent/skill. Pages with low quality MC do not achieve their purpose well…. Important: The Low rating should be used if the page has Low quality MC. ” Google Search Quality Evaluator Guidelines, 2019
QUOTE: “How do I move from one domain to another domain and try to preserve the rankings as best as possible?…do a 301 permanent redirect to the new location (assuming that you’re you’re moving for all time and eternity so this is the good case for a permanent or 301 redirect if you were planning to undo this later or it’s temporary then you’d use a 302 redirect)…. search engines should be able to follow the trail of all the 301 redirects” Matt Cutts, Google
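The key to the domain move described above is a one-to-one mapping: every old URL should 301 to its exact equivalent on the new host, not to the new homepage. As a hypothetical sketch (hostnames invented; the redirect itself is configured on the server), the mapping can be computed like this:

```python
from urllib.parse import urlsplit, urlunsplit

def redirect_target(old_url, new_host):
    """For a domain move, map each old URL to its equivalent on the new
    host, preserving path and query, so every page gets a one-to-one 301."""
    parts = urlsplit(old_url)
    return urlunsplit((parts.scheme, new_host, parts.path, parts.query, ""))

print(redirect_target("https://old-site.example/blog/post?p=1", "new-site.example"))
# https://new-site.example/blog/post?p=1
```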
Google expects pages to “be edited, reviewed, and updated on a regular basis” especially if they are for important issues like medical information, and states not all pages are held to such standards, but one can expect that Google wants information updated in a reasonable timescale. How reasonable this is, is dependent on the TOPIC and the PURPOSE of the web page RELATIVE to competing pages on the web.
QUOTE: “Google Webmaster Tools notice of detected doorway pages on xxxxxxxx – Dear site owner or webmaster of xxxxxxxx, We’ve detected that some of your site’s pages may be using techniques that are outside Google’s Webmaster Guidelines. Specifically, your site may have what we consider to be doorway pages – groups of “cookie cutter” or low-quality pages. Such pages are often of low value to users and are often optimized for single words or phrases in order to channel users to a single location. We believe that doorway pages typically create a frustrating user experience, and we encourage you to correct or remove any pages that violate our quality guidelines. Once you’ve made these changes, please submit your site for reconsideration in Google’s search results. If you have any questions about how to resolve this issue, please see our Webmaster Help Forum for support.” Google Search Quality Team, 2011

In Chapter 2, we learned about SERP features. That background is going to help us understand how searchers want to consume information for a particular keyword. The format in which Google chooses to display search results depends on intent, and every query has a unique one. Google describes these intents in their Quality Rater Guidelines as either “know” (find information), “do” (accomplish a goal), “website” (find a specific website), or “visit-in-person” (visit a local business).


Additionally, there are many situations where PPC (a component of SEM) makes more sense than SEO. For example, if you are first launching a site and you want immediate visibility, it is a good idea to create a PPC campaign because it takes less time than SEO, but it would be unwise to strictly work with PPC and not even touch search engine optimization.
QUOTE: “While, as a whole, web usability has improved over these past several years, history repeats and designers make the same mistakes over and over again. Designers and marketers continuously need to walk a line between providing a good user experience and increasing advertising revenue. There is no “correct” answer or golden format for designers to use in order to flawlessly reach audiences; there will inevitably always be resistance to change and a desire for convention and predictability. That said, if, over the course of over ten years, users are still lamenting about the same problems, it’s time we start to take them seriously.”  Therese Fessenden, Nielsen Norman Group 2017
Keyword research is your first step in optimizing your website for certain keywords. Without keyword research, you might find yourself lost in your own lingo and battling giants in your industry that can’t be beaten in the search result pages just like that. There is a variety of factors you have to take into account when doing keyword research and setting up your keyword strategy. In this article, we’ll discuss your mission, your audience, and your competition.
The world of digital marketing is as complex as it is exciting. Young entrepreneurs get into the digital marketing business as freelancers, writers, and web designers. Small businesses use lean digital marketing strategies to grow their presence online and gain new customers. SMBs become massively successful corporations by harnessing the power of enterprise digital marketing. In the modern age, every online entity must have a digital marketing strategy to be competitive in the internet marketplace. However, digital marketing trends are always changing. What works today may be obsolete tomorrow. Furthermore, the popular terms of the month can seem outdated in the next.
QUOTE: “An infinite number of niches are waiting for someone to claim them. I’d ask yourself where you want to be, and see if you can find a path from a tiny specific niche to a slightly bigger niche and so on, all the way to your desired goal. Sometimes it’s easier to take a series of smaller steps instead of jumping to your final goal in one leap.” Matt Cutts, Google 2006
Google and Bing use a crawler (Googlebot and Bingbot) that spiders the web looking for new links to find. These bots might find a link to your homepage somewhere on the web and then crawl and index the pages of your site if all your pages are linked together. If your website has an XML sitemap, for instance, Google will use that to include that content in its index. An XML sitemap is INCLUSIVE, not EXCLUSIVE. Google will crawl and index every single page on your site – even pages outside your XML sitemap.
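To illustrate what a sitemap actually hands to a crawler, here is a minimal sketch that reads the loc entries from a standard-format XML sitemap. The sitemap content is invented for the example:

```python
import xml.etree.ElementTree as ET

# A tiny, invented sitemap in the standard sitemaps.org format.
SITEMAP = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>"""

# Sitemap elements live in the sitemaps.org namespace, so findall needs it.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
urls = [loc.text for loc in ET.fromstring(SITEMAP).findall("sm:url/sm:loc", NS)]
print(urls)  # ['https://example.com/', 'https://example.com/about']
```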
QUOTE: “I’ve got a slide here where I show I think 8 different URLs you know every single one of these URLs could return completely different content in practice we as humans whenever we look at ‘www.example.com’ or just regular ‘example.com’ or example.com/index or example.com/home.asp we think of it as the same page and in practice it usually is the same page so technically it doesn’t have to be but almost always web servers will return the same content for like these 8 different versions of the URL so that can cause a lot of problems in search engines if rather than having your backlinks all go to one page instead it’s split between (the versions) and it’s a really big headache….how do people fix this well …. the canonical link element” Matt Cutts, Google 2009
A lot of optimisation techniques that are in the short term effective at boosting a site’s position in Google are against Google’s guidelines. For example, many links that may have once promoted you to the top of Google, may, in fact, today be hurting your site and its ability to rank high in Google. Keyword stuffing might be holding your page back. You must be smart, and cautious, when it comes to building links to your site in a manner that Google *hopefully* won’t have too much trouble with, in the FUTURE. Because they will punish you in the future.
The biggest advantage any one provider has over another is experience and resource. The knowledge of what doesn’t work and what will hurt your site is often more valuable than knowing what will give you a short-lived boost. Getting to the top of Google is a relatively simple process. One that is constantly in change. Professional SEO is more a collection of skills, methods and techniques. It is more a way of doing things, than a one-size-fits-all magic trick.

And you’ve got to diversify. Here’s a tongue-twister that’s absolutely true: diversity is a key word in the keyword world. You’re not going to stand out if you find yourself using all of the same keywords as your competitors. Not only should you try new keyword search tools and keep track of the results, but you should feel free to experiment based on your own research – who else uses your keywords? And how do you make yourself stand out? By providing great content that truly answers the questions your prospective customers are asking with their keyword searches.


