Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them; it does not prevent your server from delivering those pages to a browser that requests them. For one thing, search engines could still reference the URLs you block (showing just the URL, with no title or snippet) if there happen to be links to those URLs somewhere on the Internet (such as in referrer logs). Also, non-compliant or rogue crawlers that don't acknowledge the Robots Exclusion Standard can simply disobey the instructions in your robots.txt. Finally, a curious user could examine the directories or subdirectories listed in your robots.txt file and guess the URL of the content that you don't want seen.
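A minimal Python sketch (using the standard library's urllib.robotparser and a hypothetical robots.txt) illustrates the point: a compliant crawler checks the rules before fetching, but the file itself is advisory, not access control – and the Disallow line even reveals the path name you wanted hidden.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: note the Disallow line itself exposes the path name.
robots_txt = """\
User-agent: *
Disallow: /private-reports/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A well-behaved crawler consults the rules before fetching a URL...
print(parser.can_fetch("Googlebot", "https://example.com/private-reports/q3.html"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/index.html"))  # True

# ...but nothing here stops a browser or a rogue bot from requesting
# /private-reports/q3.html directly; robots.txt is a request, not a lock.
```

If the content genuinely must stay private, server-side authentication (or at minimum a noindex response header) is the right tool, not robots.txt.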
On an evolving, mobile-first web, we can plan ahead with solutions designed both to attract and satisfy search engine crawlers and to keep consumers happy. I'll outline a strategy and share tactics that help increase organic reach, and highlight smart ways to think about data, intent, consumer choice theory and crawl optimization.

Don’t be a website Google won’t rank. What Google classifies your site as is perhaps the NUMBER 1 Google ranking factor not often talked about – whether Google determines this algorithmically or, eventually, manually. That is: does Google see your site as a MERCHANT, an AFFILIATE, a RESOURCE, a DOORWAY PAGE, SPAM, or VITAL to a particular search? What do you think Google thinks about your website? Is your website better than the ones in the top ten of Google now, or just the same? Rather than asking why Google would not rank your website because it is just the same, ask why Google should bother ranking it at all. How can you make yours different? Better.


The clear view of rankings and positions, the site audit tool for quick scans and the backlink checker are very useful. I use it a lot, and I also use the lead generator to get a free scan for potential clients, which runs automatically when they fill in the form. The dashboard gives you a good view of changes in traffic and positions. The marketing plan is a bit simple, but it gives you some direction on what to do first on the website, and you can also check the boxes when you have finished a task, which works very well.

Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.
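As a sketch, with hypothetical paths, a robots.txt that keeps page resources crawlable while still blocking genuinely private areas might look like this:

```
User-agent: *
# Let crawlers fetch the assets pages need in order to render,
# so Google can see the page the way a mobile browser does.
Allow: /assets/css/
Allow: /assets/js/
Allow: /images/
# Block only the areas that genuinely shouldn't be crawled.
Disallow: /admin/
```

The specific directory names are illustrative; the principle is simply never to Disallow the CSS, JavaScript and image paths your pages depend on.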
QUOTE: “The quality of the MC is an important consideration for PQ rating. We will consider content to be Low quality if it is created without adequate time, effort, expertise, or talent/skill. Pages with low quality MC do not achieve their purpose well…. Important: The Low rating should be used if the page has Low quality MC. ” Google Search Quality Evaluator Guidelines, 2019
QUOTE: “If you have a manual action against your site for unnatural links to your site, or if you think you’re about to get such a manual action (because of paid links or other link schemes that violate our quality guidelines), you should try to remove those links from the other site. If you can’t get these links removed, then you should disavow those links to your website.“ Google Webmaster Guidelines 2020
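For reference, the file Google's disavow links tool accepts is plain text, one URL or domain per line, with lines starting with # treated as comments. A sketch with made-up domains:

```
# Hypothetical disavow file – domains and URLs are illustrative only.
# Disavow a single linking page:
http://spam.example.com/paid-links/page.html
# Disavow every link from an entire domain:
domain:link-scheme.example.net
```

Remember Google's own caveat: disavowal is a last resort after you have tried to get the links removed at the source.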
Social Media Marketing: In an increasingly connected world where consumers expect the companies they do business with to engage with them, social media is an ideal way to interact with prospects and customers. The key to social media marketing success is focusing on the platforms where you're most likely to find your target market, whether it's Instagram, Twitter, LinkedIn, Facebook, or Pinterest. To make sure you're not wasting time with your social media efforts, develop a strategy and content plan, and research tools that will help make the posting easier, such as Hootsuite or Buffer.
You don’t have to be better than your competition at absolutely everything, so long as you identify enough points to build a keyword strategy around. For smaller companies, this means that you probably have to be better at the things the bigger fish haven’t thought of or aren’t actively looking to do. If you can’t think of anything at all then you have a much bigger problem than just coming up with keywords…
Ego and assumptions led me to choose the wrong keywords for my own site. How did I spend three years optimizing my site and building links to finally crack the top three for six critical keywords, only to find out that I had wasted all that time? However, in spite of targeting the wrong words, Seer grew the business. In this presentation, Will shows you the mistakes he made and shares the approaches that can help you build content that gets you thanked.

QUOTE: “high quality content is something I’d focus on. I see lots and lots of SEO blogs talking about user experience, which I think is a great thing to focus on as well. Because that essentially kind of focuses on what we are trying to look at as well. We want to rank content that is useful for (Google users) and if your content is really useful for them, then we want to rank it.” John Mueller, Google 2015
QUOTE: “Each piece of duplication in your on-page SEO strategy is ***at best*** wasted opportunity. Worse yet, if you are aggressive with aligning your on-page heading, your page title, and your internal + external link anchor text, the page becomes more likely to get filtered out of the search results (which is quite common in some aggressive spaces).” Aaron Wall, 2009
Google will INDEX perhaps 1000s of characters in a title… but I don’t think anyone knows exactly how many characters or words Google will count AS a TITLE TAG when determining the RELEVANCE OF A DOCUMENT for ranking purposes. It is a very hard thing to isolate accurately, with all the testing and obfuscation Google uses to hide its ‘secret sauce’. I have had ranking success with longer titles – much longer titles. Google certainly reads ALL the words in your page title (unless you are spamming it silly, of course).
If you have a commenting system (like Drupal, Joomla or WordPress) that allows search-engine-friendly links (commonly called dofollow links) from your blog or site, you will probably, eventually, be the target of lots of spam, be implicated in tiered link schemes and potentially fall foul of Google’s webmaster guidelines on using the nofollow attribute in certain situations.
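The common mitigation is to mark user-submitted links so search engines don't treat them as editorial endorsements. A hypothetical comment snippet (Google also recognises rel="ugc" specifically for user-generated content):

```html
<p class="comment">
  Great post! Check out
  <!-- nofollow/ugc tell search engines this link is not an endorsement -->
  <a href="https://example.com/some-page" rel="nofollow ugc">my site</a>.
</p>
```

Most commenting systems apply an attribute like this to user links by default; the risk described above arises when that default is removed.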
QUOTE: “The purpose of a page is the reason or reasons why the page was created. Every page on the Internet is created for a purpose, or for multiple purposes. Most pages are created to be helpful for users, thus having a beneficial purpose. Some pages are created merely to make money, with little or no effort to help users. Some pages are even created to cause harm to users. The first step in understanding a page is figuring out its purpose.” Google Search Quality Evaluator Guidelines 2019

QUOTE: “Many SEOs and other agencies and consultants provide useful services for website owners, including: Review of your site content or structure – Technical advice on website development: for example, hosting, redirects, error pages, use of JavaScript – Content development – Management of online business development campaigns – Keyword research – SEO training – Expertise in specific markets and geographies.” Google Webmaster Guidelines, 2020
It's important to check that you have a mix of head terms and long-tail terms because it'll give you a keyword strategy that's well balanced with long-term goals and short-term wins. That's because head terms are generally searched more frequently, making them often (not always, but often) much more competitive and harder to rank for than long-tail terms. Think about it: Without even looking up search volume or difficulty, which of the following terms do you think would be harder to rank for?
Sometimes, Google turns up the dial on demands on ‘quality’, and if your site falls short, a website traffic crunch is assured. Some sites invite problems ignoring Google’s ‘rules’ and some sites inadvertently introduce technical problems to their site after the date of a major algorithm update and are then impacted negatively by later refreshes of the algorithm.
QUOTE: “… it also includes things like the comments, includes the things like the unique and original content that you’re putting out on your site that is being added through user-generated content, all of that as well. So while I don’t really know exactly what our algorithms are looking at specifically with regards to your website, it’s something where sometimes you go through the articles and say well there is some useful information in this article that you’re sharing here, but there’s just lots of other stuff happening on the bottom of these blog posts. When our algorithms look at these pages, in an aggregated way across the whole page, then that’s something where they might say well, this is a lot of content that is unique to this page, but it’s not really high quality content that we want to promote in a very visible way. That’s something where I could imagine that maybe there’s something you could do, otherwise it’s really tricky I guess to look at specific changes you can do when it comes to our quality algorithms.” John Mueller, Google 2016
All sites have a home or "root" page, which is usually the most frequented page on the site and the starting place of navigation for many visitors. Unless your site has only a handful of pages, you should think about how visitors will go from a general page (your root page) to a page containing more specific content. Do you have enough pages around a specific topic area that it would make sense to create a page describing these related pages (for example, root page -> related topic listing -> specific topic)? Do you have hundreds of different products that need to be classified under multiple category and subcategory pages?
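As a sketch with hypothetical URLs, that general-to-specific hierarchy might look like:

```
example.com/                                   <- root page
example.com/garden-tools/                      <- related topic listing
example.com/garden-tools/pruning/              <- subcategory page
example.com/garden-tools/pruning/shears-x100   <- specific product page
```

The URL paths here are illustrative; the point is that each level should give visitors (and crawlers) a natural route from the general to the specific.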

Google is a link-based search engine. Google doesn’t need content to rank pages, but it needs content to give to users. Google needs to find content, and it finds content by following links, just as you do when clicking on a link. So you first need to make sure you tell the world about your site, so that other sites link to yours. Don’t worry about reciprocating links to more powerful sites or even real sites – I think this adds to your domain authority, which is better to have than ranking for just a few narrow key terms.
Search engine optimization (SEO) is often about making small modifications to parts of your website. When viewed individually, these changes might seem like incremental improvements, but when combined with other optimizations, they could have a noticeable impact on your site's user experience and performance in organic search results. You're likely already familiar with many of the topics in this guide, because they're essential ingredients for any web page, but you may not be making the most out of them.