QUOTE: “Anytime you do a bigger change on your website, if you redirect a lot of URLs or if you go from one domain to another or if you change your site’s structure, then all of that does take time for things to settle down. We can follow that pretty quickly, we can definitely forward the signals there, but that doesn’t mean that’ll happen from one day to next” John Mueller, Google 2016
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
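To make the distinction concrete, here is a minimal Python sketch (the URLs are hypothetical placeholders): a well-behaved crawler consults robots.txt before fetching, but nothing stops a direct request from retrieving the same page.

```python
# A minimal sketch: robots.txt only advises compliant crawlers.
# The example.com URLs are hypothetical placeholders.
import urllib.robotparser
import urllib.request

rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

url = "https://www.example.com/private/report.html"

# A compliant crawler checks the rules first...
print(rp.can_fetch("*", url))  # False if /private/ is disallowed

# ...but the server will still hand the page to anyone who asks directly.
response = urllib.request.urlopen(url)
print(response.status)
```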
Note that Google is pretty good these days at removing any special characters you have in your page title – and I would be wary of trying to make your title or Meta Description STAND OUT using special characters. That is not what Google wants, evidently, and they do give you a further chance to make your search snippet stand out with RICH SNIPPETS and SCHEMA mark-up.
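If you want a richer search snippet, structured data is the route Google actually supports. Below is a hedged Python sketch that generates a basic JSON-LD block; the product values are invented for illustration, not taken from any real page.

```python
# A sketch of generating a JSON-LD structured data block for rich snippets.
# Every value below is an invented example.
import json

product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "description": "A plain description that can feed a rich result.",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "87",
    },
}

# Embed the output in the page's <head> section.
print('<script type="application/ld+json">')
print(json.dumps(product, indent=2))
print('</script>')
```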
Learn the steps of Internet analysis if you want to learn Internet marketing for free. If you know the basics of social media, search engine optimization (SEO), Google News and website building blocks, you can likely learn Internet marketing on your own. A big part of marketing is understanding your competitors, data and market analysis.

As keywords define each page of your site, you can use them to organize your content and formulate a strategy. The most basic way to do this is to start a spreadsheet (your "content to keyword map") and identify your primary keyword for each article. You can then build your sheet to your own requirements, add keyword search volume, organic traffic, page authority and any other metrics that are important to your business.
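A content to keyword map can start life as a plain CSV file. The Python sketch below shows one way to generate it; the URLs, keywords and metrics are invented placeholders for your own data.

```python
# A minimal "content to keyword map" written out as a CSV spreadsheet.
# All rows are invented examples; substitute your own pages and metrics.
import csv

rows = [
    {"url": "/parakeet-snacks", "primary_keyword": "parakeet snacks",
     "search_volume": 1300, "organic_traffic": 240, "page_authority": 34},
    {"url": "/dog-beds", "primary_keyword": "dog beds",
     "search_volume": 8100, "organic_traffic": 960, "page_authority": 41},
]

with open("keyword_map.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)
```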

QUOTE: “Fixing the problem depends on the issue you have. For example, if it’s a pop-up, you’ll need to remove all the pop-up ads from your site. But if the issue is high ad density on a page, you’ll need to reduce the number of ads. Once you fix the issues, you can submit your site for a re-review. We’ll look at a new sample of pages and may find ad experiences that were missed previously. We’ll email you when the results are in.” Google, 2017
Consider your competition. Look at what your competitors are doing and how they are performing in their search marketing before you decide how you can best compete with them. Research what search terms they rank organically for. Consider if you can execute a plan to top their SERP placements. Also, look at what paid terms they are using to drive traffic to their own sites. As you perform this research, look for gaps that you can fill and areas where you will be unable to compete in both paid and organic search.

Google Ads (formerly Google AdWords) is the search provider most commonly used for this strategy. With this tactic, brands conduct keyword research and create campaigns that target the best keywords for their industry, products, or services. When users search for those keywords, they see the custom ads at the top or bottom of SERPs. The brand is charged each time a user clicks on the ad.
Don’t be a website Google won’t rank – What Google classifies your site as – is perhaps the NUMBER 1 Google ranking factor not often talked about – whether Google determines this algorithmically or, eventually, manually. That is – whether it is a MERCHANT, an AFFILIATE, a RESOURCE or DOORWAY PAGE, SPAM, or VITAL to a particular search – what do you think Google thinks about your website? Is your website better than the ones in the top ten of Google now? Or just the same? Ask why Google should bother ranking your website if it is just the same, rather than why it would not because it is just the same. How can you make yours different? Better.
QUOTE: “So sites that don’t have much content “above-the-fold” can be affected by this change. If you click on a website and the part of the website you see first either doesn’t have a lot of visible content above-the-fold or dedicates a large fraction of the site’s initial screen real estate to ads, that’s not a very good user experience.” Google 2012
Your SEO keywords are the keywords and phrases in your web content that make it possible for people to find your site via search engines. A website that is well optimized for search engines "speaks the same language" as its potential visitor base with keywords for SEO that help connect searchers to your site. Keywords are one of the main elements of SEO.

NOTE: in 2020, the HTML title element you choose for your page may not be what Google chooses to include in your SERP snippet. The search snippet title and description are very much QUERY & DEVICE dependent these days. Google often chooses what it thinks is the most relevant title for your search snippet, and it can use information from your page, or in links to that page, to create a very different SERP snippet title.
Google WILL classify your site when it crawls and indexes your site – and this classification can have a DRASTIC effect on your rankings. It’s important for Google to work out WHAT YOUR ULTIMATE INTENT IS – do you want to be classified as a thin affiliate site made ‘just for Google’, a domain holding page or a small business website with a real purpose? Ensure you don’t confuse Google in any way by being explicit with all the signals you can – to show on your website you are a real business, and your INTENT is genuine – and even more important today – FOCUSED ON SATISFYING A VISITOR.
Be sure to re-evaluate these keywords every few months -- once a quarter is a good benchmark, but some businesses like to do it even more often than that. As you gain even more authority in the SERPs, you'll find that you can add more and more keywords to your lists to tackle as you work on maintaining your current presence, and then growing in new areas on top of that.

QUOTE: “common with forums is low-quality user-generated content. If you have ways of recognizing this kind of content, and blocking it from indexing, it can make it much easier for algorithms to review the overall quality of your website. The same methods can be used to block forum spam from being indexed for your forum. Depending on the forum, there might be different ways of recognizing that automatically, but it’s generally worth finding automated ways to help you keep things clean & high-quality, especially when a site consists of mostly user-generated content.” John Mueller, Google
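One way to act on that advice, sketched in Python: run each forum post through an assumed quality heuristic and emit a robots noindex tag for anything it flags. The function name and thresholds are invented examples, not anything Google prescribes.

```python
# A hedged sketch: keep thin or untrusted user-generated content out of
# the index with a robots meta tag. Thresholds are invented examples.
def robots_meta_for(post_body: str, user_reputation: int) -> str:
    looks_thin = len(post_body.split()) < 20
    looks_untrusted = user_reputation < 5
    if looks_thin or looks_untrusted:
        return '<meta name="robots" content="noindex, follow">'
    return '<meta name="robots" content="index, follow">'

print(robots_meta_for("me too", user_reputation=1))                    # noindex
print(robots_meta_for("A long, genuinely useful reply... " * 10, 50))  # index
```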
In an evolving mobile-first web, we can utilize pre-emptive solutions to create winning value propositions, which are designed to attract and satisfy search engine crawlers and keep consumers happy. I'll outline a strategy and share tactics that help ensure increased organic reach, in addition to highlighting smart ways to view data, intent, consumer choice theory and crawl optimization.
For example, let’s say you were running the website of an online pet store. You might be wise to create one keyword grouping for all your dog-related products, then one for all of your parakeet-related products, etc. The next step would be to segment each individual group into smaller subgroups (parakeet cages, parakeet toys, parakeet snacks) and then even smaller groups for each type of product (low-fat parakeet snacks, luxury parakeet snacks… you get the idea). Now your pet store can create individual pages optimized for each small keyword group, as in the sketch below.
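The grouping itself is just a nested structure. Here is a small Python sketch using the pet-store examples above (the extra subgroup terms are invented); each innermost list maps to one optimized page.

```python
# Keyword groups nested from broad topic down to page-level subgroups.
# Terms beyond the article's own examples are invented placeholders.
keyword_groups = {
    "parakeet": {
        "parakeet snacks": ["low-fat parakeet snacks", "luxury parakeet snacks"],
        "parakeet cages": ["large parakeet cages", "travel parakeet cages"],
        "parakeet toys": ["parakeet swing toys"],
    },
    "dog": {
        "dog beds": ["orthopedic dog beds"],
    },
}

# Each innermost list becomes one page optimized for that small group.
for topic, groups in keyword_groups.items():
    for group, page_keywords in groups.items():
        print(f"{topic} > {group}: one page targeting {page_keywords}")
```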
To get even more insight and data to help you make those decisions, sign up for a free trial of Alexa’s Advanced Plan. You’ll get access to tools that help you research competitor search and link building strategies, find keyword opportunities, review your site’s SEO, and learn about your target audience. These insights, paired with what you know about SEM and SEO, will help you uncover the best search marketing strategy for your unique brand and goals.
QUOTE: “There’s probably always gonna be a little bit of room for keyword research because you’re kind of providing those words to users. And even if search engines are trying to understand more than just those words, showing specific words to users can make it a little bit easier for them to understand what your pages are about and can sometimes drive a little bit of that conversion process.  So I don’t see these things going away completely but I’m sure search engines will get better over time to understand more than just the words on a page.” John Mueller, Google 2020
At first glance, the Ads or SC (Supplementary Content) appear to be MC (Main Content). Some users may interact with Ads or SC, believing that the Ads or SC is the MC. Ads appear to be SC (links) where the user would expect that clicking the link will take them to another page within the same website, but actually take them to a different website. Some users may feel surprised or confused when clicking SC or links that go to a page on a completely different website.
QUOTE: “Many SEOs and other agencies and consultants provide useful services for website owners, including: Review of your site content or structure – Technical advice on website development: for example, hosting, redirects, error pages, use of JavaScript – Content development – Management of online business development campaigns – Keyword research – SEO training – Expertise in specific markets and geographies.” Google Webmaster Guidelines, 2020
A poor 404 page, and user interaction with it, can only lead to a ‘poor user experience’ signal at Google’s end, for a number of reasons. I will highlight a poor 404 page in my audits and actually programmatically look for signs of this issue when I scan a site. I don’t know if Google looks at your site that way to rate it, e.g. algorithmically determines if you have a good 404 page – or if it is a UX factor, something to be taken into consideration further down the line – or purely to get you thinking about 404 pages (in general) to help prevent Google wasting resources indexing crud pages and presenting poor results to searchers. I think, rather, that any rating would be a second-order scoring, including data from user activity on the SERPs – stuff we as SEOs can’t see.

After a while, Google will know about your pages, and keep the ones it deems ‘useful’ – pages with original content, or pages with a lot of links to them. The rest will be de-indexed. Be careful – too many low-quality pages on your site will impact your overall site performance in Google. Google is on record talking about good and bad ratios of quality content to low-quality content.
QUOTE: “While, as a whole, web usability has improved over these past several years, history repeats and designers make the same mistakes over and over again. Designers and marketers continuously need to walk a line between providing a good user experience and increasing advertising revenue. There is no “correct” answer or golden format for designers to use in order to flawlessly reach audiences; there will inevitably always be resistance to change and a desire for convention and predictability. That said, if, over the course of over ten years, users are still lamenting about the same problems, it’s time we start to take them seriously.”  Therese Fessenden, Nielsen Norman Group 2017
Yes, you need to build links to your site to acquire more PageRank, or Google ‘juice’ – or what we now call domain authority or trust. Google is a link-based search engine – it does not quite understand ‘good’ or ‘quality’ content – but it does understand ‘popular’ content. It can also usually identify poor, or THIN CONTENT – and it penalises your site for that – or – at least – it takes away the traffic you once had with an algorithm change. Google doesn’t like calling the actions it takes a ‘penalty’ – it doesn’t look good. They blame your ranking drops on their engineers getting better at identifying quality content or links, or the inverse – low-quality content and unnatural links. If they do take action against your site for paid links – they call this a ‘Manual Action’ and you will get notified about it in Google Search Console if you sign up.
This relationship between rankings and clicks (and traffic) is strongest amongst the top 3 search results. However, the layout of the search results pages is constantly changing, with the inclusion of Google’s Knowledge Graph data and the integration of Universal Search elements (SERP Features) like videos, maps and Google Shopping ads. These developments can mean that the top 3 organic rankings are no longer the 3 best positions on the SERP. This has been demonstrated in heatmap and eye-tracking tests.
Use common sense – Google is a search engine – it is looking for pages to give searchers results, 90% of its users are looking for information. Google itself WANTS the organic results full of information. Almost all websites will link to relevant information content so content-rich websites get a lot of links – especially quality links. Google ranks websites with a lot of links (especially quality links) at the top of its search results, so the obvious thing you need to do is ADD A LOT of INFORMATIVE CONTENT TO YOUR WEBSITE.
Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (for example, searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn't feasible. In this case, you could automatically generate description meta tags based on each page's content.
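A minimal Python sketch of that idea: build the description meta tag from the page's own text. The helper name and the 155-character budget are assumptions, not Google requirements.

```python
# Auto-generate a description meta tag from page content for large sites.
# The 155-character limit is a common convention, not a Google rule.
import html

def meta_description(page_text: str, limit: int = 155) -> str:
    summary = " ".join(page_text.split())  # collapse whitespace
    if len(summary) > limit:
        summary = summary[:limit].rsplit(" ", 1)[0] + "..."
    return f'<meta name="description" content="{html.escape(summary)}">'

print(meta_description("Our parakeet snacks are baked in small batches..."))
```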