QUOTE: “Starting April 21 (2015), we will be expanding our use of mobile-friendliness as a ranking signal. This change will affect mobile searches in all languages worldwide and will have a significant impact in our search results. Consequently, users will find it easier to get relevant, high-quality search results that are optimized for their devices”. GOOGLE

QUOTE: “So it’s not something where we’d say, if your website was previously affected, then it will always be affected. Or if it wasn’t previously affected, it will never be affected.… sometimes we do change the criteria…. category pages…. (I) wouldn’t see that as something where Panda would say, this looks bad.… Ask them the questions from the Panda blog post….. usability, you need to work on.“ John Mueller, Google.
QUOTE: “An infinite number of niches are waiting for someone to claim them. I’d ask yourself where you want to be, and see if you can find a path from a tiny specific niche to a slightly bigger niche and so on, all the way to your desired goal. Sometimes it’s easier to take a series of smaller steps instead of jumping to your final goal in one leap.” Matt Cutts, Google 2006
A page title that is highly relevant to the page it refers to will maximise usability, search engine ranking performance and user experience ratings as Google measures these. It will probably be displayed in a web browser’s window title bar, in bookmarks and in the clickable search snippet links used by Google, Bing and other search engines. The title element is the “crown” of a web page, with your important keyword phrase featuring AT LEAST ONCE within it.
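As a minimal sketch (the product and brand names here are hypothetical, not taken from any real site), a well-formed title element in a page’s head might look like this:

  <head>
    <!-- Keyword phrase appears once, near the front; the brand name follows -->
    <title>Blue Pineapple Chairs | Example Furniture Co.</title>
  </head>

Keeping the phrase near the start of the title is a common convention, since long titles are truncated in search snippets.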
QUOTE: “And we do that across the whole website to kind of figure out where we see the quality of this website. And that’s something that could definitely be affecting your website overall in the search results. So if you really work to make sure that these comments are really high quality content, that they bring value, engagement into your pages, then that’s fantastic. That’s something that I think you should definitely make it so that search engines can pick that up on.”  John Mueller, Google 2016

So: how to proceed? On the one hand, SEO best practices recommend that you include relevant keywords in a number of high-attention areas on your site, everywhere from the titles and body text of your pages to your URLs to your meta tags to your image file names. On the other hand, successfully optimized websites tend to have thousands or even millions of keywords. You can't very well craft a single, unique page for every one of your keywords; at the same time, you can't try to cram everything onto a handful of pages with keyword stuffing and expect to rank for every individual keyword. It just doesn't work that way.
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
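To illustrate (example.com and the /private/ directory are hypothetical), a typical robots.txt directive looks like this:

  User-agent: *
  Disallow: /private/

This only asks compliant crawlers to stay out of /private/; the files there remain publicly fetchable by anyone who knows or guesses the URLs. For genuinely sensitive content, server-side authentication (or, at minimum, a noindex directive served with the page) is the appropriate control.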

QUOTE: “Doorways are sites or pages created to rank highly for specific search queries. They are bad for users because they can lead to multiple similar pages in user search results, where each result ends up taking the user to essentially the same destination. They can also lead users to intermediate pages that are not as useful as the final destination.” Google
Did you know that nearly 60% of the sites that have a top ten Google search ranking are three years old or more? Data from an Ahrefs study of two million pages suggests that very few sites less than a year old achieve that ranking. So if you’ve had your site for a while, and have optimized it using the tips in this article, that’s already an advantage.
Provide full functionality on all devices. Mobile users expect the same functionality - such as commenting and checkout - and the same content on mobile as on every other device your website supports. In addition to textual content, make sure that all important images and videos are embedded and accessible on mobile devices. For search engines, provide all structured data and other metadata - such as titles, descriptions, link elements and other meta tags - on all versions of the pages.
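For example, if a site serves separate mobile URLs (the example.com addresses below are hypothetical), the standard link-element annotations keep the two versions connected for search engines:

  <!-- On the desktop page -->
  <link rel="alternate" media="only screen and (max-width: 640px)"
        href="https://m.example.com/page">

  <!-- On the mobile page -->
  <link rel="canonical" href="https://www.example.com/page">

These annotations tell crawlers the two URLs are the same content in different formats, so ranking signals consolidate on the canonical version.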
If you're not using internet marketing to market your business you should be. An online presence is crucial to helping potential clients and customers find your business - even if your business is small and local. (In 2017, one third of all mobile searches were local, and local search was growing 50% faster than mobile searches overall.) Online is where the eyeballs are, so that's where your business needs to be.


QUOTE: “alt attribute should be used to describe the image. So if you have an image of a big blue pineapple chair you should use the alt tag that best describes it, which is alt=”big blue pineapple chair.” title attribute should be used when the image is a hyperlink to a specific page. The title attribute should contain information about what will happen when you click on the image. For example, if the image will get larger, it should read something like, title=”View a larger version of the big blue pineapple chair image.” John Mueller, Google 2008
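Rendering Mueller’s example as markup (the file names are hypothetical; the alt and title text come straight from the quote above), a linked image would look like this:

  <a href="big-blue-pineapple-chair-large.jpg"
     title="View a larger version of the big blue pineapple chair image.">
    <img src="big-blue-pineapple-chair.jpg"
         alt="big blue pineapple chair">
  </a>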
In the last year, Google and Bing have both indicated a shift to entity-based search results as part of their evolution. Google has underscored this point with rich snippets and Knowledge Graph, and Bing has now upped the ante on personal search results with Bing Snapshots. Find out how you can adopt strategies to stay ahead of the curve in the new world of semantic search results.
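Structured data is the on-page mechanism behind many of those rich results. As an illustrative sketch (the organisation details are invented), JSON-LD markup identifying an entity looks like this:

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Furniture Co.",
    "url": "https://www.example.com",
    "sameAs": ["https://twitter.com/examplefurniture"]
  }
  </script>

Markup like this helps engines connect your pages to a known entity rather than simply matching keyword strings.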
QUOTE: “One of the difficulties of running a great website that focuses on UGC is keeping the overall quality upright. Without some level of policing and evaluating the content, most sites are overrun by spam and low-quality content. This is less of a technical issue than a general quality one, and in my opinion, not something that’s limited to Google’s algorithms. If you want to create a fantastic experience for everyone who visits, if you focus on content created by users, then you generally need to provide some guidance towards what you consider to be important (and sometimes, strict control when it comes to those who abuse your house rules).”

QUOTE: “If you want to stop spam, the most straight forward way to do it is to deny people money because they care about the money and that should be their end goal. But if you really want to stop spam, it is a little bit mean, but what you want to do, is sort of break their spirits. There are lots of Google algorithms specifically designed to frustrate spammers. Some of the things we do is give people a hint their site will drop and then a week or two later, their site actually does drop. So they get a little bit more frustrated. So hopefully, and we’ve seen this happen, people step away from the dark side and say, you know what, that was so much pain and anguish and frustration, let’s just stay on the high road from now on.” Matt Cutts, Google 2013
Sometimes, Google turns up the dial on its ‘quality’ demands, and if your site falls short, a website traffic crunch is assured. Some sites invite problems by ignoring Google’s ‘rules’, and some sites inadvertently introduce technical problems to their site after the date of a major algorithm update and are then impacted negatively by later refreshes of the algorithm.
All sites have a home or "root" page, which is usually the most frequented page on the site and the starting place of navigation for many visitors. Unless your site has only a handful of pages, you should think about how visitors will go from a general page (your root page) to a page containing more specific content. Do you have enough pages around a specific topic area that it would make sense to create a page describing these related pages (for example, root page -> related topic listing -> specific topic)? Do you have hundreds of different products that need to be classified under multiple category and subcategory pages?
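As a sketch of that kind of hierarchy (the URLs are hypothetical), the structure might look like:

  example.com/                                   <- root page
  example.com/chairs/                            <- related topic listing
  example.com/chairs/pineapple-chairs/           <- subcategory
  example.com/chairs/pineapple-chairs/big-blue/  <- specific product

Each level should be reachable from the one above it through normal navigation links.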
QUOTE: “If an off-domain link is made by an anonymous or unauthenticated user, I’d use nofollow on that link. Once a user has done a certain number of posts/edits, or has been around for long enough to build up trust, then those nofollows could be removed and the links could be trusted. Anytime you have a user that you’d trust, there’s no need to use nofollow links.” Matt Cutts, Google 2006
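In markup, applying that advice to a link posted by an unauthenticated user takes one attribute (the URL is hypothetical):

  <!-- Link from a new or anonymous user: not yet trusted -->
  <a href="https://external-site.example/page" rel="nofollow">user-submitted link</a>

Once the account has earned enough trust, the rel="nofollow" can simply be dropped from that user’s links.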
There are a lot of definitions of SEO (spelled search engine optimisation in the UK, Australia and New Zealand, or search engine optimization in the United States and Canada), but organic SEO in 2020 is still mostly about getting free traffic from Google, the most popular search engine in the world (and almost the only game in town in the UK in 2020).
Redirecting is the act of sending a user to a different URL than the one initially requested. There are many good reasons to redirect from one URL to another, for example, when a website moves to a new address. However, some redirects are designed to deceive search engines and users. These are a very poor user experience, and users may feel tricked or confused. We will call these “sneaky redirects.” Sneaky redirects are deceptive and should be rated Lowest.
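For contrast, a legitimate permanent redirect is declared openly. On an Apache server, for instance (the paths are hypothetical), a moved page would be handled in .htaccess like this:

  # Page moved permanently to a new address
  Redirect 301 /old-page/ https://www.example.com/new-page/

Users and crawlers are sent to the same destination, which is what separates an honest redirect from a sneaky one.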
Comparing your Google Analytics data side by side with the dates of official algorithm updates is useful in diagnosing a site health issue or traffic drop. In one example, a new client thought a switch to HTTPS and server downtime had caused their drop, when it was actually the May 6, 2015, Google Quality Algorithm (originally called Phantom 2 in some circles) that caused the sudden fall in organic traffic - and the problem was probably compounded by unnatural linking practices. (This client did eventually receive a penalty for unnatural links when they ignored our advice to clean them up.)
Let’s look at an example. At Yoast, we call our courses platform the “Yoast Academy.” So at first, it might seem logical for us to optimize for the keyword “Yoast Academy.” However, when we analyze traffic data, it turns out that our audience searches for “Yoast courses” way more often. So it makes much more sense to optimize for that term instead. Every company has its internal vocabulary, which often doesn’t match the vocabulary of its audience. Therefore, you should always select your keywords from your audience’s perspective. You can use Google Trends to research how often certain search terms are used compared to other terms.
Online reviews have become one of the most important components in purchasing decisions by consumers in North America. According to a survey conducted by Dimensional Research which included over 1000 participants, 90% of respondents said that positive online reviews influenced their buying decisions and 94% will use a business with at least four stars. Interestingly, negative reviews typically came from online review sites whereas Facebook was the main source of positive reviews. Forrester Research predicts that by 2020, 42% of in-store sales will be from customers who are influenced by web product research.

Think about how Google can algorithmically and manually determine the commercial intent of your website. Think about the signals that differentiate a real small business website from a website created JUST to send visitors to another website with affiliate links on every page. Adverts above the fold, for instance, can be a clear indicator of a webmaster’s particular commercial intent - hence why Google has a Top Heavy Algorithm.
QUOTE: “But essentially the idea there is that this is a good representative of the content from your website, and that’s all that we would show to users. On the other hand, if someone is specifically looking for, let’s say, dental bridges in Dublin, then we’d be able to show the appropriate clinic that you have on your website that matches that a little bit better. We’d know dental bridges is something that you have a lot of on your website, and Dublin is something that’s unique to this specific page, so we’d be able to pull that out and show that to the user. So from a pure content duplication point of view, that’s not really something I totally worry about.”
You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine's search results. If you do want to prevent search engines from crawling your pages, Google Search Console has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest this Webmaster Help Center guide on using robots.txt files.
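Robots.txt files are per-host: each subdomain needs its own file at the root of that host, and directives in one have no effect on the other. For instance (hypothetical hosts):

  https://www.example.com/robots.txt
  https://blog.example.com/robots.txt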