Great guide Sharon! Thank you so much for sharing. I was wondering if off-page SEO is still worth it? Like using Personal Publishing Accounts or other social media where you can share your content. I’ve been trying for some months now to spread my content around, but I’m still waiting for better results. I’ve read it needs diversity, but I still haven’t figured that out yet.
If you have a commenting system (like Drupal, Joomla or WordPress) that allows for search engine friendly links (commonly called dofollow links) from your blog or site, you will probably, eventually, be the target of lots of spam, be implicated in tiered link schemes and potentially fall foul of Google’s webmaster guidelines on using the attribute in certain situations.
According to the U.S. Commerce Department, consumers spent $453.46 billion on the web for retail purchases in 2017, a 16.0% increase compared with $390.99 billion in 2016. That’s the highest growth rate since 2011, when online sales grew 17.5% over 2010. Forrester predicts that online sales will account for 17% of all US retail sales by 2022. And digital advertising is also growing strongly: according to Strategy Analytics, digital advertising was up 12% in 2017, accounting for approximately 38% of overall advertising spending, or $207.44 billion.
The SERP is the front-end to Google's multi-billion dollar consumer research machine. They know what searchers want. In this data-heavy talk, Rob will teach you how to uncover what Google already knows about what web searchers are looking for. Using this knowledge, you can deliver the right content to the right searchers at the right time, every time.
Keywords are important because they are the linchpin between what people are searching for and the content you are providing to fill that need. Your goal in ranking on search engines is to drive organic traffic to your site from the search engine result pages (SERPs), and the keywords you choose to target (meaning, among other things, the ones you choose to include in your content) will determine what kind of traffic you get. If you own a golf shop, for example, you might want to rank for "new clubs" — but if you're not careful, you might end up attracting traffic that's interested in finding a new place to dance after dark.

Users will occasionally come to a page that doesn't exist on your site, either by following a broken link or typing in the wrong URL. Having a custom 404 page that kindly guides users back to a working page on your site can greatly improve a user's experience. Your 404 page should probably have a link back to your root page and could also provide links to popular or related content on your site. You can use Google Search Console to find the sources of URLs causing "not found" errors.
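As a minimal sketch of how to wire this up, assuming an Apache server and a hypothetical 404.html file in the site root, a single .htaccess directive is enough:

```apache
# .htaccess — serve our custom page whenever a requested URL is not found
ErrorDocument 404 /404.html
```

The 404.html page itself can then carry the link back to your homepage and to popular content, as described above. (On nginx, the equivalent is an `error_page 404 /404.html;` directive in the server block.)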
Hey Sharon, great post! Re. dwell time – I’ve read conflicting opinions, some saying that Google DOES consider it an ‘important’ ranking signal, and others saying that it doesn’t, because dwell time can sometimes be a misleading indicator of content quality. For example, when a user searches for something specific and finds the answer immediately on the recommended page (meaning that the content on the page is actually spot on), they return to the SERPs very quickly. I have been unable to locate any definitive statements (written/spoken) from anyone at Google suggesting that dwell time IS still a factor in ranking considerations, but it makes sense (to me, anyway) that it should be. Do you have any ‘proof’ one way or the other re. whether Google definitely considers dwell time or not?

Redirecting is the act of sending a user to a different URL than the one initially requested. There are many good reasons to redirect from one URL to another, for example, when a website moves to a new address. However, some redirects are designed to deceive search engines and users. These are a very poor user experience, and users may feel tricked or confused. We will call these “sneaky redirects.” Sneaky redirects are deceptive and should be rated Lowest.
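For the legitimate case, such as a site moving to a new address, a permanent 301 redirect is the standard approach. A minimal sketch, assuming Apache with mod_alias and illustrative paths:

```apache
# .htaccess — permanently (301) redirect an old URL to its new home,
# so users and search engines both end up at the right page
Redirect 301 /old-page.html https://www.example.com/new-page.html
```

The difference from a “sneaky redirect” is transparency: the user lands on the content they asked for, just at its new location.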
When I think ‘Google-friendly’ these days, I think of a website Google will rank top, if popular and accessible enough, and that won’t drop like a f*&^ing stone for no apparent reason one day – even though I followed the Google SEO starter guide to the letter – just because Google has found something it doesn’t like, or has classified my site as undesirable one day.
Good news for web designers, content managers and search engine optimisers! Google clearly states, “If the website feels inadequately updated and inadequately maintained for its purpose, the Low rating is probably warranted,” although it does stipulate, again, that it’s horses for courses… if everybody else is crap, then you’ll still fly – though there aren’t many SERPs like that about these days.
Alt text (alternative text), also known as the "alt attribute", describes the appearance and function of an image on a page. Alt text uses:
1. Adding alternative text to photos is first and foremost a principle of web accessibility. Visually impaired users using screen readers will be read the alt attribute to better understand an on-page image.
2. Alt text will be displayed in place of an image if the image file cannot be loaded.
3. Alt text provides better image context/descriptions to search engine crawlers, helping them to index an image properly.
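A minimal example, with a made-up file name and description, covering all three uses at once:

```html
<!-- The alt attribute is read aloud by screen readers, shown as
     fallback text if the file fails to load, and gives crawlers
     context about what the image depicts. -->
<img src="golden-retriever.jpg"
     alt="Golden retriever puppy playing fetch in a park">
```

A good rule of thumb is to describe what the image actually shows, rather than stuffing the attribute with keywords.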
Keyword research tells you what topics people care about and, assuming you use the right SEO tool, how popular those topics actually are among your audience. The operative term here is topics: by researching keywords that are getting a high volume of searches per month, you can identify and sort your content into the topics you want to create content about. Then, you can use these topics to dictate which keywords you look for and target.
Think about the words that a user might search for to find a piece of your content. Users who know a lot about the topic might use different keywords in their search queries than someone who is new to the topic. For example, a long-time football fan might search for [fifa], an acronym for the Fédération Internationale de Football Association, while a new fan might use a more general query like [football playoffs]. Anticipating these differences in search behavior and accounting for them while writing your content (using a good mix of keyword phrases) could produce positive results. Google Ads provides a handy Keyword Planner that helps you discover new keyword variations and see the approximate search volume for each keyword. Also, Google Search Console provides you with the top search queries your site appears for and the ones that led the most users to your site in the Performance Report.
SEM is better for testing than SEO. Because you can immediately turn SEM paid ads off and on, it’s a great strategy for testing. You can quickly revise your ad copy, target new audiences, and change landing page content to test your new tactics. This flexibility allows you to see differences in your strategies immediately. You cannot accomplish this through SEO, as it would take too much time to make changes and monitor differences in results.
Being ‘relevant’ comes down to keywords & key phrases – in domain names, URLs, Title Elements, the number of times they are repeated in text on the page, text in image alt tags, rich markup and importantly in keyword links to the page in question. If you are relying on manipulating hidden elements on a page to do well in Google, you’ll probably trigger spam filters. If it is ‘hidden’ in on-page elements – beware relying on it too much to improve your rankings.
Smartphone - In this document, "mobile" or “mobile devices" refers to smartphones, such as devices running Android, iOS (iPhone), or Windows Phone. Mobile browsers are similar to desktop browsers in that they can render a broad set of the HTML5 specification, although their screen size is smaller and in almost all cases their default orientation is vertical.
Google will INDEX perhaps 1000s of characters in a title… but I don’t think anyone knows exactly how many characters or words Google will count AS a TITLE TAG when determining RELEVANCE OF A DOCUMENT for ranking purposes. It is a very hard thing to try to isolate accurately with all the testing and obfuscation Google uses to hide its ‘secret sauce’. I have had ranking success with longer titles – much longer titles. Google certainly reads ALL the words in your page title (unless you are spamming it silly, of course).
Yes, you need to build links to your site to acquire more PageRank, or Google ‘juice’ – or what we now call domain authority or trust. Google is a link-based search engine – it does not quite understand ‘good’ or ‘quality’ content – but it does understand ‘popular’ content. It can also usually identify poor, or THIN CONTENT – and it penalises your site for that – or – at least – it takes away the traffic you once had with an algorithm change. Google doesn’t like calling the actions it takes a ‘penalty’ – it doesn’t look good. They blame your ranking drops on their engineers getting better at identifying quality content or links, or the inverse – low-quality content and unnatural links. If they do take action against your site for paid links – they call this a ‘Manual Action’ and you will get notified about it in Google Search Console if you sign up.

I would say that it definitely depends on the niche that you’re in. If your niche is quite small, then it’s probably easier for you to rank for your mid-tail keywords and you should definitely give it a go! Otherwise, it might be better to start with your long-tail keywords and work your way up to mid-tail keywords and eventually your head keywords. So, I’d say: analyse your competition, see where you stand and establish your game plan! Good luck Naved :)!


QUOTE: “One of the difficulties of running a great website that focuses on UGC is keeping the overall quality upright. Without some level of policing and evaluating the content, most sites are overrun by spam and low-quality content. This is less of a technical issue than a general quality one, and in my opinion, not something that’s limited to Google’s algorithms. If you want to create a fantastic experience for everyone who visits, if you focus on content created by users, then you generally need to provide some guidance towards what you consider to be important (and sometimes, strict control when it comes to those who abuse your house rules). When I look at the great forums & online communities that I frequent, one thing they have in common is that they (be it the owners or the regulars) have high expectations, and are willing to take action & be vocal when new users don’t meet those expectations.” John Mueller, Google 2016
Many blogging software packages automatically nofollow user comments, but those that don't can most likely be manually edited to do this. This advice also goes for other areas of your site that may involve user-generated content, such as guest books, forums, shout-boards, referrer listings, etc. If you're willing to vouch for links added by third parties (for example, if a commenter is trusted on your site), then there's no need to use nofollow on links; however, linking to sites that Google considers spammy can affect the reputation of your own site. The Webmaster Help Center has more tips on avoiding comment spam, for example by using CAPTCHAs and turning on comment moderation.
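In markup, nofollowing a user-submitted link is just a rel attribute on the anchor tag (the URL here is purely illustrative):

```html
<!-- A user-generated link with nofollow applied, telling search
     engines not to pass link equity through it -->
<a href="https://example.com/user-submitted-page" rel="nofollow">commenter's site</a>
```

This is the attribute the blogging packages mentioned above typically add to comment links automatically.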
You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine's search results. If you do want to prevent search engines from crawling your pages, Google Search Console has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest this Webmaster Help Center guide on using robots.txt files.
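As a sketch of how such a file behaves, here is a minimal robots.txt (the /private/ path is a made-up example) checked with Python's standard-library urllib.robotparser:

```python
from urllib.robotparser import RobotFileParser

# A minimal robots.txt; the /private/ path is purely illustrative.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Pages under /private/ are blocked for all crawlers...
print(parser.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
# ...while everything else remains crawlable.
print(parser.can_fetch("Googlebot", "https://example.com/public.html"))        # True
```

Note that robots.txt controls crawling, not indexing – a disallowed URL can still appear in results if other sites link to it, which is why sensitive pages need stronger protection than robots.txt alone.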