QUOTE: “There’s probably always gonna be a little bit of room for keyword research because you’re kind of providing those words to users. And even if search engines are trying to understand more than just those words, showing specific words to users can make it a little bit easier for them to understand what your pages are about and can sometimes drive a little bit of that conversion process. So I don’t see these things going away completely but I’m sure search engines will get better over time to understand more than just the words on a page.” – John Mueller, Google, 2020
Google ranks websites (relevancy aside for a moment) by the number and quality of incoming links to a site from other websites (amongst hundreds of other metrics). Generally speaking, a link from one page to another is viewed in Google’s “eyes” as a vote for the page the link points to. The more votes a page gets, the more trusted it can become, and the higher Google will rank it – in theory. Rankings are HUGELY affected by how much Google ultimately trusts the DOMAIN the page is on. BACKLINKS (links from other websites) trump every other signal.
Comparing your Google Analytics data side by side with the dates of official algorithm updates is useful in diagnosing a site-health issue or traffic drop. In the above example, a new client thought a switch to HTTPS and server downtime had caused the drop, when it was actually the May 6, 2015, Google Quality Algorithm update (originally called Phantom 2 in some circles) that caused the sudden fall in organic traffic – and the problem was probably compounded by unnatural linking practices. (This client did eventually receive a penalty for unnatural links after ignoring our advice to clean them up.)
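To make that comparison concrete, here is a minimal Python sketch of the idea. The helper name, the drop threshold and the comparison window are all assumptions for illustration, not part of any Google or Analytics tool; the only real date used is the May 6, 2015 quality update mentioned above.

```python
from datetime import date, timedelta

# Known algorithm update dates to check against (only the May 6, 2015
# quality update from the example above is included here).
UPDATE_DATES = [date(2015, 5, 6)]

def drops_near_updates(daily_sessions, threshold=0.20, window_days=3):
    """Return update dates where organic sessions fell by more than
    `threshold` (relative) across a `window_days` span around the update.

    daily_sessions: dict mapping date -> organic session count,
    e.g. exported from Google Analytics.
    """
    flagged = []
    for update in UPDATE_DATES:
        before = daily_sessions.get(update - timedelta(days=window_days))
        after = daily_sessions.get(update + timedelta(days=window_days))
        if before and after and (before - after) / before > threshold:
            flagged.append(update)
    return flagged

# Example: a sharp drop straddling the May 6, 2015 update
sessions = {
    date(2015, 5, 3): 1000,
    date(2015, 5, 9): 550,
}
print(drops_near_updates(sessions))  # [datetime.date(2015, 5, 6)]
```

A real diagnosis would, of course, use a full daily export and a longer list of confirmed update dates, but the principle is the same: line the drop up against the calendar before blaming HTTPS migrations or server downtime.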
You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine's search results. If you want to prevent search engines from crawling your pages, Google Search Console has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information, we suggest the Webmaster Help Center guide on using robots.txt files.
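As a quick illustration, a minimal robots.txt might look like the sketch below – the `/private/` directory and the domains are hypothetical examples, not recommendations for any particular site:

```
# Served at https://www.example.com/robots.txt
User-agent: *
Disallow: /private/

# A subdomain such as https://blog.example.com is NOT covered by this
# file – it needs its own file at https://blog.example.com/robots.txt
```

Keep in mind that robots.txt controls crawling, not indexing: a blocked URL can still appear in search results if other sites link to it.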
Since heading tags typically make text contained in them larger than normal text on the page, this is a visual cue to users that this text is important and could help them understand something about the type of content underneath the heading text. Multiple heading sizes used in order create a hierarchical structure for your content, making it easier for users to navigate through your document.
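For example, a simple hierarchical outline (the page topic and section names here are made up) might use headings like this:

```html
<!-- One h1 for the page topic, h2s for sections, h3s for subsections -->
<h1>A Guide to Site Audits</h1>
<h2>Crawling and Indexing</h2>
<h2>On-Page Checks</h2>
<h3>Title Tags</h3>
<h3>Heading Structure</h3>
```

The visual size difference between each level mirrors the logical structure, which is what helps both users and search engines parse the page.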
Love how you just dive into the details for this Site Audit guide. Excellent stuff! Yours is much, much easier to understand than other guides online and I feel like I could integrate this into how I audit my websites and actually cut down the time I spend on my reports. I only need to do more research on how to remove “zombie pages”. If you could do a step-by-step guide to it, that would be awesome! Thanks!
For instance, in a recent test (2019), when a page title ran longer than 12 words, the keywords beyond the 12th word effectively evaporated from the page copy. This is a change from the way Google used to work, when the extra words were counted as part of the page copy, not just part of the title. So, if you have a 15-word title, the last three words will not count towards ranking – if that test result were to be replicated.
An SEO audit should not be rushed. It simply takes time to uncover the root causes of the issues affecting your online health. Depending on the size of your site, a proper audit can take anywhere from 2–6 weeks to complete. Due diligence is required when making major changes to any website, and an SEO specialist must conduct a thorough investigation to make accurate, impactful recommendations.