Search engines find and catalog web pages through spidering (also known as web-crawling) software. Spidering software "crawls" the web and grabs information from websites, which is then used to build the search engine's index. Unfortunately, not all spidering software works the same way, so what earns a page a high ranking on one search engine will not necessarily earn it a high ranking on another. Note that rather than waiting for a search engine to discover a newly created page, web designers can submit the page directly to search engines for cataloging.
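As a rough illustration of what spidering software does, here is a minimal crawl sketch in Python. It is illustrative only, not a description of any real search engine's crawler; the starting URL is a placeholder and the requests and BeautifulSoup libraries are assumed to be installed.

```python
# Minimal crawl sketch: fetch a page, record its text, queue its links.
# Real search engine spiders are vastly more complex than this.
from collections import deque
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

def crawl(start_url, max_pages=10):
    seen, queue, index = set(), deque([start_url]), {}
    while queue and len(index) < max_pages:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        try:
            response = requests.get(url, timeout=5)
        except requests.RequestException:
            continue
        soup = BeautifulSoup(response.text, "html.parser")
        # Store the page title and visible text in a toy "index".
        title = soup.title.string if soup.title else ""
        index[url] = (title, soup.get_text(" ", strip=True))
        # Queue every link found on the page for later crawling.
        for link in soup.find_all("a", href=True):
            queue.append(urljoin(url, link["href"]))
    return index

# Example: index = crawl("https://example.com")
```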
Make it as easy as possible for users to go from general content to the more specific content they want on your site. Add navigation pages when it makes sense and effectively work these into your internal link structure. Make sure all of the pages on your site are reachable through links, and that they don't require an internal "search" functionality to be found. Link to related pages, where appropriate, to allow users to discover similar content.
At SEO Analysis you will receive personalized service. Your search engine optimization consultant will review your company's website and provide you with a complete search engine optimization analysis, based on current search engine standards and on your site itself. The analysis will include the proper techniques for better search engine placement.
QUOTE: “If you have comments on your site, and you just let them run wild, you don’t moderate them, they’re filled with spammers or with people who are kind of just abusing each other for no good reason, then that’s something that might kind of pull down the overall quality of your website where users when they go to those pages might say, well, there’s some good content on top here, but this whole bottom part of the page, this is really trash. I don’t want to be involved with the website that actively encourages this kind of behavior or that actively promotes this kind of content. And that’s something where we might see that on a site level, as well.” John Mueller, Google 2016
Naturally, business owners want their websites to rank for lots of keywords in the organic listings. The challenge for webmasters and SEOs is that Google doesn’t want business owners to rank for lots of keywords using auto-generated content, especially when that produces A LOT of pages on a website by (for instance) swapping in a list of keyword variations page to page.
We all have to delete pages at some point, but when that old URL still gets visitors, they bump into a 404 Not Found error. Aaaargh! To avoid this, you can redirect them to a new page with relevant information. It’s important to do this systematically to keep your website healthy. The Redirect manager allows you to do just that: after you delete a post or page, the plugin will ask you what to do with the old URL. You can also go to the ‘Redirects’ menu to see and update all your redirected pages. And you can even set ‘REGEX redirects’ to indicate that all URLs containing a certain word or expression should redirect to the same page.
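To make the idea of a ‘REGEX redirect’ concrete, here is a minimal sketch in Python of how such a rule could be matched against an incoming URL. The rule, target page and status code are made-up examples; this is not how the plugin itself is implemented.

```python
import re

# Hypothetical rule: any URL containing the word "workshop" redirects
# to a single landing page with a permanent (301) redirect.
REDIRECT_RULES = [
    (re.compile(r"workshop"), "/events/", 301),
]

def resolve_redirect(path):
    """Return (target, status) if a rule matches the old URL, else None."""
    for pattern, target, status in REDIRECT_RULES:
        if pattern.search(path):
            return target, status
    return None

print(resolve_redirect("/2016/05/seo-workshop-london/"))  # ('/events/', 301)
print(resolve_redirect("/contact/"))                      # None
```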
Companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients.[15] Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban.[16] Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.[17]
What would be the purpose of/reason for moving back to a different URL? If it’s been a few years, I’d leave it alone unless you’ve watched everything decline since moving to the main URL. Moving the forum to a new URL now would probably be a bit chaotic, not only for your main URL but for the forum itself…. The only reason I could imagine moving the forum in this scenario would be if all those links were really awful and unrelated to the URL it currently sits on…
Submit website to directories (limited use). Professional search marketers don’t submit the URL to the major search engines, but it’s possible to do so. A better and faster way is to get links back to your site naturally. Links get your site indexed by the search engines. However, you should submit your URL to directories such as Yahoo! (paid), Business.com (paid) and DMOZ (free). Some may choose to include AdSense (google.com/adsense) scripts on a new site to get the Google Media bot to visit. It will likely get your pages indexed quickly.
Google is looking for a “website that is well cared for and maintained” so you need to keep content management systems updated, check for broken image links and HTML links. If you create a frustrating user experience through sloppy website maintenance – expect that to be reflected in some way with a lower quality rating. Google Panda October 2014 went for e-commerce pages that were optimised ‘the old way’ and are now classed as ‘thin content’.

WebSite Auditor scans pages for code errors, duplicate content and other structure-related issues they may have. Beyond that, its on-page optimization module helps determine ideal keyword placement and identifies page elements that can be optimized. In WebSite Auditor you can also analyze competitors’ pages to compare them against, and improve, your own on-page strategy. There are actually more features, I just won’t be listing all of them here. But this is the best solution for on-page optimization I have found so far.
Google asks quality raters to investigate your reputation by searching for it, giving the example [“ibm.com” reviews –site:ibm.com]: “A search on Google for reviews of ‘ibm.com’ which excludes pages on ibm.com.” So do that search yourself and judge for yourself what your reputation is. Very low ratings on independent websites could play a factor in where you rank in the future, with Google stating clearly that it considers “very low ratings on the BBB site to be evidence for a negative reputation”. Other sites mentioned for reviewing your business include YELP and Amazon. Often – using rich snippets containing schema.org information – you can get Google to display user ratings in the actual SERPs. I noted you could get ‘stars in SERPs’ within two days of adding the code (March 2014).
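To show the kind of schema.org markup that can produce review stars in the SERPs, here is a minimal sketch that builds JSON-LD for a product with an aggregate rating. The product name and rating values are made-up placeholders, and Google decides on its own whether to actually display the stars.

```python
import json

# Hypothetical product and rating values -- placeholders for illustration.
rating_markup = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "128",
    },
}

# Emit the body of a <script type="application/ld+json"> block to paste into the page.
print(json.dumps(rating_markup, indent=2))
```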
Every page on your website should have a title, a subtitle and so on. When search engines scan your website, they’ll understand your content better if you explain the text hierarchy to them. The most relevant part is the title of your page, and you should define it as an H1 (in the Text Editor). The H1 should describe the page’s content, and you shouldn’t have more than one H1 per page. Choose it carefully and don’t forget to include your keywords. After your H1 come H2, H3 and so on. The clearer your text structure is, the more easily search engines will digest your site’s content.
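A quick way to sanity-check that heading structure is to parse a page and list its headings in order. This is a small sketch assuming the requests and BeautifulSoup libraries and a placeholder URL, not part of any particular SEO tool.

```python
import requests
from bs4 import BeautifulSoup

def audit_headings(url):
    """Print each heading in document order and warn if the page has more than one H1."""
    soup = BeautifulSoup(requests.get(url, timeout=5).text, "html.parser")
    headings = soup.find_all(["h1", "h2", "h3", "h4", "h5", "h6"])
    for tag in headings:
        print(tag.name.upper(), "-", tag.get_text(strip=True))
    h1_count = sum(1 for tag in headings if tag.name == "h1")
    if h1_count != 1:
        print(f"Warning: found {h1_count} H1 tags; aim for exactly one per page.")

# Example: audit_headings("https://example.com/blog/post")
```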

Unfortunately, Google has stopped delivering a lot of the information about what people are searching for to analytics providers. Google does make some of this data available in its free Webmaster Tools interface (if you haven’t set up an account, do so; it’s a very valuable SEO tool, both for unearthing search query data and for diagnosing various technical SEO issues).


While Google never sells better ranking in our search results, several other search engines combine pay-per-click or pay-for-inclusion results with their regular web search results. Some SEOs will promise to rank you highly in search engines, but place you in the advertising section rather than in the search results. A few SEOs will even change their bid prices in real time to create the illusion that they "control" other search engines and can place themselves in the slot of their choice. This scam doesn't work with Google because our advertising is clearly labeled and separated from our search results, but be sure to ask any SEO you're considering which fees go toward permanent inclusion and which apply toward temporary advertising.
Since heading tags typically make text contained in them larger than normal text on the page, this is a visual cue to users that this text is important and could help them understand something about the type of content underneath the heading text. Multiple heading sizes used in order create a hierarchical structure for your content, making it easier for users to navigate through your document.
QUOTE: “(Google Panda) measures the quality of a site pretty much by looking at the vast majority of the pages at least. But essentially allows us to take quality of the whole site into account when ranking pages from that particular site and adjust the ranking accordingly for the pages. So essentially, if you want a blunt answer, it will not devalue, it will actually demote. Basically, we figured that site is trying to game our systems, and unfortunately, successfully. So we will adjust the rank. We will push the site back just to make sure that it’s not working anymore.”  Gary Illyes, Google 2016
QUOTE: “Fixing the problem depends on the issue you have. For example, if it’s a pop-up, you’ll need to remove all the pop-up ads from your site. But if the issue is high ad density on a page, you’ll need to reduce the number of ads. Once you fix the issues, you can submit your site for a re-review. We’ll look at a new sample of pages and may find ad experiences that were missed previously. We’ll email you when the results are in.” Google, 2017

Comparing your Google Analytics data side by side with the dates of official algorithm updates is useful in diagnosing a site health issue or traffic drop. In the above example, a new client thought it was a switch to HTTPS and server downtime that caused the drop when it was actually the May 6, 2015, Google Quality Algorithm (originally called Phantom 2 in some circles) that caused the sudden drop in organic traffic – and the problem was probably compounded by unnatural linking practices. (This client did eventually receive a penalty for unnatural links when they ignored our advice to clean up).
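Here is a minimal sketch of that side-by-side comparison, assuming you have exported daily organic sessions from Google Analytics to a CSV. The file name, column names and the update list are placeholders; swap in whichever confirmed update dates you want to test against.

```python
import pandas as pd

# Placeholder list of confirmed algorithm update dates to check against.
ALGORITHM_UPDATES = {
    "2015-05-06": "Quality update ('Phantom 2')",
}

# Assumes a CSV export of daily organic sessions with 'date' and 'sessions' columns.
traffic = pd.read_csv("organic_sessions.csv", parse_dates=["date"]).set_index("date").sort_index()

# Compare average traffic in the two weeks before and after each update date.
for date, name in ALGORITHM_UPDATES.items():
    day = pd.Timestamp(date)
    before = traffic.loc[day - pd.Timedelta(days=14): day - pd.Timedelta(days=1), "sessions"].mean()
    after = traffic.loc[day + pd.Timedelta(days=1): day + pd.Timedelta(days=14), "sessions"].mean()
    print(f"{name}: {before:.0f} avg sessions/day before vs {after:.0f} after")
```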


If a PARTICULAR CANONICAL HEAD KEYWORD is IMPORTANT (even perhaps a SYNONYM or LONG TAIL VARIANT) and I think a particular 301 REDIRECT has some positive impact on how Google judges the quality or relevance of the page, I will make sure the CANONICAL HEAD KEYWORD and SYNONYMS are on the FINAL PAGE I redirect Google to (which is the one that will be rated and cached).
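As a rough way to verify that practice, a sketch like the following follows the redirect chain and reports whether the chosen head keyword actually appears on the final page. The URL and keyword are placeholders; this is just a convenience check, not anything Google provides.

```python
import requests

def check_redirect_target(old_url, keyword):
    """Follow redirects from old_url and report whether keyword appears on the final page."""
    response = requests.get(old_url, allow_redirects=True, timeout=5)
    hops = [r.url for r in response.history] + [response.url]
    print(" -> ".join(hops))  # the full redirect chain, ending at the final page
    present = keyword.lower() in response.text.lower()
    print(f"'{keyword}' found on final page: {present}")

# Example: check_redirect_target("https://example.com/old-page/", "blue widgets")
```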
QUOTE: “You heard right there is no duplicate content penalty… I would avoid talking about duplicate content penalties for spam purposes because then it’s not about duplicate content then it’s about generating in a very often in an automated way content that is not so much duplicated as kind of fast and scraped from multiple places and then potentially probably monetized in some way or another and it just doesn’t serve any purpose other than to gain traffic redirect traffic maybe make some money for the person who created it this is not about content that gets really sort of created for any kind of reason other than just to be there so I person I don’t think about it as duplicate content there is just this disparity it’s the same thing as saying that maybe we created gibberish is some sort of a duplicate content of words but otherwise also exists like I I would really separate those two issues because the things that Aamon led in with URLs generated on WordPress and the things they were going to discuss today later maybe about e-commerce sites and so on I wouldn’t want people to confuse those two.”  Andrey Lipattsev, Google 2016
Before your website goes live, you need to select a URL. Also known as your domain name, it’s the address that visitors will type in to find your site. Like the giant sign above a storefront window, it’s one of the first things visitors see when they come to your site. That’s why it’s also the first place Google looks to understand what your site is about and decide how to rank it. It’s also important to make sure your URLs are clean and beautiful. This means no special characters, no hashbangs, no page ID. You get the point. 
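As an illustration of what a ‘clean’ URL looks like in practice, here is a small sketch that turns a page title into a slug with no special characters. The rules are a common convention rather than a Google requirement, and the example title is made up.

```python
import re
import unicodedata

def slugify(title):
    """Lowercase, strip accents, and replace anything that isn't a letter or digit with hyphens."""
    normalized = unicodedata.normalize("NFKD", title).encode("ascii", "ignore").decode("ascii")
    return re.sub(r"[^a-z0-9]+", "-", normalized.lower()).strip("-")

print(slugify("10 Quick SEO Tips (2016 Edition)!"))  # 10-quick-seo-tips-2016-edition
```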
After the audit has been completed, your team will be invited to a presentation in which your SEO specialist will talk through the findings and recommendations. The Three Deep SEO team will walk you and your team through the roadmap to completion so you know what to expect and when. In addition, you will receive a comprehensive analysis of your site’s health. All of these are customized to you and your specific situation.