QUOTE: “Sitemaps are an easy way for webmasters to inform search engines about pages on their sites that are available for crawling. In its simplest form, a Sitemap is an XML file that lists URLs for a site along with additional metadata about each URL (when it was last updated, how often it usually changes, and how important it is, relative to other URLs in the site) so that search engines can more intelligently crawl the site.” Sitemaps.org, 2020
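The quote above describes the XML form of a sitemap. A minimal example following the sitemaps.org protocol might look like this (the URL and values are illustrative, not from any real site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2020-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Only `<loc>` is required; `<lastmod>`, `<changefreq>`, and `<priority>` are the optional metadata fields the quote mentions.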
TASK – If running a blog, first clean it up. To avoid creating pages that might be considered thin content in 6 months, consider planning a wider content strategy. If you publish 30 ‘thinner’ pages about various aspects of a topic, you can then fold them together into a single topic-centred page that helps a user understand something related to what you sell.
For instance, in a recent test (2019), when a page title ran longer than 12 words, the keywords beyond the twelfth word no longer counted as part of the page copy. This is a change from the way Google used to work, when the extra words were treated as part of the page copy, not just part of the title. So, if you have a 15-word title, the last 3 words will not count towards ranking – if that test result can be replicated.
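The 12-word observation above can be checked mechanically. A minimal sketch, assuming the 12-word cutoff holds (it comes from one 2019 test, not any documented Google rule):

```python
def title_words_beyond_limit(title, limit=12):
    """Return the words in a page title that fall beyond `limit` words.

    The 12-word cutoff is an observation from one 2019 test, not a
    documented Google rule -- treat it as a rough heuristic only.
    """
    words = title.split()
    return words[limit:]

# A 15-word title: per the test above, the last 3 words may not count.
title = ("one two three four five six seven eight nine ten "
         "eleven twelve thirteen fourteen fifteen")
print(title_words_beyond_limit(title))  # ['thirteen', 'fourteen', 'fifteen']
```

Running this over your page titles flags any that risk wasting keywords past the cutoff.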

QUOTE – “So it’s not that our systems will look at your site and say oh this was submitted by a user therefore the site owner has like no no control over what’s happening here but rather we look at it and say well this is your website this is what you want to have indexed you kind of stand for the content that you’re providing there so if you’re providing like low quality user-generated content for indexing then we’ll think well this website is about low quality content and spelling errors don’t necessarily mean that it’s low quality but obviously like it can go in all kinds of weird directions with user-generated content…” John Mueller, Google 2019

Most small business owners and marketers know a little something about SEO (search engine optimization) and the different tactics that help a website rank well in organic search engine results. Another important tactic for any Internet business to know about is SEM (search engine marketing), which includes things such as search engine optimization, paid listings and other search engine related services.
All sites have a home or "root" page, which is usually the most frequented page on the site and the starting place of navigation for many visitors. Unless your site has only a handful of pages, you should think about how visitors will go from a general page (your root page) to a page containing more specific content. Do you have enough pages around a specific topic area that it would make sense to create a page describing these related pages (for example, root page -> related topic listing -> specific topic)? Do you have hundreds of different products that need to be classified under multiple category and subcategory pages?

QUOTE:  “Tell visitors clearly that the page they’re looking for can’t be found. Use language that is friendly and inviting. Make sure your 404 page uses the same look and feel (including navigation) as the rest of your site. Consider adding links to your most popular articles or posts, as well as a link to your site’s home page. Think about providing a way for users to report a broken link. No matter how beautiful and useful your custom 404 page, you probably don’t want it to appear in Google search results. In order to prevent 404 pages from being indexed by Google and other search engines, make sure that your webserver returns an actual 404 HTTP status code when a missing page is requested.” Google, 2018
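The key technical point in the quote above is that a custom 404 page must still be served with a real 404 HTTP status code, not a 200. A minimal sketch of that behaviour, using Python's standard library (the pages and markup here are hypothetical):

```python
from http import HTTPStatus

# Hypothetical site content keyed by path.
KNOWN_PAGES = {"/": "<h1>Home</h1>", "/about": "<h1>About</h1>"}

# Friendly custom 404 body, matching the quote's advice:
# clear message, link to the home page, way to report the problem.
CUSTOM_404 = (
    "<h1>Sorry, we can't find that page</h1>"
    '<p>Try our <a href="/">home page</a>, or let us know about '
    "the broken link.</p>"
)

def respond(path):
    """Return (status_code, body) for a requested path.

    Missing pages get the friendly custom body *and* a real 404
    status, so search engines do not index the error page."""
    if path in KNOWN_PAGES:
        return HTTPStatus.OK.value, KNOWN_PAGES[path]
    return HTTPStatus.NOT_FOUND.value, CUSTOM_404

print(respond("/missing")[0])  # 404
```

In a real web framework the same rule applies: attach the custom template to the framework's 404 handler rather than redirecting to a normal page that returns 200.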
Email Marketing: Email can be an effective way to maintain a connection between your business and its customers. You can purchase email addresses of customers and prospective customers, but the best results usually come from emails collected on your website. You can entice people to give you their emails through a quality free offer, such as a downloadable resource, called a lead magnet. Once you have their email, you can send a newsletter, special offers, and other information your target market would be interested in—as long as you follow laws and regulations around email marketing.

Having a Facebook page is almost necessary for digital marketing efforts in the current online marketplace. Having multiple social media accounts on the right networks is even better. Moreover, posting relevant content regularly and building an engaged audience of followers can help grow your business immensely. Finally, paid social media campaigns have become immensely effective, even rivaling the similar efforts made on search engines.


Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
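To make the risk concrete, here is an illustrative robots.txt (the path is hypothetical). Note that it only *asks* compliant crawlers to stay out, and in doing so it publicly advertises the very directory you want hidden:

```text
# Only a request to well-behaved crawlers -- NOT access control.
# Rogue bots can ignore this, and any curious user can read it
# at /robots.txt and go looking for /private/ directly.
User-agent: *
Disallow: /private/
```

Genuinely sensitive content should be protected server-side (authentication or password protection), not with robots.txt.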
QUOTE: “They follow the forms you gather data you do so and so and so forth but they don’t get any laws they don’t haven’t found out anything they haven’t got anywhere yet maybe someday they will but it’s not very well developed but what happens is an even more mundane level we get experts on everything that sound like this sort of scientific expert they they’re not scientist is a typewriter and they make up something.”  Richard Feynman, Physicist 1981

There are many ways to determine which efforts are producing results and which ones aren't. For example, you can study your website's analytics through your web host or by using Google Analytics. Most social media sites provide analytics as well, or you can use tools such as Hootsuite to get social media analytics. Your email service should also provide you with information on the open rates and engagement rates for your emails.
You’ll likely compile a lot of keywords. How do you know which to tackle first? It could be a good idea to prioritize high-volume keywords that your competitors are not currently ranking for. On the flip side, you could also see which keywords from your list your competitors are already ranking for and prioritize those. The former is great when you want to take advantage of your competitors’ missed opportunities, while the latter is an aggressive strategy that sets you up to compete for keywords your competitors are already performing well for.
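The "missed opportunities" approach above can be sketched as a simple sort. The data and field names here are illustrative, not from any particular keyword tool:

```python
# Hypothetical keyword data: monthly search volume and whether a
# competitor already ranks for the term.
keywords = [
    {"term": "blue widgets", "volume": 5400, "competitor_ranks": True},
    {"term": "widget repair", "volume": 2900, "competitor_ranks": False},
    {"term": "cheap widgets", "volume": 880,  "competitor_ranks": False},
]

def missed_opportunities(kws):
    """High-volume terms your competitors are NOT ranking for,
    highest volume first."""
    gaps = [k for k in kws if not k["competitor_ranks"]]
    return sorted(gaps, key=lambda k: k["volume"], reverse=True)

print([k["term"] for k in missed_opportunities(keywords)])
# ['widget repair', 'cheap widgets']
```

The aggressive strategy is the inverse filter: keep the terms where `competitor_ranks` is true and compete for those head-on.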
Onsite, consider linking to your other pages from within the main content text. I usually only do this when it is relevant – often, I’ll link to a related page when the keyword appears in the title elements of both pages. I don’t go in for auto-generated links at all. Google has penalised sites for using particular auto-link plugins, for instance, so I avoid them.
This broken-link checker makes it easy for a publisher or editor to make corrections before a page is live. Think about a site like Wikipedia, for example. The Wikipedia page for the term "marketing" contains a whopping 711 links. Not only was Check My Links able to detect this number in a matter of seconds, but it also found (and highlighted) seven broken links.
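The first step of any broken-link checker like the one above is extracting every link on a page. A minimal sketch of that step with Python's standard library (the sample HTML is made up):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags, as a broken-link
    checker would before testing each URL."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

html = ('<p><a href="/ok">fine</a> '
        '<a href="https://example.com/gone">maybe broken</a></p>')
print(extract_links(html))  # ['/ok', 'https://example.com/gone']
```

A full checker would then fetch each URL (e.g. with `urllib.request`) and flag anything returning a status of 400 or above, which is essentially what the browser extension does on the live page.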
Google recommends that all websites use https:// when possible. The hostname is where your website is hosted, commonly using the same domain name that you'd use for email. Google differentiates between the "www" and "non-www" version (for example, "www.example.com" or just "example.com"). When adding your website to Search Console, add both the http:// and https:// versions, as well as the "www" and "non-www" versions.
Online reviews have become one of the most important components in purchasing decisions by consumers in North America. According to a survey conducted by Dimensional Research which included over 1000 participants, 90% of respondents said that positive online reviews influenced their buying decisions and 94% will use a business with at least four stars. Interestingly, negative reviews typically came from online review sites whereas Facebook was the main source of positive reviews. Forrester Research predicts that by 2020, 42% of in-store sales will be from customers who are influenced by web product research.
Understand and accept why Google ranks your competition above you – they are either more relevant and more popular, more relevant and more reputable, offering a better user experience, or manipulating backlinks better than you. Understand that everyone at the top of Google falls into those categories and formulate your own strategy to compete – relying on Google to take action on your behalf is VERY probably not going to happen.
Tablet - We consider tablets as devices in their own class, so when we speak of mobile devices, we generally do not include tablets in the definition. Tablets tend to have larger screens, which means that, unless you offer tablet-optimized content, you can assume that users expect to see your site as it would look on a desktop browser rather than on a smartphone browser.
Off-page SEO builds a website’s reputation and authority by connecting it to other high-quality websites. Off-page SEO techniques include link building (acquiring high-quality backlinks from other websites) and managing local listings and directory profiles. When many websites link to a brand’s website, it shows search engines that the brand’s website is trustworthy, reliable, and reputable, which increases its search rankings.
Understanding the balance between terms that might be more difficult due to competition and those that are more realistic will help you maintain the same balance that a mix of long-tail and head terms allows. Remember, the goal is to end up with a list of keywords that provides some quick wins but also helps you make progress toward bigger, more challenging SEO goals.
If you are improving user experience by focusing primarily on the quality of your pages' main content (MC), and avoiding – even removing – old-school SEO techniques, those are certainly positive steps towards getting more traffic from Google in 2020 – and the type of content performance Google rewards is, in the end, largely about a satisfying user experience.

QUOTE: “So there’s three things that you really want to do well if you want to be the world’s best search engine you want to crawl the web comprehensively and deeply you want to index those pages and then you want to rank or serve those pages and return the most relevant ones first….. we basically take PageRank as the primary determinant and the more PageRank you have that is the more people who link to you and the more reputable those people are the more likely it is we’re going to discover your page…. we use page rank as well as over 200 other factors in our rankings to try to say okay maybe this document is really authoritative it has a lot of reputation because it has a lot of PageRank … and that’s kind of the secret sauce trying to figure out a way to combine those 200 different ranking signals in order to find the most relevant document.” Matt Cutts, Google
I like the competition analysis tools; they provide paid and organic data, which gives me an idea of how to catch up with and outrank the immediate competition for my clients. They also provide potential-traffic data, which helps show clients the possible gains of the campaign. And with the marketing plan, I know what needs to be improved in order to get results for my clients.
Use common sense – Google is a search engine – it is looking for pages to give searchers results, and 90% of its users are looking for information. Google itself WANTS the organic results full of information. Almost all websites link to relevant information content, so content-rich websites get a lot of links – especially quality links. Google ranks websites with a lot of links (especially quality links) at the top of its search results, so the obvious thing you need to do is ADD A LOT of INFORMATIVE CONTENT TO YOUR WEBSITE.
When Googlebot crawls a page, it should see the page the same way an average user does. For optimal rendering and indexing, always allow Googlebot access to the JavaScript, CSS, and image files used by your website. If your site's robots.txt file disallows crawling of these assets, it directly harms how well Google's algorithms render and index your content. This can result in suboptimal rankings.
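In robots.txt terms, that means blocking only what genuinely should not be crawled, and never the asset directories pages need to render. An illustrative example (the paths are hypothetical):

```text
User-agent: *
# Fine to block a non-public area:
Disallow: /admin/

# Do NOT add blanket rules like these -- they stop Googlebot from
# rendering your pages the way a user sees them:
#   Disallow: /css/
#   Disallow: /js/
#   Disallow: /images/
```

Search Console's URL Inspection tool can confirm whether Googlebot is able to fetch and render a page with its assets.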