In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[22] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random web surfer.
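To make the random-surfer idea concrete, here is a minimal power-iteration sketch in Python. The four-page link graph is invented for illustration, and the damping factor of 0.85 is simply the commonly cited default, not a value taken from this article.

```python
# Minimal power-iteration sketch of the random-surfer model behind PageRank.
# The link graph and the damping factor (0.85) are illustrative assumptions.
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {}
        for p in pages:
            # Sum the rank passed along by every page that links to p.
            inbound = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            # With probability (1 - damping) the surfer jumps to a random page.
            new_rank[p] = (1 - damping) / len(pages) + damping * inbound
        rank = new_rank
    return rank

# Hypothetical link graph: page -> set of pages it links out to.
graph = {
    "a": {"b", "c"},
    "b": {"c"},
    "c": {"a"},
    "d": {"c"},
}
print(pagerank(graph))  # "c" ends up with the highest rank: most inbound strength
```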
Give people information, or even be more creative. Get people to share pictures of themselves wearing the t-shirt. Create a community of people who are interested in your product. Get a famous person to wear it and share that picture online. Do something different, do something unique. Show Google that you are different and better than the other search results.

What you want here is a primary keyword and hopefully a set of related secondary keywords that share the searcher's intent. So the intent behind all of these terms and phrases should be the same, so that the same content can serve it. When you do that, we now have a primary and a secondary set of keywords that we can target in our optimization efforts.
In December 2009, Google announced it would be using the web search history of all its users in order to populate search results.[33] On June 8, 2010, a new web indexing system called Google Caffeine was announced. Designed to let users find news results, forum posts, and other content much sooner after publishing, Google Caffeine changed the way Google updated its index so that new content surfaced more quickly. According to Carrie Grimes, the software engineer who announced Caffeine for Google, "Caffeine provides 50 percent fresher results for web searches than our last index..."[34] Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically, site administrators have spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.[35]
Keyword research tells you what topics people care about and, assuming you use the right SEO tool, how popular those topics actually are among your audience. The operative term here is topics -- by researching keywords that are getting a high volume of searches per month, you can identify and sort your content into topics that you want to create content on. Then, you can use these topics to dictate which keywords you look for and target.
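As a rough sketch of what that sorting can look like in practice, here is a small Python example; the keywords, topics, and monthly volumes are all invented, and real numbers would come from whatever SEO tool you use.

```python
from collections import defaultdict

# Hypothetical keyword data: (keyword, topic, estimated monthly searches).
keywords = [
    ("best running shoes", "running shoes", 12000),
    ("running shoes for flat feet", "running shoes", 3500),
    ("trail running tips", "trail running", 2900),
    ("how to start trail running", "trail running", 1800),
]

# Roll the keyword volumes up into topics.
topics = defaultdict(int)
for keyword, topic, volume in keywords:
    topics[topic] += volume

# Topics ordered by total demand, to decide what to create content on first.
for topic, total in sorted(topics.items(), key=lambda item: item[1], reverse=True):
    print(f"{topic}: {total} searches/month")
```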
For instance, before launching a new product or service, a business can create a simple landing page to gather feedback from the target audience. Or it can run a survey asking a bunch of targeted questions. Or it can even go a step further and create a minimum viable product to see how the target users are interacting with it. With a bit of creativity, PPC ads can help gather real-time feedback that can be used to improve the end product or idea.
Getting seen by people who are interested in your niche or brand helps you boost your brand’s visibility, which indirectly affects the amount of business you get. Even if a lot of people are not immediately clicking on your ads, they may search for your product in the future. Either way, the paid ads will help immensely in terms of getting in front of your target audience and filtering out anyone who is not a part of it.
Kelly Wilhelme currently manages all of Weidert Group's marketing efforts. Through her past experience as an inbound marketing consultant on our client service team and, prior to that, in financial services communication, she has a deep understanding of complex businesses and a desire to help them grow. Kelly has a passion for communication strategy, layout and design, as well as writing and content creation.
Web search is one of the most powerful tools we have today. Just like you, people from all walks of life use Google to find solutions, learn new things, and understand the world around them. One of those new things may be determining whether SEO or SEM is best for your business. Whether you’re an online business or a local business, chances are that people are actively looking for you.
All too often, people dramatically overthink the most basic keyword research concepts; keyword generation should start simply with answering the question of "What products or services do you sell?" If you sell dog food online, the root words dog and food alone would be very poor keywords because, on their own, neither dog nor food does a remotely good job of describing what you sell. Though this example makes it obvious, many times we have to fight through our urge to include those bigger, broader root keywords.
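To make the dog food example concrete, here is a small sketch that expands the real root phrase with intent modifiers into more descriptive keyphrases; the modifiers are invented for illustration.

```python
# The bare roots "dog" and "food" describe almost nothing on their own,
# so combine the full root phrase with intent modifiers (all hypothetical).
root = "dog food"
modifiers = ["buy", "organic", "grain free", "best", "online"]

keyphrases = [root] + [f"{modifier} {root}" for modifier in modifiers]
# e.g. "buy dog food", "organic dog food", "grain free dog food", ...
print(keyphrases)
```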
I mean look great from a visual, UI perspective and look great from a user experience perspective, letting someone go all the way through and accomplish their task in an easy, fulfilling way on every device, at every speed, and make it secure too. Security is critically important. HTTPS is not the only thing, but it is a big part of what Google cares about right now, and HTTPS was a big focus in 2016 and 2017. It will certainly continue to be a focus for Google in 2018.
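If you want to spot-check whether a site actually pushes visitors onto HTTPS, a quick sketch using the third-party requests library could look like this; the domain is a placeholder.

```python
import requests

# Placeholder domain; swap in the site you want to check.
url = "http://example.com/"

# Follow redirects and see where the plain-HTTP request ends up.
response = requests.get(url, timeout=10, allow_redirects=True)
final_url = response.url

if final_url.startswith("https://"):
    print(f"{url} redirects to HTTPS ({final_url})")
else:
    print(f"{url} is still served over plain HTTP")
```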
In some contexts, the term SEM is used exclusively to mean pay per click advertising,[2] particularly in the commercial advertising and marketing communities which have a vested interest in this narrow definition. Such usage excludes the wider search marketing community that is engaged in other forms of SEM such as search engine optimization and search retargeting.
Next, we're trying to serve the searcher's goal and solve their task, and we want to do that better than anyone else does it on page one, because if we don't, even if we've optimized a lot of these other things, over time Google will realize, you know what? Searchers are frustrated with your result compared to other results, and they're going to rank those other people higher. Huge credit to Dan Kern, @kernmedia on Twitter, for the great suggestion on this one.
The amplification and link building are always going to be difficult in a space where all related sites are potential competitors. Take for example a website focusing on news for a particular sport. Sure, you might get forum links, social media links, and other lower-quality 'no-follow' links, but no news outlets, sports clubs, or the like would be interested in linking to your content because they have their own sports journalists. Likewise, fan sites and others would be more inclined to link to well-established traditional media instead of your 20k-visitors-per-month sports site.
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites have copied content from one another and benefited in search engine rankings by engaging in this practice. However, Google implemented a new system which punishes sites whose content is not unique.[36] The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[37] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[38] by gauging the quality of the sites the links are coming from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of 'Conversational Search', where the system pays more attention to each word in the query in order to match pages to the meaning of the whole query rather than to a few individual words.[39] With regard to the changes made to search engine optimization for content publishers and writers, Hummingbird is intended to resolve issues by getting rid of irrelevant content and spam, allowing Google to produce high-quality results and rely on 'trusted' authors.
We’ve used other tools in the past, but SE Ranking offers more up-to-date data and information, which benefits our agency and clients. SE Ranking allows us to access historical data with just a few clicks without ever having to leave the interface. From daily ranking updates to current search volume trends, there are numerous aspects that are essential when formulating client strategies, and with SE Ranking’s continuously updated system we are able to use this data to help our clients succeed.

However, with a properly created PPC campaign, results can be analyzed and any conversion-related problems can be fixed in no time. It shouldn’t be surprising to see massive results from a PPC campaign that’s been running for only a few weeks. When and if you have the budget, getting quick results with PPC is entirely doable.
Another way search engine marketing is managed is by contextual advertising. Here marketers place ads on other sites or portals that carry information relevant to their products, so that the ads appear in front of users who are seeking information on those sites. A successful SEM plan captures the relationships amongst information searchers, businesses, and search engines. Search engines were not important to some industries in the past, but in recent years the use of search engines for accessing information has become vital to increasing business opportunities.[32] The use of SEM strategic tools for businesses such as tourism can attract potential consumers to view their products, but it can also pose various challenges.[33] These challenges include the competition that companies face within their industry and other sources of information that could draw the attention of online consumers.[32] In the face of these challenges, the main objective for businesses applying SEM is to improve and maintain their ranking as high as possible on SERPs so that they can gain visibility. Therefore, search engines are adjusting and developing their algorithms and the shifting criteria by which web pages are ranked, to combat search engine misuse and spamming and to supply the most relevant information to searchers.[32] This can enhance the relationship amongst information searchers, businesses, and search engines by building an understanding of the marketing strategies used to attract business.
Secondary (also called “tertiary” or “supporting”) keywords include all other keywords you are targeting and/or incorporating. In some contexts, secondary terms are those you are loosely optimizing for, but they’re just not considered a high priority. In other scenarios, secondary keywords act as the semantic or long-tail support to help you get the most out of your primary keyword targeting.

How do you figure out what keywords your competitors are ranking for, you ask? Aside from manually searching for keywords in an incognito browser and seeing what positions your competitors are in, SEMrush allows you to run a number of free reports that show you the top keywords for the domain you enter. This is a quick way to get a sense of the types of terms your competitors are ranking for.

By relying so much on factors such as keyword density which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density to a more holistic process for scoring semantic signals.[13] Since the success and popularity of a search engine is determined by its ability to produce the most relevant results to any given search, poor quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate. In 2005, an annual conference, AIRWeb (Adversarial Information Retrieval on the Web), was created to bring together practitioners and researchers concerned with search engine optimization and related topics.[14]
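For context on the kind of signal early engines leaned on, keyword density is simply the number of times a term appears divided by the total word count; a toy calculation (the page copy and keyword are invented) looks like this:

```python
import re

def keyword_density(text, keyword):
    # Crude tokenization: lowercase words only; real engines did far more.
    words = re.findall(r"[a-z']+", text.lower())
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words) if words else 0.0

# Invented page copy stuffed with a target term, as early spammers did.
page = "Cheap shoes! Buy cheap shoes online. Cheap shoes shipped fast."
print(f"density of 'shoes': {keyword_density(page, 'shoes'):.0%}")  # 30%
```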
When it comes to search engine marketing, there may be no larger misnomer, no more archaic term than the ubiquitous keyword. In my view, there should be an official migration to the more accurate term keyphrase, but for now I will be forced to use what I consider to be an inaccurate term. My frustration with this term is that it quite simply implies a single word, which is rarely the strategy that we employ when doing keyword research and selection in the service of PPC and SEO campaigns.
This was just brilliant. I love that it was timely and concise. It was a great reminder of what is really important. It would be awesome if you could do a whiteboard Friday about Sitemaps - Submitted vs Indexed - and steps to take when they have a big discrepancy. Also include the Google index status in the mix as well. Keep it up Rand - your articles offer so much value to this industry.
Consider your competition. Look at what your competitors are doing and how they are performing in their search marketing before you decide how you can best compete with them. Research what search terms they rank organically for. Consider if you can execute a plan to top their SERP placements. Also, look at what paid terms they are using to drive traffic to their own sites. As you perform this research, look for gaps that you can fill and areas where you will be unable to compete in both paid and organic search.
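One lightweight way to surface those gaps is a simple set comparison between the terms a competitor ranks for and the terms you target; both lists below are placeholders you would pull from your rank tracker or a competitor-research export.

```python
# Placeholder keyword lists; in practice these come from your rank tracker
# or a competitor-research export.
our_keywords = {"inbound marketing", "seo audit", "content strategy"}
competitor_keywords = {"seo audit", "ppc management", "landing page design"}

# Terms the competitor ranks for that we don't target yet: possible gaps to fill.
gaps = competitor_keywords - our_keywords
# Terms we both target: likely the hardest battles to win.
head_to_head = competitor_keywords & our_keywords

print("Gaps to consider:", sorted(gaps))
print("Direct competition:", sorted(head_to_head))
```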