This is “Emerging Trends”, section 6.6 from the book Online Marketing Essentials (v. 1.0).

6.6 Emerging Trends

Learning Objective

  1. Understand where the future of SEO (search engine optimization) is headed.

SEO (search engine optimization) is a constantly evolving activity. As the search engine algorithms become more sophisticated, they assess Web sites in more complex ways to arrive at suitable search results. There are significant changes on the horizon in how search engines will index and present their results. These changes are aligned with a goal of sorting through the exponentially increasing amounts of data available on the Web and giving users better and more targeted search results, which they will find useful.

There are four strong emerging trends for SEO:

  • Localization (the practice of creating a local version of a Web site for a different country or language)
  • Personalized search
  • Usage data
  • Real-time search

These four trends are making optimizing Web sites for search engines even more complex.

Localization and Personalization

The first two trends revolve around how the search engines are trying to accommodate the geographic and personal preferences of users so as to present them with the best contextual results. By localizing, the search engines are presenting information in the language and geographic context of the user.

In personalizing search, the search engines are trying to align with what they have determined would be more appropriate for that specific user. Personalized search targets users’ preferences on two bases: explicitly and implicitly.

Explicit personalized search is based on the search engines’ interpretation of data and information that users provide directly to them, such as location, age, language, and gender.

Implicit personalized search is based on data and information search engines gather by analyzing users’ behavior. Search engines will track the pages users frequently visit or how they use certain sites—such as Gmail or bookmarking sites. Based on this, the search engines will predict what a user is probably interested in and tailor the results it presents to that user accordingly.
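As a rough illustration of how implicit personalization might work, the sketch below re-ranks a set of results so that domains a user has visited before move up. All URLs and the scoring rule are hypothetical assumptions for the example; real engines combine far more signals than this.

```python
from collections import Counter
from urllib.parse import urlparse

def personalize(results, visit_history):
    """Re-rank results so sites the user visits often move up.

    results: list of URLs in the engine's original order.
    visit_history: list of URLs the user has visited before.
    A toy model only; real engines blend many more signals.
    """
    visits = Counter(urlparse(u).netloc for u in visit_history)
    # Stable sort: more-visited domains first, original order breaks ties.
    return sorted(results, key=lambda u: -visits[urlparse(u).netloc])

results = [
    "http://news.example.com/jaguar-cars",
    "http://wiki.example.org/jaguar-animal",
    "http://shop.example.net/jaguar-parts",
]
history = ["http://wiki.example.org/lions", "http://wiki.example.org/tigers"]
print(personalize(results, history)[0])  # the wiki result moves to the top
```

With an empty history the original order is preserved, which mirrors how personalization only nudges, rather than replaces, the baseline ranking.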

Surveys suggest that users are generally in favor of personalization (search results that vary according to what search engines think a user is actually interested in) and are willing to trade personal information for better-quality results from the search engines. Large search engines, like Google, even offer end users the opportunity to say which results they like, through user feedback mechanisms such as Google SearchWiki (launched in 2008). This kind of functionality allows users to tell search engines which results they like or don’t like and would like to see (or not) again.

To optimize a site properly, factors like personalization and localization need to be taken into account and the site needs to be honed to do the following:

  • Adapt to how the search engines will measure and index the sites
  • Adapt to how users will expect to be presented with contextualized information

Usage Data

Search engines want their results to be highly relevant to Web users so that those users keep returning to the search engine for future searches. And the best way to establish what is relevant to users is to look at how they actually use Web sites.

Usage data are the most effective way of judging the true relevancy and value of a Web site. For example, if users arrive on a Web site and go back immediately, chances are it wasn’t relevant to their query in the first place. However, if a user repeatedly visits a Web site and spends a long time on the site, chances are it is extremely relevant. When it comes to search engines, relevant valuable sites get promoted, and irrelevant sites get demoted.
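The two signals just described, bounces and time on site, can be sketched as a small calculation over visit records. The data and function names below are illustrative assumptions, not how any search engine actually computes these figures:

```python
def usage_signals(visits):
    """Compute two simple usage-data signals from visit records.

    visits: list of (pages_viewed, seconds_on_site) tuples, one per
    visit. Hypothetical data; real engines infer such signals
    indirectly, e.g., from toolbar or clickstream data.
    """
    bounces = sum(1 for pages, _ in visits if pages == 1)
    bounce_rate = bounces / len(visits)
    avg_time = sum(secs for _, secs in visits) / len(visits)
    return bounce_rate, avg_time

visits = [(1, 5), (4, 180), (1, 8), (6, 320)]
rate, avg = usage_signals(visits)
print(f"bounce rate {rate:.0%}, avg time {avg:.0f}s")  # bounce rate 50%, avg time 128s
```

A site where half of all visits are single-page bounces, as above, would look far less relevant than one where visitors routinely browse several pages.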

How Do Search Engines Access These Data?

Search engines use cookies to maintain a history of a user’s search activity. This will include keywords used and Web sites visited from the search engine. Search engines gather data on the click-through rate of results and on bounce rates.

Most search engines also provide other services, all of which can be used to gather data relevant to search. For Google, some examples of these services include the following:

  • Google AdWords
  • Google AdSense
  • Google Analytics
  • Google Web Site Optimizer
  • Google Checkout
  • Google Toolbar

As of 2010, this is a relatively new area of SEO. It no doubt plays a part in search engine rankings, and that contribution is set to grow. So what does this mean for SEO? A Web site must do the following:

  • Be valuable enough to attract both visitors and links naturally
  • Retain visitors and make sure they return to the Web site
  • Convert visitors

What Not to Do

Black-hat SEO (the use of unethical techniques to get higher search rankings) refers to practices that attempt to game the search engines. Should a search engine uncover a Web site that is using unethical practices to achieve search engine rankings, it is likely to remove that Web site from its index.


In 2006, Google found that the BMW Germany Web site was using a JavaScript URL (uniform resource locator) redirect to send search engine spiders and Web visitors to different pages with different content. It was removed from the Google index until the Webmaster had ensured that the Web site met Google’s guidelines.
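To make the practice concrete, here is a minimal sketch of user-agent cloaking, the general category of trick involved (the BMW case used a JavaScript redirect, but the effect is the same: spiders and visitors see different content). All content strings here are hypothetical, and the example is shown only as what not to do:

```python
def serve_page(user_agent):
    """What NOT to do: user-agent cloaking.

    Serving crawlers different content from what human visitors see
    is exactly the practice search engines penalize. Hypothetical
    sketch only.
    """
    if "Googlebot" in user_agent:
        # Keyword-stuffed page shown only to the search engine spider.
        return "used car used car used car cheap car deals deals deals"
    # Entirely different page shown to real visitors.
    return "<html>Flashy brochure site with little text</html>"

print(serve_page("Googlebot/2.1") != serve_page("Mozilla/5.0"))  # True: spider and visitor see different pages
```

Search engines detect this by occasionally crawling with a browser-like user agent and comparing the two responses, which is why cloaked sites tend to be caught and removed from the index.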

Google publishes guidelines for Webmasters, available through Google’s Webmaster Central. As well as outlining best-practice principles, Google has supplied the following list of precautions:

  • Avoid hidden text or hidden links.
  • Don’t use cloaking or sneaky redirects.
  • Don’t send automated queries to Google.
  • Don’t load pages with irrelevant keywords.
  • Don’t create multiple pages, subdomains, or domains with substantially duplicate content.
  • Don’t create pages with malicious behavior, such as phishing or installing viruses, trojans, or other malware.
  • Avoid “doorway” pages created just for search engines, or other “cookie cutter” approaches such as affiliate programs with little or no original content.
  • If your site participates in an affiliate program, make sure that your site adds value. Provide unique and relevant content that gives users a reason to visit your site first.

The bottom line: design Web sites for users first and foremost, and don’t try to trick the search engines.

Real-Time Search

Google offers a feature called real-time search, which went live in December 2009. It is designed to further improve the user experience of search and follows earlier features on Google search engine results pages (SERPs), which now regularly include images, news items, videos, and shopping listings. With real-time search, Google displays a dynamic function in its SERPs, where you can see the latest mentions or URLs published on the Web related to your search term.

This is ideal for social media and microblogging purposes, and Google has partnered with the likes of Twitter, MySpace, FriendFeed, Jaiku, and other online businesses to offer this initiative. It opens up a number of opportunities and increases the importance of a strong social media presence to augment your search engine marketing efforts.

Tools of the Trade

There are a number of tools available to assist with SEO. Some are made available by search engines, and some are developed by agencies and individuals who specialize in SEO. Most are available for free.

Google Webmaster Tools


Google provides guidelines to Webmasters and tools to help ensure your Web site is being indexed.

Quirk SearchStatus


Quirk SearchStatus is a Firefox extension that allows you to easily view key SEO information related to the page you are visiting. As well as linking to Alexa and Compete rankings and a Whois look up, Quirk SearchStatus will highlight keywords on a page and allow you to easily access link reports from each of the major search engines.

Tools from

This site provides a number of tools that assist any SEO. For example, Rank Checker is a Firefox extension that allows you to save a number of keywords and to perform regular searches on them, giving you the ranking of your chosen URL for each keyword in the selected search engines. The site also has tools to help with keyword discovery.

Tools from SEOMoz


SEOMoz provides a wealth of articles and forums, as well as excellent SEO tools and guides. Some are free; becoming a “PRO” member gives access to them all.

Keyword Discovery Tools

There are a number of tools available, some free and some that require a fee, to assist with keyword discovery. Examples include Trellian’s Keyword Discovery tool and Wordtracker.

Online Forums

Webmaster World is an online forum frequented by SEOs and Webmasters aiming to stay current with the latest trends and search engine updates.

Google’s Free Search Engine Optimization Guide


Google provides a free starter guide, useful for anyone new to SEO.

Google Insights for Search


This provides valuable information about search terms you may want to target for SEO purposes. It also provides regional interest (i.e., by geography) for search terms, which is increasingly important as search engines move toward localization and personalization in their search focus.

Pros and Cons

Optimizing a Web site for search engines should entail optimizing the Web site for users. Done properly, it should result in a better user experience, while ensuring that search engines index and rank the Web site well.

However, it can be tempting to focus on the technicalities of SEO while forgetting that both robots and humans need to read the same Web site. One should not be sacrificed for the other.

Search engines update their algorithms regularly. Each update is an attempt to improve search results but can result in loss of rankings for some Web sites, depending on the update. A contingency plan, such as a prepared PPC (pay-per-click) campaign, needs to be in place to cope with a sudden drop in rankings.

As with any eMarketing practice, SEO should not be the only focus of eMarketing efforts. It works best when part of a holistic eMarketing strategy.

Key Takeaways

  • Search engine optimization (SEO) is a constantly evolving activity. Search engine algorithms become more sophisticated and continue to evaluate Web sites in more complex ways.
  • There are four strong emerging trends for SEO:

    • Localization
    • Personalized search
    • Usage data
    • Real-time search
  • By localizing search, search engines are presenting information in the language and geographic context of the user.
  • In personalizing search, the search engines are trying to align with what they have determined would be more appropriate for that user.
  • To optimize a site properly, factors like personalization and localization need to be taken into account.
  • Usage data are the most effective way of judging the true relevancy and value of a Web site. Relevant sites get promoted, while irrelevant ones get demoted.
  • Search engines use cookies to maintain a history of a user’s search activity. These data include keywords used and Web sites visited from the search engine. Most search engines also provide other services, all of which can be used to gather data relevant to search.
  • If a Web site uses black-hat SEO practices, and is caught by a search engine, the search engine is likely to remove the offending site from its index. Google has a list of dos and don’ts associated with SEO for those who run Web sites.

Exercises


  1. Why do you think search engines adjust their algorithms so many times during the course of a year? What would happen if they didn’t make these adjustments?
  2. Outline a mock Web page about a luxury auto brand. Using the techniques you have read about in this chapter, write two to three paragraphs optimized for the Web. What other elements would you include on this mock Web page that are relevant to the brand? Be creative!