SEO (search engine optimization) is the process of improving a website's ranking in search engine results.
Due to the high marketing value of targeted search results, there is potential for an adversarial relationship between search engines and SEO service providers.
Companies that employ overly aggressive techniques, like manipulating rankings by stuffing pages with excessive or irrelevant keywords, can get their client websites banned from the search results.
Major search engines provide information and guidelines to help with site optimization. Google has a Sitemaps program that helps webmasters learn if Google is having any problems indexing their website, and also provides data on traffic to the website. Yahoo! provides a way for webmasters to submit URLs, determine how many pages are in the Yahoo! index, and view link information. Bing allows webmasters to submit a sitemap and web feeds, as well as determine the crawl rate and the number of pages that have been indexed by the search engine.
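The sitemaps these programs accept follow the sitemaps.org XML format. As a rough illustration (the URLs and dates below are made up), a minimal sitemap file can be generated with nothing but the standard library:

```python
# Build a minimal XML sitemap in the sitemaps.org format.
# The URLs and lastmod dates are placeholders for illustration.
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    root = ET.Element("urlset", xmlns=ns)
    for loc, lastmod in urls:
        url = ET.SubElement(root, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(root, encoding="unicode")

sitemap = build_sitemap([
    ("http://www.example.com/", "2009-01-01"),
    ("http://www.example.com/about", "2009-02-15"),
])
print(sitemap)
```

The resulting file would then be submitted through the engine's webmaster tools rather than waiting for the crawler to discover each page on its own.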
Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines’ market shares vary from market to market, as does competition.
As of 2009, there are only a few large markets where Google is not the dominant search engine. In most cases, when Google is not leading in a given market, it is lagging behind a local player. Notable markets where this is the case are China, Japan, South Korea, Russia and the Czech Republic.
Successful SEO for international markets may require professional translation of web pages, registration of a domain name with a top-level domain in the target market, and web hosting that provides a local IP address. Otherwise, the fundamental elements of SEO are essentially the same.
SEO is not necessarily an appropriate strategy for every website; other Internet marketing strategies can be more effective, depending on the site operator's goals. A successful Internet marketing campaign may drive organic traffic to web pages through optimization techniques, but it may also involve paid advertising on search engines and other pages, building high-quality web pages to engage and persuade, addressing technical issues that may keep search engines from crawling and indexing those sites, setting up analytics programs so that site owners can measure their success, and improving a site's conversion rate.
SEO may generate a return on investment. However, search engines are not paid for organic search traffic, their algorithms may change, and there are no guarantees of continued referrals. Due to these uncertainties, a business that relies heavily on search engine traffic can suffer losses if the search engines stop sending visitors. It is considered wise business practice for website operators to reduce their dependence on search engine traffic.
Webmasters and content providers first began optimizing sites for search engines in the mid-1990s. At the time, all webmasters needed to do was submit the address of a page, or URL, to the various engines, which would then send a “spider” to “crawl” that page, downloading and storing it on the search engine’s own server. There, a second program known as an indexer would extract information about the page, including the words it contained, the location and weight of specific words, and all links to other pages. That information was then placed into a scheduler for crawling at a later date.
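The spider-then-indexer pipeline described above can be sketched in miniature. This toy indexer (standard library only, with HTML parsing greatly simplified and the sample page invented) records each word with its position and collects outgoing links, much as an early indexer would:

```python
# Toy indexer: given a page's HTML, record each word with its
# position and collect outgoing links -- a miniature version of
# the spider-then-indexer pipeline described above.
import re
from html.parser import HTMLParser

class PageIndexer(HTMLParser):
    def __init__(self):
        super().__init__()
        self.words = []   # (word, position) pairs
        self.links = []   # href targets of <a> tags

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        for word in re.findall(r"[a-z0-9]+", data.lower()):
            self.words.append((word, len(self.words)))

indexer = PageIndexer()
indexer.feed('<html><body><h1>Search engines</h1>'
             '<a href="/about">About this engine</a></body></html>')
print(indexer.words)
print(indexer.links)  # ['/about']
```

A real indexer would of course also weight words by markup (title, headings) and feed the discovered links back to the crawl scheduler.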
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag, which provides a guide to each page’s content. However, using meta data to index pages was found to be less than reliable because the webmaster’s choice of keywords could be an inaccurate representation of the site’s actual content. Incomplete data in meta tags could cause pages to rank for irrelevant searches. Web content providers could also manipulate a number of attributes within the HTML source of a page in order to rank well in search engines.
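The weakness is easy to see when the tag is read the way an early, metadata-trusting engine would have read it: the page's self-description is taken at face value, whatever the page actually contains. A small sketch (the page snippet is invented):

```python
# Read the keyword meta tag the way an early engine might have:
# accept the page's own description of itself at face value.
# The HTML snippet below is a made-up example.
from html.parser import HTMLParser

class MetaKeywordReader(HTMLParser):
    def __init__(self):
        super().__init__()
        self.keywords = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "keywords":
                content = attrs.get("content") or ""
                self.keywords = [k.strip() for k in content.split(",")
                                 if k.strip()]

reader = MetaKeywordReader()
reader.feed('<head><meta name="keywords" '
            'content="travel, cheap flights, hotels"></head>')
print(reader.keywords)  # ['travel', 'cheap flights', 'hotels']
```

Nothing here checks the body of the page against the declared keywords, which is precisely why the tag was so easy to abuse.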
By relying so much on factors such as keyword density, which were within a webmaster’s exclusive control, early search engines suffered from abuse and ranking manipulation. Since the popularity of a search engine is determined by its ability to produce the most relevant search results, search engines had to adapt by developing more complex ranking algorithms, taking into account additional factors that were more difficult to manipulate.
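Keyword density, one such easily gamed factor, is simply a term's share of a page's total words, so the page author controls both the numerator and the denominator. A quick sketch (the sample text is invented):

```python
# Keyword density: occurrences of a term divided by total word count.
# An easily manipulated signal, since the page author controls both.
import re

def keyword_density(text, keyword):
    words = re.findall(r"[a-z0-9]+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

text = "cheap flights and more cheap flights to cheap destinations"
print(round(keyword_density(text, "cheap"), 2))  # 0.33
```

Stuffing more copies of the keyword into the page raises the density directly, which is why later algorithms shifted weight to signals outside the author's control, such as links from other sites.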
By 2004, leading search engines like Google and Yahoo! had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation.
Real-time search was introduced in late 2009 in an attempt to make search results more timely and relevant. Historically, site administrators spent months or even years optimizing a website to increase its search rankings. With the growth of social media sites and blogs, the leading engines had to change their algorithms to allow fresh content to rank quickly within the search results.