Search Engine Optimization

Search Engine Optimization (SEO) improves the visibility of a website in search engines via “natural” or unpaid search results. In other words, it gives a website a presence on the web. Generally, the earlier a site appears in the search results list, the more visitors it will receive.
SEO is an Internet marketing strategy that takes into consideration how search engines work and what people search for. Optimizing a website may involve editing its content, HTML, and associated coding to both increase its relevance to specific keywords and remove barriers to the indexing activities of search engines. Another tactic is promoting a site to increase the number of back links, or inbound links. These tactics may be incorporated into a website’s development and design.

SEO may be used to target different kinds of searches, including local searches, image searches, video searches, and industry-specific vertical searches. It can be a stand-alone strategy, or one part of a broader marketing campaign.

SEO Methods – Getting Indexed

Leading search engines, such as Google, Yahoo! and Bing, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search engine indexed pages do not need to be submitted because they are found automatically. Some search engines, notably Yahoo!, operate a paid submission service that guarantees crawling for either a set fee or cost per click. Such programs usually guarantee inclusion in the database, but do not guarantee specific ranking within the search results. Two major directories, the Yahoo Directory and the Open Directory Project, require manual submission and human editorial review. Google offers Google Webmaster Tools with which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links.
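A minimal XML Sitemap of the kind submitted through Google Webmaster Tools might look like the following sketch (the example.com URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2012-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <!-- A deep page a crawler might not discover by following links -->
    <loc>https://www.example.com/archive/old-page.html</loc>
  </url>
</urlset>
```

Only the `<loc>` element is required per URL; the other elements are hints that crawlers may ignore.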

Search engine crawlers may look at a number of different factors when crawling a site, and may not index every page. The distance of pages from the root directory of a site may also be a factor in whether or not pages get crawled.

Video: “How Search Works” by Matt Cutts

Preventing Crawling

To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. In addition, a page can be explicitly excluded from a search engine’s database by using a meta tag specific to robots. When a search engine visits a site, the robots.txt file located in the root directory is the first file crawled. The robots.txt file is then parsed, and it instructs the robot as to which pages are not to be crawled. However, because a search engine crawler may keep a cached copy of this file, it may occasionally crawl pages that should not be crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts, and user-specific content such as search results from internal searches. In 2007, Google cautioned against the indexing of internal search results because those pages are considered search spam.
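As a sketch of how a well-behaved crawler applies these rules, Python’s standard `urllib.robotparser` can be fed the lines of a robots.txt file and queried per URL. The example.com paths below are hypothetical; the per-page meta-tag alternative mentioned above is `<meta name="robots" content="noindex">`.

```python
from urllib import robotparser

# Rules a site might serve at https://example.com/robots.txt
# (hypothetical paths, matching the examples in the text).
rules = [
    "User-agent: *",
    "Disallow: /cart/",   # login-specific shopping-cart pages
    "Disallow: /search",  # internal search results
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# A compliant crawler checks each URL before fetching it.
print(rp.can_fetch("*", "https://example.com/cart/checkout"))    # False
print(rp.can_fetch("*", "https://example.com/products/widget"))  # True
```

Note that robots.txt is advisory: as the text says, a crawler working from a cached copy (or a misbehaving crawler) may still fetch excluded pages.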

Increasing Prominence

There are a variety of methods to increase the prominence of a webpage within the search results. Cross-linking between pages of the same website to provide more links to the most important pages may improve its visibility. Updating content so as to keep search engines crawling back frequently can give additional weight to a site. Writing content that includes frequently searched keyword phrases, so as to be relevant to a wide range of search queries, will tend to increase traffic. Adding relevant keywords to a webpage’s metadata, including the title tag and meta description, also tends to increase traffic, as it improves the relevancy of a site’s search listings. URL normalization of web pages that are accessible via multiple URLs, using the rel="canonical" link element or 301 redirects, can help ensure that links to different versions of the URL all count towards the page’s link popularity score.
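For example, if the same page is reachable at several URLs, a rel="canonical" link element in the `<head>` of each variant tells search engines which version should receive the link credit (the example.com URLs are placeholders):

```html
<!-- Served on https://www.example.com/shoes?sort=price,
     https://www.example.com/shoes?page=1, and similar variants -->
<head>
  <link rel="canonical" href="https://www.example.com/shoes" />
</head>
```

A server-side 301 redirect to the canonical URL achieves the same consolidation when the duplicate URLs do not need to remain directly accessible.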

White Hat versus Black Hat

SEO techniques are sometimes classified into two broad categories: white hat techniques, which search engines recommend as part of good design, and black hat techniques, which search engines do not approve of and attempt to minimize the effect of. The latter practice is referred to as spamdexing. White hat techniques tend to produce results that last, whereas black hat practitioners can expect to have their sites banned once the search engines discover what they are doing.

An SEO technique or method is considered white hat if it conforms to the search engines’ guidelines and involves no deception. This is an important distinction to note, because the search engine guidelines are not a fixed series of rules or commandments. White hat SEO is also about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see.

White hat techniques are generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the spiders, rather than attempting to game the algorithm.

White hat SEO is in many ways similar to web development that promotes accessibility, although the two are not identical.

White hat SEO is simply effective marketing: making efforts to deliver quality content to an audience that has requested it. Traditional marketing achieves this through transparency and exposure, qualities that a search engine’s algorithm also takes into account.

Black hat SEO attempts to improve rankings in ways that search engines disapprove of, or that involve deception. For example, practitioners use hidden text, either colored to match the background, placed in an invisible div, or positioned off screen. Another technique, known as cloaking, serves a different page depending on whether the page is being requested by a human visitor or a search engine.
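For illustration only, the hidden-text tricks described above often look like the following hypothetical markup, which search engines actively detect and penalize:

```html
<!-- Text colored to match the background -->
<p style="color: #ffffff; background-color: #ffffff;">
  cheap shoes buy shoes discount shoes
</p>

<!-- Keyword block in an invisible div -->
<div style="display: none;">keyword keyword keyword</div>

<!-- Keyword block positioned off screen -->
<div style="position: absolute; left: -9999px;">keyword keyword keyword</div>
```

All three variants deliver content to the crawler that a human visitor never sees, which is exactly the deception the guidelines prohibit.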

Search engines may penalize sites using black hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied automatically by the search engines’ algorithms, or by a manual site review.

Useful SEO Resources:

- Places to get great SEO tools
- Google SEO Starter Guide
- Searching for do-follow blogs
- Free backlinks from 2,512 websites
- Automatically add your link to 1,906 sites
- WhoisX – check domain availability for .net, .com, .org, .info, .name, .us, .biz, .ca, .tv
- Internet speed check – check internet upload and download speed
- Top 100 do-follow forums to increase PageRank
- 200 places to submit RSS feeds
- Do-follow forums
- Sync Facebook to Twitter and Twitter to Facebook
- Do-follow social bookmarking (submission not tested)