
Search Engine Optimization

Search engine optimization (SEO) is a website promotion strategy, the goal of which is to get your site to appear at the top of search engine results under keywords or phrases appropriate to your business. For example, a site that sells CDs online will want to be found in major search engines such as Google, Yahoo, MSN, AltaVista and AOL under key phrases such as "online CDs," "buy CDs," or "CDs for sale."

There are thousands of factors that influence rankings. SEO practitioners identify those factors and implement strategies to improve their rankings accordingly.

Ultimate Link Building Guide To SEO

One of the most important ingredients in a good SEO campaign is link building. It can also be one of the toughest. Search engines can be a little tough on unethical link building strategies, and if you add the growing use of ‘nofollow’ attributes to links, gaining quality links takes a lot of work. This list itemizes most of the link building opportunities available.

10: Link Farm: Link farms are outdated and generally result in search engine penalties. No longer worth thinking about.

09: Paid Reviews: There is an old saying in the IT world - GIGO - garbage in, garbage out. Generally speaking, paid reviews are poorly written articles on non-relevant sites. If you can get a good review on a highly ranked, relevant site then you may get value for your money - otherwise treat with caution.

08: Text Link Ads: There can be value in obtaining text link ads but these are generally from low ranking sites. As with the paid reviews, obtaining text link ads on high ranking related sites may be of value.

07: Link Solicitation: Obtaining links from related sites by solicitation can be difficult. You need to be able to prove to the site owner that there is a mutual benefit. The higher the site ranks, the harder it can become. However, this can provide a good cross section of related sites.

06: Hand Edited Directories: DMOZ and Yahoo! are two of the major directories. DMOZ can take up to 12 months to get listed. Yahoo!, whilst a paid listing, remains reasonably good value. Directory links seem to have been reduced in value recently.

05: Article Directories: Article directories are hit and miss. You can spend hours putting together a series of articles only to find that most of them get scraped without any links back to your site. Even when an article is used legitimately, you have no control over the site's relevance or reliability.

04: Social Bookmarking: Having your site listed and voted on in some of the social bookmarking sites can provide some benefits to your link building campaign. Many social bookmarking sites incorporate ‘nofollow’, so don’t rely on them for masses of links. The traffic generated may, however, lead to an increase in organic links.

03: Press Releases: Press releases can provide a quick lift in rankings. If the press release is well written and provides a little spark then you may find an increase in traffic along with an increase in organic links.

02: Edu and Gov Sites: Obtaining links from these sites can be hard. The more important the link location, the higher the value. For example, a link in an .edu report or research paper is highly prized.

01: Blended Link Building: Blended link building incorporates a variety of strategies. The aim is to obtain links from a variety of sources, each with varying ranking factors but all from relevant web pages where possible.

Blended link building can be slow and tedious, but it does produce great long-term website rankings in the major search engines.

Search Engine Optimization Tools

Search engine optimization can be very frightening for a newcomer. There are many tools and resources to help you along with your search engine optimization adventure, and it’s up to you to pick the right tools to make that adventure as smooth as possible. SEOmoz is a tool that has become an industry standard and is highly recommended for audiences of all experience levels. SEOmoz will help you with your website rankings in all major search engines.

SEOmoz will help determine how powerful each page of your site actually is, along with its overall value. The site will provide you with the knowledge you need to help your site get noticed. Search engine optimization has become such a crucial component of website success that it is truly important not to ignore this area of business for your growing organization. SEOmoz can help you learn the insider tips and tricks that most professionals apply in everyday use. No other tool packs this much into one website. SEOmoz also has a series of guides to help you with your social media marketing and professional link building. Both of these areas are crucial to the survival of your online business. The days of building a website, slapping it up on the web and taking a step back are over. Competition is fierce, and if you think you don’t have any, think again, because you do. Stay ahead of the game and use your resources.

For more information, please visit SEOmoz today, sign up, and start taking advantage of what this organization has to offer.

Create Your Own 404 Not Found Page

Does your 404 Not Found page help your visitors or irritate them? You will undoubtedly come across unhappy visitors to your website from time to time. It’s a part of business. And 404 Not Found pages are supposed to help visitors who couldn’t find the web page they were hoping to find. You don’t want your 404 Not Found page to be just a blank page or some useless error message that sends your visitors elsewhere, cussing and sputtering like a rusty wheel.

If your website sits on an Apache web server then you can simply update your .htaccess file and customize your 404 Not Found page. Open your .htaccess file and add this line:


ErrorDocument 404 /notfound.html

This tells Apache to serve your custom 404 Not Found page. If you don’t already have a notfound.html file then you’ll need to create one as well and put it in the root directory of your web site. If you don’t have an .htaccess file, you’ll need to create one of those too; a blank Notepad file is all you need to do so.
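Once the rule is in place, it is worth confirming that the server actually returns your custom page with a genuine 404 status code rather than a 200, since a 200 can lead search engines to index the error page. Here is a minimal check in Python; example.com is a placeholder for your own domain:

import urllib.request
import urllib.error

# Request a page that should not exist on your site.
try:
    urllib.request.urlopen("http://example.com/no-such-page")
except urllib.error.HTTPError as e:
    print(e.code)          # should print 404
    print(e.read()[:100])  # the first bytes of your notfound.html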

If all you have is a blank page that says 404 Not Found then your visitors will hit the back button and never return. To prevent that from happening, you need to help them find the web page they were looking for. Here are some things you can put into your notfound.html file to help your visitors.

1. Make your 404 Not Found page look like any other web page on your website by enclosing it in your website template and keeping your menu bar in its usual location

2. A sitemap link so that visitors can see all the web pages on your website

3. If you’ve created a customized search engine or you have a search function for your website then put a search box on your 404 Not Found page

4. A list of frequently mistyped URLs and their correct locations

Anything that will help your visitors find what they are looking for should go onto your 404 Not Found page. Otherwise, your visitors will leave and not come back.

Why Error Pages Should be Customized

Customizing your error pages is a great way to ensure that your customers know that your website is still around and is merely offline for one reason or another. The fact is that the Internet is a here-today-gone-tomorrow world, and when a site is offline, people more often than not expect the worst.

Customize your error pages…they should be short and informative. You don’t need much in your error page other than your company name and a vague reason why your site is offline. The best error pages are a little ambiguous as to the reason for the site being offline, thus allowing them to be used in as many situations as possible.

If you are unsure of how to customize your error pages, or would like professional advice as to what to put in your customized error page, please contact us at Brick Marketing. We are happy to provide our professional advice and can give you a range of options for a good error page.

Is A 404 Error Page Necessary?

We’ve been conditioned to include 404 error pages on all our websites. But is it necessary? I’ve got a friend who says no.

This friend of mine has been a friend for a while now. He’s also a search engine optimization expert who’s been doing business online, I think, since before the Arpanet. He probably knows which end is up. At any rate, this friend - I’ll call him Al Gore (but it’s not really Al Gore, you see) - doesn’t use 404 error pages. He redirects his visitors to his blog instead. What that means is any time a visitor types in a URL or lands on a page that would ordinarily produce a 404 error page, they automatically land on his blog (in Apache, a full-URL rule such as ErrorDocument 404 http://yourdomain.com/blog/ in .htaccess produces exactly this kind of redirect). I thought that was pretty creative. The advantages to doing it this way are:

* Fewer disgruntled visitors leaving your site
* Less confusion about “Which link should I click?”
* You don’t have to build a 404 error page with links to your important pages so that your visitors know what to do
* You might get more regular readers of your blog
* Site visitors will have something interesting to read even if they don’t find what they’re looking for

404 error pages aren’t bad, but in this day of endless blogs, people would rather read your 300-word blog post than your 300-word sorry-you-didn’t-find-what-you-were-looking-for message. Just a little tip from Al Gore.

What is Search Engine Marketing

The practice of marketing, advertising and optimizing a website in search engines is referred to as Search Engine Marketing (SEM).

SEM methods include:

Search Engine Optimization (or SEO)
Paid Placement
Paid Inclusion

Search Engine Optimization

It is the process of improving the quality and volume of traffic to a web site from search engines' organic search results.

Search engines don’t see pages the same way we do: there’s no color or Flash that makes them go “this is a professional page”. You have to let them know what the page is about, and SEO makes sure the search engine sees your page the same way you do.

Paid Placement (PPC)

It is an advertising model used on search engines, advertising networks, and content websites, where advertisers only pay when a user actually clicks on an ad to visit the advertiser's website. Advertisers bid on keywords they believe their target market would type in the search bar when they are looking for a product or service.


Paid Inclusion

Paid inclusion is a search engine marketing product where the search engine company charges fees related to inclusion of websites in their search index.

Introducing SEO Ranking Factors

What will make me top? Or more precisely “What are the factors that determine my position in the natural listings for a specific keyphrase and what is their relative importance?”

Wow! Those are the questions that everyone involved in SEO, from clients to agencies, wants and needs to know. So let’s work through some answers.

Unfortunately, the number of people who can definitively answer these questions by concisely explaining the hundreds of factors, and the way they work with one another, is strictly limited to the engineers who work for the search engines.

However, through combining the experience of the authors, the review team and disclosure from the search engines and expert commentators, we have compiled a comprehensive list of the most important factors which determine position in the listings.

What determines ranking position in the natural listings?
The position or ranking in the natural listings for a particular keyphrase is dependent on a search engine’s ranking algorithm.

For the search query entered into the search engine, the algorithm uses rules or heuristics to identify the most relevant pages, based on the page’s text content and its context (which can be indicated by links from other pages and sites).

Each search engine has a different set of algorithms created by engineers who strive to produce the best relevance for its users.

However, the ranking of natural listings has evolved as a science over the past ten years, based on an even longer history of document indexing and retrieval. As with all sciences, there are fundamental principles which apply. So, to deliver relevance, search engines tend to use common search engine ranking factors.
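To make "content plus context" concrete, here is a deliberately naive scoring sketch in Python. Real engines combine hundreds of factors with closely guarded weightings; this toy only illustrates how on-page keyword frequency and link-based context might be blended:

# Toy illustration only: score = keyword frequency on the page,
# boosted by the number of inbound links (the page's "context").
def toy_rank(pages, query):
    def score(text, inbound_links):
        text_score = text.lower().split().count(query.lower())
        return text_score * (1 + inbound_links)
    return sorted(pages, key=lambda p: score(p[1], p[2]), reverse=True)

pages = [("a.com", "buy CDs online cheap CDs", 3), ("b.com", "CDs", 0)]
print([url for url, text, links in toy_rank(pages, "CDs")])  # a.com first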

Enough with the science: get with the ranking factors…

We will soon enough. But before we do here’s a word to the wise: understanding some of the most common ranking factors is straightforward. Keyword: ‘straightforward’.

Over the past few years many bedroom cowboys and unethical agencies have raked in fees by pretending that SEO is about wearing a black hat, doing the search voodoo, etc. But there is no need for any smoke or mirrors. If your agency refuses to reveal its techniques to you then our advice would be to move on. Immediately!

Despite the mystique perpetuated about SEO it really isn’t terribly difficult to grasp the concepts. The difficulty lies in managing your keywords and optimization over the long term. Not in understanding the ranking factors.

Who in my organization needs to know about this stuff?

It is essential for your technology team / agencies to understand these ranking factors. It is also imperative that all content owners / authors / stakeholders understand how good quality content can improve search rankings. Authors need to know which keyphrases to use, otherwise you won’t have a joined-up strategy.

Applying simple ‘house style’ rules can help generate visits from qualified visitors. In fact let’s coin a phrase: ‘house strategy’. You need to develop a ‘house strategy guide’ for your authors and editors.

Types Of Search Engine Marketing

Two different SEM disciplines have developed to help organizations achieve visibility…

1. Search engine optimization (SEO)

SEO is aimed at achieving the highest position practically possible in the organic listings on the search engine results pages. To do this you need to define a list of keyphrases to work with.

In Google, Yahoo! and MSN Search, the natural listings are on the left as shown in Figure 1 and Figure 2, although there may be sponsored links above and below these results (typically differentiated to the consumer by the title ‘Sponsored Links’).

There is no charge for organic listings to be displayed, nor when a link to your site is clicked on. However, you may need to pay an SEO firm or consultant to manage optimization, plus the ongoing work needed to make your website appear higher in the rankings.

Other reasons to employ SEO expertise include managing security of content, copyright ownership, reputation management and user experience management.

2. Paid-search marketing

Within paid-search marketing there are two main alternatives:

(a) Paid-search engine advertising (aka PPC / sponsored listings).

These are highly-relevant text ads with a link to a company page and some ad text, displayed when the user of a search engine types in a specific phrase. These ads are displayed in the sponsored listings part of the SERPs as is shown in Figure 1 and Figure 2.

As the name suggests, a fee is charged for every click of each link, with the amount bid for the click determining the position.

Google Adwords factors in a ‘quality score’ based on the clickthrough rate for each ad, meaning that an underperforming ad might not make it to the top.
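The interaction between bid and quality score can be illustrated with a simple model. The real AdWords formula is proprietary, so the Python sketch below (ad rank = bid x quality score) is a simplifying assumption rather than Google's actual calculation:

# Simplified model: position is decided by bid x quality score, not bid alone.
ads = [
    {"name": "ad_a", "bid": 2.00, "quality_score": 4},  # high bid, weak ad
    {"name": "ad_b", "bid": 1.20, "quality_score": 8},  # lower bid, strong CTR
]
for ad in ads:
    ad["ad_rank"] = ad["bid"] * ad["quality_score"]

ads.sort(key=lambda ad: ad["ad_rank"], reverse=True)
print([ad["name"] for ad in ads])  # ad_b outranks ad_a despite the lower bid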

The most important PPC services are:

  • Google Adwords (http://adwords.google.com)
  • Yahoo! Search services (http://searchmarketing.yahoo.com/) / Overture (http://www.overture.com)
  • MIVA (www.miva.com)
  • MSN adCenter (adcenter.microsoft.com)


(b) Content-network paid-search advertising

Sponsored links are displayed by the search engine on a network of third-party sites. These are typically media-owned sites such as online newspapers or affiliate marketing sites.

Ads may be paid for on the basis of clicks (this is most common) or on the number of ads served (CPM basis).

The most important content network services are:

  • Google Adsense (http://adsense.google.com) for site owners, provided through Google Adwords for advertisers.
  • Yahoo Publisher Network (http://publisher.yahoo.com/) for site owners, provided through Content Match for advertisers.
  • MIVA (www.miva.com).

Paid-search advertising is more similar to conventional advertising than SEO, since you pay to advertise in ‘sponsored links’.

But there are big differences…

  • With PPC, a relevant text ad with a link to a company page is displayed as one of several ‘sponsored links’ when the user of a search engine types in a specific phrase. So, the first difference to conventional advertising is that it is highly targeted – the ad is only displayed when a relevant keyword phrase is typed in.
  • With the PPC approach, you don’t pay for the number of people who see your ad, but you only pay for those who click through to your website (hence Pay Per Click).
  • The prominence of the ad is dependent on the price bid for each clickthrough, with the highest bidder placed top (except in Google, where clickthrough rate is also taken into account).

Methods of Indexing

There are three important methods of indexing used in web database creation - full-text, keyword and human.

Full-Text Indexing

As its name implies, full-text indexing is where every word on the page is put into a database for searching. AltaVista, Google, Infoseek and Excite are examples of full-text databases. Full-text indexing will help you find every example of a reference to a specific name or terminology. A general topic search will not be very useful in this kind of database, and one has to dig through a lot of "false drops" (returned pages that have nothing to do with the search). In this case the websites are indexed by computer software. This software, called “spiders” or “robots”, automatically seeks out Web sites on the Internet and retrieves information from those sites (which matches the search criteria) using set instructions written into the software. This information is then automatically entered into a database.

Keyword Indexing

In keyword indexing only “important” words or phrases are put into the database. Lycos is a good example of keyword indexing.

Human Indexing

Yahoo and parts of Magellan are two of the few examples of human indexing. In full-text and keyword indexing, all of the work is done by a computer program called a "spider" or a "robot". In human indexing, a person examines the page and determines a very few key phrases that describe it. This allows the user to find a good starting set of works on a topic - assuming that the topic was picked by the human as something that describes the page. This is how the directory-based web databases are developed.

Web Crawler Application Design

The Web Crawler Application is divided into three main modules.

1. Controller
2. Fetcher
3. Parser

Controller Module - This module provides the Graphical User Interface (GUI) designed for the web crawler and is responsible for controlling the operations of the crawler. The GUI enables the user to enter the start URL, enter the maximum number of URLs to crawl, and view the URLs that are being fetched. It controls the Fetcher and Parser.


Fetcher Module - This module starts by fetching the page according to the start URL specified by the user. The fetcher module also retrieves all the links in a particular page and continues doing so until the maximum number of URLs is reached.

Parser Module - This module parses the URLs fetched by the Fetcher module and saves the contents of those pages to the disk.
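A skeleton of how the three modules might be wired together is sketched below in Python. The class and method names are illustrative assumptions, not taken from the application's actual source, and the fetch/parse bodies are placeholders:

# Illustrative skeleton only; real modules would do network and disk I/O.
class Fetcher:
    def fetch(self, url):
        """Download the page at url and return its HTML (placeholder)."""
        return ""

class Parser:
    def parse(self, url, html):
        """Save the page to disk and return the links found in it (placeholder)."""
        return []

class Controller:
    """Drives the crawl with the start URL and URL limit taken from the GUI."""
    def __init__(self, fetcher, parser):
        self.fetcher, self.parser = fetcher, parser

    def run(self, start_url, max_urls):
        queue, seen = [start_url], set()
        while queue and len(seen) < max_urls:
            url = queue.pop(0)
            if url in seen:
                continue
            seen.add(url)
            html = self.fetcher.fetch(url)
            queue.extend(self.parser.parse(url, html))

Controller(Fetcher(), Parser()).run("http://somehost.com/", max_urls=100)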

Storing & Indexing the web content


Indexing the web content

Similar to the index of a book, a search engine extracts and builds a catalog of all the words that appear on each web page, the number of times each appears on that page, and so on. Indexing web content is a challenging task, assuming an average of 1,000 words per web page and billions of such pages. Indexes are used for searching by keywords; therefore, they have to be stored in the memory of computers to provide quick access to the search results.

Indexing starts with parsing the website content using a parser. Any parser designed to run on the entire Web must handle a huge array of possible errors. The parser can extract the relevant information from a web page by excluding certain common words (such as a, an, the - also known as stop words), HTML tags, JavaScript and other unwanted characters. A good parser can also eliminate commonly occurring content in website pages, such as navigation links, so that it is not counted as part of the page's content. Once the indexing is completed, the results are stored in memory in sorted order. This helps in retrieving the information quickly. Indexes are updated periodically as new content is crawled. Some indexes help create a dictionary (lexicon) of all words that are available for searching. A lexicon also helps in correcting mistyped words by showing the corrected versions in a search result. A part of the success of a search engine lies in how the indexes are built and used. Various algorithms are used to optimize these indexes so that relevant results are found easily without much computing resource usage.
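As a minimal sketch of the process just described - stop-word removal, per-page word counts and sorted storage - here is a toy inverted index in Python. The stop-word list and sample pages are illustrative assumptions:

import re
from collections import defaultdict

STOP_WORDS = {"a", "an", "the", "and", "of", "to", "in"}  # tiny illustrative list

def build_index(pages):
    """pages: dict of url -> text. Returns word -> {url: occurrence count}."""
    index = defaultdict(lambda: defaultdict(int))
    for url, text in pages.items():
        for word in re.findall(r"[a-z0-9]+", text.lower()):
            if word not in STOP_WORDS:
                index[word][url] += 1
    # store the postings in sorted order for quick lookup, as described above
    return {word: dict(postings) for word, postings in sorted(index.items())}

index = build_index({"a.com": "The online CD shop", "b.com": "Buy a CD online"})
print(index["cd"])      # {'a.com': 1, 'b.com': 1}
print(index["online"])  # {'a.com': 1, 'b.com': 1}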




Storing the Web Content

In addition to indexing the web content, the individual pages are also stored in the search engine's database. Thanks to cheaper disk storage, the storage capacity of search engines is very large, often running into terabytes of data. However, retrieving this data quickly and efficiently requires special distributed and scalable data storage functionality. The amount of data that a search engine can store is limited by the amount of data it can retrieve for search results. Google can index and store about 3 billion web documents, far more than any other search engine at this time. "Spiders" take a Web page's content and create key search words that enable online users to find the pages they're looking for.


Robot Protocol

Web sites also often have restricted areas that crawlers should not crawl. To address these concerns, many Web sites adopted the Robot protocol, which establishes guidelines that crawlers should follow. Over time, the protocol has become the unwritten law of the Internet for Web crawlers. The Robot protocol specifies that Web sites wishing to restrict certain areas or pages from crawling have a file called robots.txt placed at the root of the Web site. The ethical crawlers will then skip the disallowed areas. Following is an example robots.txt file and an explanation of its format:

# robots.txt for http://somehost.com/
User-agent: *
Disallow: /cgi-bin/
Disallow: /registration # Disallow robots on registration page
Disallow: /login
The first line of the sample file has a comment on it, as denoted by the use of a hash (#)
character. Crawlers reading robots.txt files should ignore any comments.

The second line of the sample file specifies the User-agent to which the Disallow rules following it apply. User-agent is a term used for the programs that access a Web site. Each browser has a unique User-agent value that it sends along with each request to a Web server. However, Web sites typically want to disallow all robots (or User-agents) access to certain areas, so they use a value of asterisk (*) for the User-agent. This specifies that all User-agents are disallowed for the rules that follow it. The lines following the User-agent line are called disallow statements. The disallow statements define the Web site paths that crawlers are not allowed to access. For example, the first disallow statement in the sample file tells crawlers not to crawl any links that begin with “/cgi-bin/”. Thus, the following URLs are both off limits to crawlers according to that line:
http://somehost.com/cgi-bin/
http://somehost.com/cgi-bin/register (Searching Indexing Robots and Robots.txt)
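Python's standard library ships a parser for this protocol, so an ethical crawler does not need to interpret the file by hand. A minimal sketch, reusing the sample host above:

from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("http://somehost.com/robots.txt")
rp.read()  # downloads and parses the robots.txt file

print(rp.can_fetch("*", "http://somehost.com/cgi-bin/register"))  # False
print(rp.can_fetch("*", "http://somehost.com/about.html"))        # True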

Competing search engines Google, Yahoo!, Microsoft Live and Ask have announced their support for 'autodiscovery' of sitemaps. The autodiscovery method allows you to specify in your robots.txt file where your sitemap is located, via a line such as Sitemap: http://somehost.com/sitemap.xml.

Crawling Techniques


Focused Crawling

A general purpose Web crawler gathers as many pages as it can from a particular set of URLs. A focused crawler, by contrast, is designed to gather only documents on a specific topic, thus reducing the amount of network traffic and downloads. The goal of the focused crawler is to selectively seek out pages that are relevant to a pre-defined set of topics. The topics are specified not using keywords, but using exemplary documents. Rather than collecting and indexing all accessible web documents to be able to answer all possible ad-hoc queries, a focused crawler analyzes its crawl boundary to find the links that are likely to be most relevant for the crawl, and avoids irrelevant regions of the web. This leads to significant savings in hardware and network resources, and helps keep the crawl more up-to-date.

The focused crawler has three main components: a classifier, which makes relevance judgments on pages crawled to decide on link expansion; a distiller, which determines a measure of centrality of crawled pages to determine visit priorities; and a crawler with dynamically reconfigurable priority controls which is governed by the classifier and distiller.

The most crucial evaluation of focused crawling is to measure the harvest ratio, which is the rate at which relevant pages are acquired and irrelevant pages are effectively filtered out of the crawl. This harvest ratio must be high, otherwise the focused crawler would spend a lot of time merely eliminating irrelevant pages, and it may be better to use an ordinary crawler instead. The arithmetic is shown in the small sketch below.
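The harvest ratio is simply relevant pages acquired divided by total pages crawled; the figures below are invented for illustration:

def harvest_ratio(relevant_pages, total_pages_crawled):
    return relevant_pages / total_pages_crawled

print(harvest_ratio(450, 500))  # 0.9 - a focused crawl that is working well
print(harvest_ratio(50, 500))   # 0.1 - better to use an ordinary crawler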

Distributed Crawling

Indexing the web is a challenge due to its growing and dynamic nature. As the size of the Web grows, it has become imperative to parallelize the crawling process in order to finish downloading the pages in a reasonable amount of time. A single crawling process, even if multithreading is used, will be insufficient for large-scale engines that need to fetch large amounts of data rapidly. When a single centralized crawler is used, all the fetched data passes through a single physical link. Distributing the crawling activity via multiple processes can help build a scalable, easily configurable and fault-tolerant system. Splitting the load decreases hardware requirements and at the same time increases the overall download speed and reliability. Each task is performed in a fully distributed fashion; that is, no central coordinator exists.
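One common coordinator-free way to split the load is to hash each URL's host name, so every process knows which URLs are its own and all pages from one site go to the same process. A sketch of this idea (the worker count and URLs are made up):

import zlib
from urllib.parse import urlparse

def assign_worker(url, num_workers):
    # Hash the host, not the full URL, so one site maps to one worker.
    host = urlparse(url).netloc
    return zlib.crc32(host.encode("utf-8")) % num_workers

for url in ["http://a.com/page1", "http://a.com/page2", "http://b.com/"]:
    print(url, "-> worker", assign_worker(url, 4))
# Both a.com pages land on the same worker, which also simplifies politeness.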

Crawler Based Search Engines

Crawler-based search engines compile their listings automatically. Computer programs known as 'spiders' build them, not human selection. They are not organized by subject categories; a computer algorithm ranks all pages. Such search engines are huge and often retrieve a lot of information; for complex searches they allow you to search within the results of a previous search and refine the results. These types of search engines contain the full text of the web pages they link to, so one can find pages by matching words in the pages one wants.

HOW SEARCH ENGINES WORK

Search engines index tens to hundreds of millions of web pages involving a comparable number of distinct terms. They answer tens of millions of queries every day. Despite the importance of large-scale search engines on the web, very little academic research has been conducted on them. Furthermore, due to rapid advance in technology and web proliferation, creating a web search engine today is very different from three years ago. There are differences in the ways various search engines work, but they all perform three basic tasks:

1. They search the Internet, or select pieces of the Internet, based on important words.
2. They keep an index of the words they find, and where they find them.
3. They allow users to look for words or combinations of words found in that index.

A search engine finds information for its database by accepting listings sent in by authors who want exposure, or by getting the information from their "web crawlers," "spiders," or "robots," programs that roam the Internet storing links to and information about each page they visit.

A web crawler is a program that downloads and stores Web pages, often for a Web search engine. Roughly, a crawler starts off by placing an initial set of URLs,
S0, in a queue, where all URLs to be retrieved are kept and prioritized. From this queue, the crawler gets a URL (in some order), downloads the page, extracts any URLs in the downloaded page, and puts the new URLs in the queue. This process is repeated until the crawler decides to stop. Collected pages are later used for other applications, such as a Web search engine or a Web cache.
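The loop just described translates almost line for line into code. The following is a minimal sketch in Python, assuming well-formed links and ignoring the politeness delays, robots.txt checks and prioritization a real crawler needs:

from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
import urllib.request

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag, resolved against the page URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url, self.links = base_url, []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

def crawl(seed_urls, max_pages):
    queue, seen, pages = deque(seed_urls), set(seed_urls), {}
    while queue and len(pages) < max_pages:
        url = queue.popleft()                 # get a URL from the queue
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                html = resp.read().decode("utf-8", "replace")
        except OSError:
            continue                          # skip pages that fail to download
        pages[url] = html                     # store the page for later indexing
        extractor = LinkExtractor(url)
        extractor.feed(html)
        for link in extractor.links:          # put the new URLs in the queue
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return pages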

The most important measures for a search engine are search performance, the quality of the results, and the ability to crawl and index the web efficiently. The primary goal is to provide high quality search results over a rapidly growing World Wide Web. Some of the efficient and recommended search engines are Google, Yahoo and Teoma, which share some common features and are standardized to some extent.


Basic SEO Tips

Domain Name: SEO optimization starts right from the domain name booking. Believe it or not, your domain name has a large amount to do with your ranking in Google.

Using a keyword in the domain name is only helpful if you separate the words with hyphens. General speculation is that too many hyphens might trigger a trust issue with the domain, so more than one or two hyphens is not recommended. A good brand name is always better than a keyword-filled domain. Other domain-level signals search engines are believed to look at include:
  • the length of the domain registration (one year versus several years)
  • the address of the web site owner, the admin and the technical contact
  • the stability of data and the host company
  • the number of pages on a web site (web sites must have more than one page)
Google claims that they have a list of known bad contact information, name servers and IP addresses that helps them to find out whether a spammer is running a domain.

Advantages & Disadvantages of SEO and SEM

Few would argue that SEO is potentially the most important search marketing approach for marketers since most searchers click on the natural listings.

Indeed, research shows that some searchers NEVER click on the sponsored listings. Others still don’t realise these links are paid-for.

Generally, the 80:20 rule holds true with 80% of the clicks on natural listings and 20% of the clicks on the paid listings as suggested by the first Stats box earlier in this section.

A key benefit of SEO is that it is relatively cost-effective since there is no payment to the search engines for being placed there. This is particularly important for the ‘search head’, the high-volume, low-intent phrases shown in Table 1, which can be expensive in paid-search. But it can also be useful for generating visitors on the long tail of search shown in Figure 7. Many companies bid on these phrases through paid-search, giving opportunities to those who use an SEO strategy for the tail.

Additionally, the cost of SEO is relatively fixed, independent of click volume. Effectively, the cost per click from SEO declines through time after initial optimization costs and lower ongoing optimization costs. Conversely, paid-search is essentially a variable cost.

So, there are no media costs, but resources are necessary for key phrase analysis and to complete optimization on the website pages.

Together with paid-search it can also offer a highly targeted audience – visitors referred by SEO will only visit your site if they are looking for specific information on your products or related content.

Disadvantages of SEO

The challenge of SEO is that there are over 8 billion pages in the search engine indexes with your position in the SERPs dependent on a constantly changing algorithm which is not published. So making your pages visible may require specialist knowledge, constant monitoring and the ability to respond.

As a consequence, the biggest disadvantage of SEO is a lack of control. You are subject to changes in the algorithm.

There are other possible issues. You may be prevented from competing on a level playing field, because competitors and even affiliates may use less ethical black hat SEO techniques.

In competitive sectors it may be very difficult to get listed in the top few results for competitive phrases. This is when PPC may have to be used, although this can be expensive in a competitive sector.

This lack of visibility makes it difficult to make a definitive business case for SEO, although it is fairly obvious what a sought-after number one position on Google would do for most companies.

It is nevertheless impossible to predict and guarantee positions and click volumes from SEO, because the impact of future changes to the algorithm is unknown. Ditto competitor activity – you don’t know what they’ll be doing in future.

So, for a given investment of £1, $1 or €1 it is difficult to estimate the returns compared to paid-search, or indeed traditional advertising, or direct mail, where more accurate estimates are possible.

However, we will see that estimates of long-term returns from SEO can and should be made.

Key recommendation 3. SEO is a long-term strategy. Identifying the correct investment requires a long-term cost/benefit analysis. If this doesn’t occur, SEM strategy is often imbalanced in favour of paid-search.

Technical disadvantages?

Technical constraints may also limit your SEO capabilities – for example, if there is not the right IT resource, knowledge or technology available to implement the changes to site structure and content mark-up needed for SEO.

For example, websites created entirely using Flash cause readability problems for search engine robots, so onsite optimization is somewhat redundant.

Content disadvantages?

There is a clear need for better education among content authors. They need to know what keyphrases to use, and where to use them, whenever they add and update content.

Balance is required when authors create pages, since they are being created for both search engines and humans.

Copy and language which is effective for SEO can be different to naturally written copy, although the search engines seek to identify and reward natural language. There needs to be a compromise and subtle balance between the two so that pages are intelligible to users, but are also great search engine fodder.

The mantra is to write for users, but to label content accurately for Googlebot.

Because of these problem areas many companies focus their online marketing strategy on PPC. Ad buying and planning remains the staple diet of marketers, so buying PPC ads comes naturally. Indeed, PPC is often the first step into the world of search for many ‘offline’ marketers, the lowest hanging fruit. ROI from paid-search can be excellent, but you mustn’t allow these potential problem areas – or the ease of buying PPC ads – to distract you from the joys of organic search optimization.


Key recommendation 4. SEO is not purely a technical discipline to be conducted by a specialist team or agency. It requires a different style from traditional copywriting, which means training content owners and reviewers.

Paid-search advantages

• Predictability. Traffic, rankings, returns and costs tend to be more stable and more predictable than SEO. It is more immediately accountable, in terms of ROI, while SEO can take much longer to evaluate.

• More straightforward to achieve high rankings – you simply have to bid more than your competitors, although Google also takes the Quality Score of your ad into account. SEO requires long-term, technically complex work on page optimization, site restructure and link-building, which can take months to implement and show results.

• Faster. PPC listings appear much faster, usually in a few hours (or days if editor review is required).

• Flexibility. Creative and bids can also be readily modified or turned off for particular times. The results of SEO can take weeks or months to be achieved. (Content modifications to existing pages for SEO are usually included within a few days). PPC budgets can also be reallocated in line with changing marketing goals (eg: a bank can quickly switch paid-search budget from ‘loans’ to ‘savings’).

• Automation. Bid management systems can help financial predictability through using rules to control bidding in line with your conversion rates to reach an appropriate cost per sale. However substantial manual intervention is required for the best results for different search ad networks.

• Branding effect. Tests have shown that there is a branding effect with Pay Per Click, even if users do not click on the ad. This can be useful for the launch of products or major campaigns.

Paid-search disadvantages

• Competition. Since Pay Per Click has become popular due to its effectiveness, it is competitive, and because it is based on competitive bids it can get expensive. CPC/bid inflation has led to some companies reducing PPC activity. Some companies may get involved in bidding wars that drive bids up to an unacceptable level – some phrases such as ‘life insurance’ may exceed £10 per click.

• Higher costs. If SEO is effective it will almost always deliver a lower CPC.

• Favours big players. For companies with a lower budget or a narrower range of products on which to increase lifetime value it may be not possible to compete. Large players can also get deals on their media spend through their agencies.

• Complexity of managing large campaigns. PPC requires knowledge of the configuration, bidding options and reporting facilities of different ad networks. Managing a PPC account may require daily or even hourly checks on the bidding to stay competitive – this can amount to a lot of time. Bid management software can help here.

• Missed opportunities. Sponsored listings are only part of the SEM mix. Many search users do not click on these, so you cannot maximise the effect.

• Click fraud is regarded by some as a problem, especially in some sectors. Click fraud is covered in detail in the E-consultancy Best Practice guide to Paid-search, to be published in the summer, 2006.

Search Engine Marketing: SEO vs. Pay Per Click

When you pay for top ranking in the "Sponsored" area of a search engine, this is called "pay-per-click" (PPC) search engine advertising (or PPCSE). Pay-per-click search engine advertising allows you to quickly get top search engine placement by "bidding" (paying) for keywords related to your product or service.


"Organic" or "Natural" search engine optimization (SEO) is accomplished by optimizing your web pages, adding keyword rich content and by increasing your "link popularity" by acquiring or paying for links that point to your web site. This gives you high rankings at the Search Engines for your chosen search terms.


Promoting your internet business can be a tough task. The costs of traditional advertising are prohibitive. It can cost as much as $50,000 to run a print advertisement in a prominent publication. It can cost much more than that to produce and run a TV commercial, especially if the commercial airs during peak viewing hours. Online marketing is a problem too, because it is extremely difficult and takes a long time to climb to the top of the search engine rankings.


So, what should you do? Well, you will need to rely on the same thing all internet businesses do when they first start out. Of course, we are talking about pay-per-click (PPC). By advertising with pay-per-click, you pay a certain price per click to be listed near the top of the first page of the search engines for your chosen keyword or phrase. Every time someone clicks on your PPC ad, you pay for that click.


If you have never used PPC before or do not know what it is, perform a search on any search engine and you will notice that at the top or to the side of the search results you will see a section called "sponsored links." These are web sites that are paying a certain amount per click to be listed there.

This article explains the strengths and weaknesses of both methods of search engine marketing.


"Natural" Search Engine Optimization

The biggest misconception about natural search engine optimization is that it's easy and can happen quickly. The price you pay is determined by how competitive your keyword phrases are and how many other sites you have to climb above. If your domain name has just recently been purchased and hasn't been on the Internet for long, you can expect to wait a minimum of six to nine months before major search engines like Google even consider picking your site up, unless you take some aggressive steps to get your site indexed right away.


Yet still, natural search engine optimization usually gives you a much higher return on investment than pay per click. This is true for two main reasons:

  1. More searchers click the natural search engine results versus the pay per click ads, so you'll get much more traffic for less.

  2. One of the biggest factors to improving your rankings with natural search engine optimization is by boosting your "Link Popularity" by acquiring or paying for links that point to your web site. These links give you lasting results by giving you top rankings and traffic from the search engines. Plus, the links themselves will provide a significant boost in long term traffic.


With that said, the biggest weakness of natural search engine optimization is the time required to generate links and "tweak" your web pages and keywords to get those prized high rankings you so desire. It can literally take 3 months or more to finally enjoy the benefits of your search engine optimization campaign.


Pay Per Click Search Engine Advertising


The biggest benefit of pay per click is the fact that it will provide you with an immediate boost of qualified visitors, leads and sales, giving you fast results within just hours or days. In fact, a pay per click advertising program is your best option if you seek fast results and a good return on investment while you are waiting for your Search Engine Optimization (SEO) program to "ramp up."


Depending on your traffic goals, you can budget $100 or $100,000. PPCSEs also give you the added benefits of being able to quickly test your web site, track your conversion rates (leads, opt-ins, and sales), and turn keywords (visitors) on and off easily.


PPC can be very expensive depending upon what keyword you want to receive clicks for, but there are ways to budget your money wisely so you can maximize the effectiveness of your PPC marketing campaign without having a ton of money to spend. Some of the companies at the top of the sponsored links section might be bidding up to $20 per click for certain keywords. Insurance companies such as Geico and Progressive often bid up to $25 per click for the keyword "auto insurance."


However, for most keywords, you can bid relatively low and still get a lot of clicks. Each PPC service has a traffic calculator that tells you how many clicks you will receive given a certain bid price and daily budget. So, if you want to spend $0.50 per click, you can put that bid into their traffic calculator and it will tell you how many clicks you can expect to receive at $0.50 per click and how much that will cost per day. The traffic calculator will also tell you what your position will be (the higher your bid price, the higher your position will be within the sponsored links section for that keyword).
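The arithmetic behind such a traffic calculator is straightforward. A sketch in Python, under the simplifying assumption that enough matching searches exist each day to exhaust the budget:

def clicks_per_day(daily_budget, bid_per_click):
    return daily_budget / bid_per_click

def days_budget_lasts(total_budget, daily_budget):
    return total_budget / daily_budget

print(clicks_per_day(25.00, 0.50))       # 50 clicks a day at $0.50 per click
print(days_budget_lasts(100.00, 25.00))  # a $100 budget lasts 4 days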


There are several PPC programs that you can use to receive traffic. There is Google Adwords (which spans several engines, including Ask Jeeves, AOL, and Google), Overture (Yahoo and other engines), Miva, and many others. They are all reliable, and they will all deliver traffic to your web site.


The most important part of PPC advertising is knowing what keywords or phrases to bid on. You have to remember that most people using a search engine are only looking for information, and are not seeking to buy a product or service. So, if you are using PPC to get traffic to your web site in order to sell something, make sure you bid on a keyword that will bring you customers who are looking to make a purchase and are not there just to glean information.


For example, if you are selling attorney services on your web site, and you bid on the keyword "medical malpractice", you are going to receive a lot of traffic from people who visit your web site merely to read what is there, because they are probably just looking for information about medical malpractice laws and do not want to pay for your services. Instead, you should bid on the phrase "Denver medical malpractice attorney" or "Denver medical malpractice lawyer". In this manner, you will only receive qualified traffic from people that are looking to retain your services, which will increase your sales and allow you to get more bang for your marketing buck.


So, as a short term strategy pay per click gives you the clear advantage over SEO. But, the disadvantage is the cost involved. Depending on the market demand for your keywords and clicks, your PPCSE campaign can generate tons of traffic and can cost hundreds, even thousands per day. With various optimization strategies you can lower your costs, but over the long term natural search engine optimization will give you a higher return on your marketing dollar.


Pay Per Click Marketing

There are many tools and methods available on the market today that one can choose from, but pay per click is an important tool that can boost your search engine marketing in no time. Ideally, the pay per click method is an advertising tool that should be used until your natural search engine optimization takes firm hold for the chosen keywords.


Selection of the right keyword is very important for doing business online and keeping pace with your competitors. Users always make use of certain keywords when searching for products and services. The professionals at Comet SEM work towards those keywords that can be useful as well as helpful for the growth of your business. We use the best tools for selecting the proper and most appropriate keywords for your site.


The market for pay per click is very competitive and we constantly work on this to keep your website running all the time. In fact, Comet SEM is further responsible for making your website visible on various search engines like Yahoo and Google. We blend your pay per click campaign with our natural search engine marketing methods to give you the best marketing services. The company works within a budget to get more and more traffic to your site. That is not all; we are proficient in converting visitors to your site into customers.


Comet Search Engine Marketing adheres to all legal practices when engaging in pay per click marketing for you. There is absolutely no chance of your site being banned for any reason, since Comet SEM does not use or promote the illegal practices that have been the forte of various other search engine optimization firms; we use only white hat methods to take your site higher and higher in the world of search engine rankings.


We all want to implement the best techniques and tools available on the market to produce great profits from our online endeavors. In fact, we leave no stone unturned to achieve our goals. Many people doing business online these days are on a constant search for better and more innovative business ideas and approaches, website designers and website optimizing companies for results-oriented business solutions. If you are pulling out your hair trying to come up with fantastic ideas for how to make your online venture a success, then come to the Comet Search Engine Marketing firm and turn your dreams into a reality.

PAY PER CLICK

What is Pay Per Click?


As the term itself suggests, for every click a payment is made based on the bid amount that your site has placed on a particular keyword. At Comet SEM we implement pay per click campaigns to make your site visible at the top of sponsored listings on major search engines like Yahoo, Google, Miva and MSN. If you have a brand new site and want to tell visitors about it, pay per click is the best way to do it.


Ideally, pay per click means advertising particular goods and services on the internet using the most relevant and appropriate keywords. Pay per click also allows you to try new terms before you build a natural campaign for them. Rankings for pay per click are always up for grabs if you are willing to spend more. The more you spend, the more you get is the success mantra of pay per click campaigns.


At Comet, we constantly study a site to find out how it is faring in the sponsored listings. If we find that a particular keyword is not doing well for you, we replace it with another appropriate keyword. All fields have competition, and a great deal of your competition may be really stiff, so we always keep ourselves updated on the latest trends to keep your site one step ahead of your competitors.


Keywords are the cornerstone of the success of a pay per click campaign. At Comet SEM we perform proper keyword research to determine the best and most relevant keywords to describe your product. A properly organized pay per click campaign is the ideal vehicle to bring targeted visitors to your site in a short time.


We make sure to use the most appropriate keywords for advertising your products and goods. Yes, there is always a chance that visitors click on your ad just because your site happens to be listed at the top. This is inevitable and something you will have to accept if you want to leverage this type of advertising campaign to make your site convert.

Basic SEM

Search Engine Marketing efficiently increases your website's visibility on the Internet. Our search engine marketing firm believes that everybody's business is our business. At Comet, we use innovative "White Hat" search engine marketing techniques to drive an immense amount of traffic to your site. The secret to search engine optimization is timing. At Comet Search Engine Marketing, we simply know what to optimize, when to optimize it, and how to optimize the things necessary to achieve high search engine rankings for our customers.

  • Search Engine Optimization (SEO) - Search engine optimization is the practice of applying techniques to maximize your ranking in organic, or natural search results. Organic search results are the rankings of Web pages returned by a search engine when you search for a specific word or phrase - a "keyword" or "keyword phrase".
  • Pay Per Click Advertising (PPC) - Ads you place for your web site with a search engine, such as Google or Yahoo. You bid the amount you are willing to pay per click. The more you bid, the higher your ad will appear in the search engine results.