September 2009 | Basic SEO | SEO Tips | SEO Factors | SEO Analyst Chennai

Social Bookmarking Secrets | Effective way of Doing Social Bookmarking

In the Web 2.0 world, social bookmarking holds its own place on the internet even as social media rises. It is a process where people find links, bookmark or favorite them, and vote for them. The more votes a link gets, the more popular the bookmark.

Many people don't know the right way to do social bookmarking. It is an art: if you know how to perform it well, your profile will get more visits and your links will get more votes.

Here is the right way to perform social bookmarking, step by step:
  • Fill out your entire profile with complete information so that it looks professional.
  • Search for authoritative profiles, then vote and comment on their links.
  • Add those authoritative profiles as friends.
  • Build a professional relationship with them.
  • Be among the first to vote and comment on their links.
  • In return, you will get good comments and votes on your own links.
  • Share a variety of good links that catch the public's interest.
List of social bookmarking sites:
  • Digg.com
  • Reddit.com
  • Hotklix.com
  • Zimbio.com
  • Wikio.com
  • Current.com
  • Shoutwire.com
  • Stumbleupon.com
  • Technorati.com
  • Del.icio.us
  • Propeller.com
  • Simpy.com
  • Slashdot.org .......... etc...
The main problem is that most people create a profile and submit their own website's links again and again from that same profile. This gets the account banned by the site administrator and hurts your reputation on the internet. I hope these instructions are clear; if not, let me know so I can correct myself. Feel free to ask questions. Looking forward to your comments.

Anchor Text Optimization | On Page SEO Techniques

Getting quality keyword anchored backlinks is the name of the game in SEO.

When obtaining backlinks to your site, it is important to include your keyword as the text of the link! Google uses the anchor text to make decisions about the content of your page and how to rank you for certain keywords.

5 Ways to Anchor Your Links Like a Newbie

1. Use “here” / “click here” / “my website” when you link to your site. Congrats, you just told Google your site is about “here” or “click here”; not “clocks”, “clothing”, “video games”, “credit cards”, or “loans”.

2. Using your name as your primary anchor text. You’ll rank really well for your name, but you’re going to have one hell of a time making money from that. When was the last time you saw massive demand for “Frank”?

3. Using your business name. So you’ll rank really well for your business name, but this will do nothing but provide traffic from people who already know who you are. What about bringing in new people? Enough people will link to you with your site / business name, so you’ll earn the top spot for your name eventually by default, especially if your business name is your URL.

4. Anchoring your online nickname. This is even worse than your name. Who is ever going to search for CoolDuDz1978 or something crazy like that?

5. Anchoring the same thing over and over. This is the next step newbies take. You tell them "anchor your keyword" and they go get 1000 exact keyword anchored links overnight. Google bomb much? Google will see this as spam, and it could actually hurt you. Mix it up.

6 Ways to Anchor Your Links Like a Pro

1. Anchor your keyword, related keywords, and long tail keywords.
2. Anchor deep pages with keywords.
3. Develop a “keyword” nick name. *cough* *cough* SEO Tips & Tricks / Basic seo Factors *cough* *cough*
4. Develop a keyword sitename. Court’s “Internet Marketing” School. <– f’ing great.
5. Get creative with your anchor text. My friend’s company he works for is “Paramore Redd”. They always anchor Designed by: Paramore Redd. Why not anchor something like “P|R Nashville Online Marketing”?
6. Use your keyword in your site title, like “How to Make Money Online“, that way people will use the keyword as the anchor text when they refer to you.
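To make the newbie/pro contrast concrete, here is what the difference looks like in raw HTML (the site and the keyword are made up for illustration):

```html
<!-- Newbie: tells Google the page is about "here" -->
<a href="http://www.example.com/">click here</a>

<!-- Pro: the anchor text carries the keyword you want to rank for -->
<a href="http://www.example.com/">antique wall clocks</a>
```

Both links point at the same page; only the anchor text differs, and that text is what the engines read as a description of the target.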

Looking at someone’s anchor text really shows how much they know about SEO. And by links, I’m talking about links pointing to your site from other sites organically. Don’t confuse this with the type of linking Court is talking about in his post How To Make Money Posting Links On Google. Those links aren’t real links at all; they’re contextual advertisements. Those “google links” are actually Adsense ads. But anyway, the role anchor text plays in your SERPs success is HUGE. Do not get caught anchoring like a newbie. Put some thought into creative ways to get the anchors you want. Of course, sometimes you have to go with the site name or your name, but try to get as many keyword anchored links as you can.

Google Chrome 3 is Released | Download Google Chrome 3

Two weeks ago Google launched Google Chrome 2, and now Google Chrome 3 is out. Yes, it seems that Google is in a hurry to compete on version numbers with other internet browsers. There is not much difference in Chrome 3 compared to the last released version.

The Chromium developers' release notes for Chrome 3 describe 16 fixes rather than new features: multiple crash fixes, a bookmarks fix, tab enhancements, localhost problem fixes, and so on. Google Chrome 3 has been released only for MS Windows PCs. As there is no prominent dividing line between the functions of Chrome 2 and Chrome 3, read the Chrome 3 release notes before moving to the new version.

A feature you probably won't use too often, at least for now, is support for the HTML5 video and audio tags. Like Firefox 3.5, Chrome includes video codecs that let you embed videos without slow and unreliable plug-ins like Adobe Flash. You can test this feature at TinyVid.com, an experimental Ogg video uploading site, or on YouTube's HTML5 demo page, which uses an H.264 video.

One year after the first release, the numbers are impressive: "51 developer, 21 beta and 15 stable updates and 3,505 bugfixes". Google Chrome's market share is 2.84%, according to Net Applications, but the browser's impact was even more significant: Chrome set a high standard for browsers by focusing on speed, a simplified user interface and by handling web pages as if they were applications. Safari 4, as well as the next versions of Firefox, are influenced by Google Chrome's simplicity.

In other Chrome news, the documentation for creating extensions is now available and the support for extensions is enabled by default in the dev channel. If you use the stable version of Chrome, you need to wait a little bit.

Download Google Chrome 3

Sitemap Autodiscovery | Submit XML Sitemaps with Robots.txt

Search engines and robots.txt:

Since the birth of internet search engines, the robots.txt file has been how webmasters let search engines like Google know which content should be crawled and indexed. As part of Google Sitemaps, later renamed the XML Sitemaps Protocol, its usage was expanded with sitemap autodiscovery: webmasters can now point search engines to the website's XML sitemap. The moment a search engine finds your website and its robots.txt file, it also knows where to find your XML sitemap.

Submit your XML sitemaps:
In the beginning, XML sitemap submission required that you create and verify a Google Webmaster Tools account, and you had to submit your sitemap files manually. Now, instead of submitting XML sitemaps to every search engine individually, you can be done with them all in seconds.

To add XML sitemap autodiscovery to a robots.txt file, add the fully qualified sitemap file path like this: Sitemap: http://www.example.com/sitemap.xml.

Complete robots.txt example for XML sitemaps autodiscovery
User-agent: *
Disallow:
Sitemap: http://www.example.com/sitemap.xml

If you have created a sitemap index file, you can also reference that:
User-agent: *
Disallow:
Sitemap: http://www.example.com/sitemap-index.xml
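To show how autodiscovery works mechanically, here is a small Python sketch of what a crawler might do when it reads a robots.txt file. This is an illustration, not any search engine's actual parser, and the URL is a placeholder:

```python
# Sketch: extract Sitemap directives from robots.txt, the way a crawler
# doing sitemap autodiscovery would. The robots.txt content is an inline
# string here; a real crawler would fetch it over HTTP first.

def find_sitemaps(robots_txt):
    """Return all fully qualified sitemap URLs declared in robots.txt."""
    sitemaps = []
    for line in robots_txt.splitlines():
        # Split only at the first colon, so the "http:" in the URL survives
        name, _, value = line.partition(":")
        if name.strip().lower() == "sitemap":  # directive is case-insensitive
            sitemaps.append(value.strip())
    return sitemaps

robots = """User-agent: *
Disallow:
Sitemap: http://www.example.com/sitemap.xml
"""
print(find_sitemaps(robots))  # ['http://www.example.com/sitemap.xml']
```

Any number of Sitemap lines can appear in one file, which is how a single robots.txt can point to several sitemaps or a sitemap index.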

Manual XML sitemap submission
There can be good reasons to submit your sitemaps manually the first time, e.g. to get acquainted with the different search engine and webmaster tools available:

* Google Webmaster Tools
* Live Webmaster Tools (MSN)
* Yahoo SiteExplorer

Advanced sitemap management and submission
In the beginning, no search engine supported cross-submitting multiple websites in one XML sitemap file. Now, however, most support new ways of managing sitemaps across multiple sites. The requirement is that you verify ownership of all the websites:

* Sitemaps protocol: Cross-submit and manage sitemaps using robots.txt.
* Google: More website verification methods than sitemaps protocol defines.

Read More Articles Visit http://www.xml-sitemaps-generator.com

Google Introduced | Recent Google Search Results

Google has introduced new date-range options: you can now find your indexed pages in Google within seconds, minutes, hours, and so on. For example, pages indexed within the last 46 seconds, minute, hour, week, or month. Read on for the story.

Ran Geva noticed that Google's date range restrictions have been extended and you can now find web pages indexed by Google less than one minute ago or even less than 10 seconds ago.

(Update. Google doesn't necessarily index web pages as soon as they're published, but sites that use feeds or sitemaps are indexed pretty fast. With recent advancements like PubSubHubbub that provide real-time notifications for updates, the delay between publishing pages and finding them using Google will be further reduced.)

Click on "show options", select "past 24 hours" and tweak the URL by replacing "tbs=qdr:d" with "tbs=qdr:n" to find pages indexed in the past minute.

The date restriction feature is quite flexible, but you need to know the syntax used by Google's URLs:

tbs=qdr:[name][value]

where [name] can be one of these values: s (second), n (minute), h (hour), d (day), w (week), m (month), y (year), while [value] is a number.

To find the web pages indexed less than 45 seconds ago that include the word "flu", use this URL:

http://www.google.com/search?q=flu&tbs=qdr:s45
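Such URLs can also be built programmatically. A small Python sketch follows; it assumes nothing beyond the tbs=qdr:[name][value] pattern just described:

```python
# Sketch: build a Google search URL restricted to recently indexed pages
# using the tbs=qdr:[name][value] parameter described above.
from urllib.parse import urlencode

def recent_search_url(query, unit, value=None):
    """unit is one of s/n/h/d/w/m/y; value is an optional number."""
    qdr = "qdr:%s%s" % (unit, value if value is not None else "")
    return "http://www.google.com/search?" + urlencode({"q": query, "tbs": qdr})

# Pages indexed less than 45 seconds ago that mention "flu":
print(recent_search_url("flu", "s", 45))
# http://www.google.com/search?q=flu&tbs=qdr%3As45
```

Note that urlencode percent-escapes the colon (qdr%3As45); Google accepts the escaped and unescaped forms alike.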

Unfortunately, if you restrict the results to very recent web pages, Google shows a small sample and doesn't list all the results.

Example: a search for [Tiger Woods] restricted to almost real-time results.

2009 New Google Updates | Google Grows a Larger Search Box on Home Page

Google has introduced a new home page with a larger search box that makes your query and its keyword suggestions easier to see. In Google's own words: "For us, search has always been our focus. And, starting today, you'll notice on our homepage and on our search results pages, our search box is growing in size."

Although this is a very simple idea and an even simpler change, we're excited about it — because it symbolizes our focus on search and because it makes our clean, minimalist homepage even easier and more fun to use.

The new, larger Google search box features larger text when you type so you can see your query more clearly. It also uses a larger text size for the suggestions below the search box, making it easier to select one of the possible refinements. Over the past 11 years, we've made a number of changes to our homepage. Some are small and some are large. In this case, it's a small change that makes search more prominent.
See this Old Google Home Page

Google has always been first and foremost about search, and we're committed to building and powering the best search on the web — now available through a supersized search box.

Read More Articles Visit http://googleblog.blogspot.com/

Google Caffeine New Search Index & Algorithm | Google Answer to Bing

Microsoft launched its new search engine Bing in June, positioning it as a “decision engine” which delivers more than just page results. They also partnered with Yahoo recently, gaining a license to Yahoo’s search technology. This will allow Microsoft to extend its new search engine Bing to all of Yahoo’s Web properties. Microsoft has already started to steal some search engine market share from Google. Facebook is also giving competition to Google by releasing a new realtime search engine.

Google is, however, not taking these developments lightly. For the last several months, Google has been secretly working on a new project - codenamed Caffeine - the next generation of the Google search engine. This is not going to be a minor upgrade, but an entirely new architecture for the Google search engine. In short, it will be a completely new version of Google.


The next generation search architecture, dubbed Caffeine, is still under development, but Google has released a beta version to the public. Though very little is known about Caffeine’s algorithms and inner workings, its launch comes at the right time, when Microsoft is making an all-out effort to grab a share of the search engine marketplace, where it currently sits in third position.

As per a statement released by Google, the objective of the new version of Google Search is to improve its size, indexing speed, accuracy, and comprehensiveness. There is no confirmation from Google about when it will formally launch the new Caffeine architecture.

We tested Caffeine for a few keyword phrases and found that it is much faster and has more indexed pages. Search results are obviously a little different from the ones you get in the current version of Google. Why don’t you take a test drive yourself to see how the new Google search engine works?

2009 SEO New Updates | Google Changed New Tactics for SEO

Some of the SEO changes that took place in 2009 are :

Google improves flash indexing :

Google has enhanced its search engine’s capacity to index Adobe Flash files, which are very popular on the Web but tricky for search engine spiders. Google and Adobe created a new algorithm that indexes text content in Flash. As a result, Googlebot now indexes textual content in .SWF files of all kinds and extracts URLs embedded in Flash.

Canonical Link Element:

In short, the canonical element is a line of code that you add to pages that may be duplicates. The canonical link element rel=”canonical” is added within the <head> of a page, typically by developers, as follows:
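A sketch of the element (the URL is a placeholder; the tag goes inside the page's head section):

```html
<head>
  <!-- Tell engines this URL is the "proper" version of the page -->
  <link rel="canonical" href="http://www.example.com/product.php" />
</head>
```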

In this code, you designate the “canonical,” or “proper,” URL. Engines, in turn, note this URL and apply link popularity and authority to the canonical version instead of applying them to duplicate URLs.

This provides a hint to a search engine about which is the most important or original page if pages are identical or similar in terms of content, but have a different URL.


Two major updates were :

i) Regular PR and Backlink Updates :

Usually a Google PageRank update occurs every three months, so everyone was expecting a PageRank update in the month of March. Lots of rumors and gossip spread among webmasters, and finally the wait ended in the first week of April. This time Google updated both PageRank and backlinks. But this year Google gave two more surprises, as Google danced similarly in June and July, which had a great impact on all the working strategies of SEO.

ii) Search Refinements and Snippets :

There were three changes on the Google search result pages which were visible to all the users.

1. The snippet length for search results has increased. Earlier the snippet was two lines; Google now displays an extended snippet of three lines for queries that consist of three or more keywords. The idea behind this change is that these multi-keyword queries are highly targeted and complex, so a short snippet might not contain enough information.

2. Another change is that Google now shows more related searches at the bottom of the search results page. It is very important that you optimize the different pages of your website for different keywords.

3. The third change is that Google now shows local results based on the searcher's IP address. That means you will get different results than people in another city.


i) Google Show Options

Google announced a new set of features called Search Options (or Show Options), a collection of tools that rearrange the results so they can be viewed from different user perspectives.


ii) Microsoft New Search Engine BING Launched

On May 28th Microsoft publicly unveiled its soon-to-launch search engine Bing, which became publicly available on June 3. Bing is Microsoft’s new decision engine. The home page features a rotation of stunning photography, for instance, which can be clicked to produce related image search results. In search presentation, Bing wins. It uses technology from Powerset (a search technology company Microsoft acquired) to display refined versions of your query down the left side of the page. Bing gives a menu of “related searches” that includes Walkthrough, News, and so on.

No More Page Rank Sculpting:

PageRank sculpting is an SEO tactic that involves adding the nofollow attribute to links for which PageRank flow is not necessary. Early in 2005, the main search engines introduced a ‘nofollow’ attribute for the A HREF HTML element used to code hyperlinks.
PageRank sculpting used to be really effective around a year ago, but Matt Cutts finally announced on his blog that it’s not the most effective way to utilize your PageRank and is no longer recommended.
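For reference, a nofollow'd link looks like this in HTML (the URL is a placeholder; a login page is a typical candidate since PageRank doesn't need to flow there):

```html
<a href="http://www.example.com/login" rel="nofollow">Log in</a>
```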

Webmaster Tools “Summer Shine”

In July 2009 the Google webmaster team named their new update “Summer Shine”, which added a new feel to Webmaster Tools. Among the major updates are:
1. Site Selector Update
2. Sitelink Update
3. URL Removal Request Update
4. Home Page Update

Next Generation Architecture for web Search - Google’s Caffeine :

On 10th August Google asked people to help test its next-generation infrastructure. This change was made to improve the indexing capability, speed, and accuracy of search results.

Google has provided a preview of the new infrastructure at www2.sandbox.google.com and invited web developers to try searches there.

Avoid 5 Black Hat SEO Techniques

Whether or not you think Black Hat SEO is bad, you should avoid it anyway, because it can get you banned from the search engines, or at least reduce your ranking. Google has been known to remove sites it felt weren't playing fair. Granted, this isn't likely, but why take the risk? Also, much Black Hat SEO involves fairly technical work. If this article is your introduction to SEO, you likely don't have the skills to be a successful Black Hatter anyway -- at least one who doesn't get caught.

If you want to stay on Google's good side, here are some things to avoid:

* Invisible text : Don't put white text on a white background. In fact, don't put even very light yellow on a white background. The engines aren't stupid; just because the colors aren't exactly the same doesn't mean they can't figure out there's no contrast. Yes, there are clever ways to try to fool Google about what the background color actually is, but Google is probably aware of most of them anyway, and I won't cover them besides.

* Cloaking : Google knows what's on your site because its automated robot, Googlebot, periodically visits all the pages in its index and grabs the page content so it can analyze it later. Cloaking means showing one page to Googlebot and a completely different page to real human visitors. Google truly despises this.

* Keyword Stuffing : The engines want your pages to be natural. Cramming your keywords into every possible place on your pages -- or worse, including a "paragraph" of nothing but keywords, especially if they're repeated ad nauseam -- is a big no-no. Do you consider pages with lists of keywords to be high quality? Neither does Google.

* Doorway pages : A doorway page is a page built specifically for the purpose of ranking well in the search engines and without any real content of its own, and which then links to the "real" destination page, or automatically redirects there. Doorway pages are a popular choice of some SEO firms, although Google has cracked down on this and many webmasters saw their pages disappear from the index. Some SEO firms call their doorway pages something else, in an effort to fool potential customers who know enough to know that they should avoid doorway pages. But a doorway page is still a doorway page even if you call it something else. Some engines may decide that an orphaned page is a doorway page, and if so then the page or the site might suffer a penalty.

* Spam : Spam has a special meaning with regards to SEO: worthless pages with no content, created specifically for the purpose of ranking well in the engines. You think they have what you're looking for, but when you get there it's just a bunch of ads or listings of other sites. The webmaster is either getting paid by the advertisers, or the page is a doorway page, with the webmaster hoping that you'll click over to the page s/he really wants you to go to.


Read More Articles Visit http://websitehelpers.com/seo/blackhat.html

15 Common SEO Mistakes

1) Focusing on wrong keywords for the site
This is a major reason sites don't rank well. Suppose your site only provides business-card manufacturing services, yet you target keywords like "printing cards", "delivering cards", and so on; those bring no value to your website. Focus on the specific keywords that describe your business rather than on many loosely related ones.

2) Leaving the Title Tags Empty
Many websites pay no attention to the title tag of the web page. The title tag is important: it is displayed in the search results as the title of your page. It is therefore important that every page's title is unique and actually present.

3) Not Utilizing the Meta Data
Meta data, i.e., the meta title, meta description, and meta keywords, may be becoming a thing of the past, but having them on your web page not only increases the relevancy of the page but also helps the spider identify the page's theme.

4) Concentrating only on Meta Tags
This is another common SEO mistake. If you think that providing proper meta data alone can put you at the top of the search engine listings, then, my dear friend(s), think again.

5) Java Script Menus
JavaScript makes menus very functional, but it is not considered SEO friendly. Some sites can't live without JavaScript, so where you cannot replace JavaScript menus with CSS, it's better to provide the menu links in the sitemap or to create footer links for the menus.

6) Lack of Maintenance
SEO is an ongoing process, so once your business starts showing up for the keywords you have optimized the site for, don't just stop. Think of other ways, methods, and processes; look at your competitors and see what they are doing; and create new strategies to keep your listing at the top.

7) Using Images for Headings
Images do make a site look more attractive and appealing to users, but using images in place of headings is a really bad idea from an SEO point of view.

8) Not choosing proper URLs
This may seem too basic to mention, but choosing the proper keyword for your domain is a very important factor in SEO. Likewise, while creating sub-pages for the website, it's always better to put keywords in the URLs of the pages. All three major search engines give them relevance, so there is no point in ignoring keyword-rich URLs.

9) More Backlinks are never better
This is another mistake many people make. Having backlinks is important for SEO, but "the more the better" does not apply here; it is quality that matters. So check the quality of the links your website is getting, not just their number.

10) Lack of Keywords in the Content
You have content on your web page, but if it lacks the keywords you are targeting, it will do the site no good. Putting important keywords on the page, particularly the one you are optimizing the page for, is very important and highly recommended from an SEO point of view.

11) Missing Robots Tags
Robots tags are as important as title tags. They tell the spider which sections of the website to crawl and which to leave alone.
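For example, a robots meta tag placed in a page's head section (the directive values shown are the standard ones; pick whichever combination fits the page):

```html
<!-- Keep this page out of the index, but let the spider follow its links -->
<meta name="robots" content="noindex, follow" />
```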

12) Directory Listing in DMOZ & Yahoo
Google's directory results come from DMOZ, and Yahoo's come from the Yahoo Directory. As these feed the major search engines, it is always better to get listed in these directories for major exposure. Even though it costs $299 to get listed in the Yahoo Directory, compared to DMOZ which is free, it's worth submitting.

13) Improper Anchor Text
It's good practice to use your keywords in anchor text and to distribute those links naturally throughout the page.

14) Insufficient Content
The content of the site has to be informative and should provide enough information to the user. No content means no traffic.

15) Not Following the Guidelines as Laid by Google, Yahoo & MSN
The major search engines have laid out guidelines for better indexing and crawling of websites; not following them leads to negative results for your site.

Get More Article Visit http://tusharvickkie.blogspot.com

What is Keyword Cannibalization ?

Keyword Cannibalization

Keyword cannibalization is when the web pages of a website compete with each other for the same keywords or key phrases. This confuses the search engine spiders about which page to serve in the search results.

In simple terms, it's like repeating the same keywords across the website to gain an advantage in the search results. But the reverse happens: the spiders cannot decide which page to show the user, as everything in front of them looks the same. Most of the time this happens by mistake, or from trying to repeat the keyword throughout the site.

The story behind Keyword Cannibalization

Search Engines consider both the "On Page factors" and the "Off Page Factors" to determine the rank of the web page. On Page factors like the Title Tags and the Content, and Off Page Factors like the number of Links to the web page, the Anchor Text to those links and the Authority of the linking sites, play a vital role in the rank of the web page.

But if you use the same keywords for multiple pages, it becomes difficult for the search engines to decide which page to show for those keywords. Moreover, search engines generally don't show multiple pages from the same website, as that is not useful for the user. So they end up showing the one page they feel is best; hence, keyword cannibalization.


The Solution to Keyword Cannibalization

  • If you already have keyword cannibalization on your website, it's better to use 301 redirects from the different pages to the main page.
  • Use unique title tags for each page.
  • Treat every page as a distinct page and write unique content that appeals to users.
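As one way to implement the 301 redirect above, assuming an Apache server with .htaccess support (both paths are placeholders):

```apache
# .htaccess: permanently redirect a duplicate page to the main page
Redirect 301 /red-widgets-2.html http://www.example.com/red-widgets.html
```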

10 Factors to Promote your Business | Web 2.0 Techniques

Basic Web 2.0 Techniques

Web 2.0 is a very effective platform that enables users to create, modify, and share information via the world wide web. Web 2.0, or social media, has a lot of significance and can be used to promote your business very effectively.

1. Social Bookmarking - From digging your content onwards, social bookmarking websites like Digg and Technorati provide an excellent way to distribute your content on the world wide web and a terrific way to promote your website.

2. Content Syndication - Create an informative article and then distribute it via RSS. It is an effective way to attract visitors every time you syndicate.

3. Article Creation - Create informative articles and distribute them to websites that accept articles.

4. Press Release Submission - Submitting press releases allows you to promote your business and gain more visibility, more traffic, and backlinks. A lot of press release websites publish press releases at no charge.

5. Classifieds - Use free classifieds where you can submit your URL. Here is the list of the same,

* Craigslist
* Google Base
* Yahoo Classifieds
* Classifieds for Free
* Text Link Exchange
* Recycler.com
* US Free Ads
* Kijiji


6. Join Social Communities - Social communities like MySpace, Facebook, etc. provide a lot of exposure for your business. Build a presence by joining these social sites and making friends, i.e., your target audience.

7. Take Part in Yahoo Answers - Ask or answer questions in the category relevant to your business on Yahoo Answers. Provide a link to your website either in your profile or in your answers.

8. Forum Posting - Search for forums relevant to your business. Sign up and start taking part in discussions or answering questions already on the forum. Create a signature and place a link to your website in it.

9. Blog Posting - Creating a blog for your corporate website provides tremendous exposure for your business. Moreover, find blogs relevant to your website and comment on them.

10. An Article in Wikipedia - Create an article on Wikipedia about your business.


Read More Articles Visit http://tusharvickkie.blogspot.com

SEO Tips & Factors | How to Create Sitemap for your Website

How to Create Sitemap for your Website

There are many third-party software tools available on the internet for creating sitemaps; even I use them when I am short of time. But it's always better to know how to create one manually. Here is the procedure for creating a sitemap for your website.

Basically, there are three steps:

1. Creating the Sitemap, in .XML file extension
2. Saving the Sitemap in the root directory of the server, and
3. Submitting the Sitemap to the search engine

1) Creating the Sitemap

Sitemap creation is very simple; just follow these steps.

* Open any text editor, like Notepad, and save the file as sitemap.xml

* Copy and paste a basic sitemap template into the file, then change the URLs to your own business domain or website.

* If your site consists of more than 4 pages, just copy the URL section of the sitemap and repeat it, changing the URLs accordingly.

* There can be more attributes for the sitemap, other than the URL, like

1. Changing frequency
2. Last Modified
3. Priority

You can add these details as per your need. For more details go to the suggested reading section down in the post.
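For reference, a minimal sitemap.xml template along these lines might look like the following (URLs, dates, and values are placeholders; the optional attributes from the list above are shown on the first URL only):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2009-09-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
  </url>
</urlset>
```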


2) Saving the Sitemap in the root directory of the website
Save the XML Sitemap in the root directory of the website.


3) Submitting the Sitemap to the Search Engines
This is the third step of sitemap creation and is very important. Submit the sitemap to the search engines; by that I mean the major three: Google, Yahoo, and MSN.

* GOOGLE - Create a Webmaster Tools account if you don't have one, or else sign in to Google Webmaster Tools. Submit the sitemap where it says sitemap submission.

* YAHOO - Create an account with Yahoo, or sign in if you have one. Add the sitemap by clicking Submit Site Feed.

* MSN - To submit the sitemap to MSN, sign in to your MSN account via Webmaster Tools, or create one if you don't have one. Go to the section where it says add sitemap and add it there. And you are done.


Read More Articles Visit http://tusharvickkie.blogspot.com

Google Search Engine | Crawling vs Indexing Factors

Crawling vs Indexing Factors

We do a lot of searches every month: we put a search term into the Google search box, and at the press of a button we are shown a zillion search results ranked by the popularity of the searched term.

Have you ever thought about how that happens? Just by entering a search term, we are shown highly accurate information. Here I try to explain the concept behind Google's search results by comparing the crawling and indexing processes of the Google search engine bot.


There are basically three processes involved in presenting search results to visitors:

* Crawling the Website
* Indexing the Website
* Serving the Results


Website Crawling

Crawling, in simple terms, is the process of finding and adding new and updated pages to the Google index. Crawling is done by Google's software, called Googlebot (also known as a search engine spider, robot, or bot). Google's crawling algorithm determines which sites to crawl, how often, and how many pages from each site.

The crawl process starts with a list of URLs, generated from previous crawls and from Google Sitemaps. Googlebot visits each of these websites, detects the links on each page, and adds them to its list of pages to crawl. New sites, changes to existing sites, and dead links are noted and used to update the Google index.
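As an illustration of the process just described, here is a toy breadth-first crawler in Python. The link graph is a hard-coded dictionary standing in for real HTTP fetches and HTML parsing, and the URLs are made up; Googlebot's actual scheduler is far more sophisticated:

```python
# Toy sketch of crawling: start from a seed list, "visit" each page,
# discover its links, and queue any new ones.
from collections import deque

LINK_GRAPH = {  # page -> links found on that page (hypothetical site)
    "example.com/": ["example.com/about", "example.com/blog"],
    "example.com/about": ["example.com/"],
    "example.com/blog": ["example.com/blog/post-1"],
    "example.com/blog/post-1": [],
}

def crawl(seeds):
    """Breadth-first crawl; returns pages in the order they were visited."""
    frontier, seen, visited = deque(seeds), set(seeds), []
    while frontier:
        page = frontier.popleft()
        visited.append(page)
        for link in LINK_GRAPH.get(page, []):  # links detected on the page
            if link not in seen:               # skip already-queued pages
                seen.add(link)
                frontier.append(link)
    return visited

print(crawl(["example.com/"]))
# ['example.com/', 'example.com/about', 'example.com/blog', 'example.com/blog/post-1']
```

The `seen` set is what keeps the crawler from looping forever on pages that link back to each other, like the home and about pages here.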

Crawling the website is an automated process; payment has nothing to do with how frequently your website is crawled.


Website Indexing

Indexing, in simpler terms, is maintaining a list of things. Here it means compiling an index of words and their locations on each page, captured while Googlebot crawls the website. In addition, Googlebot processes information included in key content tags and attributes, such as title tags and ALT attributes. Googlebot can process many content types, but not all.
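A minimal sketch of such an inverted index in Python follows. The pages and their text are made up, and a real index also stores word positions, tag context, and much more:

```python
# Sketch of the indexing step: compile an inverted index mapping each
# word to the set of pages it appears on.
pages = {
    "site.com/flu": "flu season tips",
    "site.com/tips": "seo tips and tricks",
}

def build_index(pages):
    """Map every word to the set of page URLs containing it."""
    index = {}
    for url, text in pages.items():
        for word in text.split():
            index.setdefault(word, set()).add(url)
    return index

index = build_index(pages)
print(sorted(index["tips"]))  # ['site.com/flu', 'site.com/tips']
```

Serving a result is then, at its simplest, a lookup in this index: the query word "tips" immediately yields both pages that contain it.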


The Search Results
Serving results is the process where Google returns results for the query entered by the user, based on matches with the indexed pages. The relevancy of the search results depends on a lot of factors, and based on those SEO factors Google returns the search results.

In order for your site to rank well in search results pages, it's important to make sure that Google can crawl and index your site correctly.


Read More Articles Visit http://tusharvickkie.blogspot.com