Head 2009 | Basic SEO | SEO Tips | SEO Factors | SEO Analyst Chennai

On Page SEO Factors

On page SEO factors are important because the search engines rank pages, not web sites. Of course, a web site should be optimised as much as possible throughout all its pages, but each page will be judged largely on its own merits. Optimising on page SEO factors is the easiest part of overall SEO. It’s the off page factors, getting quality backlinks to the page, that pose the most difficult problem.

If it is the home, or main, page of a web site that you wish to optimise the most, one of the best things you can do to give it a serious boost with the search engines is register a domain that is the exact keyword the page is optimised for. Having the .com is best, but a .org or .net is almost as good. This can only work for one page on a web site, of course, but it is a powerful technique to use.

The first thing you should do for general on page SEO is make the document title in the underlying HTML code match the page keyword. You can have additional words in the document title if you wish, but try to place the keyword phrase near the beginning, and keep the title fairly short overall.

Your meta tag description is used by many of the search engines in their listings, so use this feature wisely. Use your main keyword in the description near the beginning, and make the description interesting, informative, short and to the point. When a person searching for something comes across a listing that has the exact keyword they typed into the search engine in the title, the listing description and the domain name, they are more likely to click on it than all the others. The search engines will also consider such a page highly relevant, and they will rank it accordingly.

The visible part of the page can also be optimised. The page SEO factors that are important here are the main page title and the use of the main keyword in the body of the text, as well as the use of synonyms and related keywords throughout it. Place the page title in an H1 tag. The title should start with the main keyword if possible and can have additional words too.
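As a minimal sketch, the title tag, meta description and H1 described above could look like this ("blue widgets" and the site details are placeholder examples, not from any real page):

```html
<head>
  <!-- Keyword phrase first in the document title, kept short -->
  <title>Blue Widgets - Prices and Reviews</title>
  <!-- Keyword near the beginning of a short, informative description -->
  <meta name="description"
        content="Blue widgets compared by price, quality and reviews.">
</head>
<body>
  <!-- Visible page title in an H1, starting with the main keyword -->
  <h1>Blue Widgets Buying Guide</h1>
</body>
```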

The body of the text should have the main keyword phrase near the beginning and should then have it included several times more, but not too much. The density of the main keyword phrase should be around 1% to 2% or so. It can be a little higher, but keep it readable. On page SEO factors only need to be there – not overdone.
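As a rough sketch of how that 1% to 2% figure can be estimated, keyword phrase density is usually taken as occurrences of the phrase times the words in the phrase, divided by total words. This helper is purely illustrative (not a standard tool), and the sample text is deliberately tiny, so its density is far above what a real page should have:

```python
def keyword_density(text, phrase):
    """Approximate density of a keyword phrase as a percentage of total words."""
    words = text.lower().split()
    if not words:
        return 0.0
    occurrences = text.lower().count(phrase.lower())
    return 100.0 * occurrences * len(phrase.split()) / len(words)

body = ("blue widgets are popular because blue widgets "
        "last longer than other widgets")
print(round(keyword_density(body, "blue widgets"), 1))  # → 33.3
```

On a real 500-word page, using a two-word phrase three to five times lands in the 1% to 2% range.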

Top 4 Basic SEO Factors for Website and Blog

Page Load Time and Page Speed emerging as a quality factor : We’ve already seen heated discussions on it, and it’s getting a lot of focus these days. It makes sense to me that Google might consider page load times and page speed as a decisive factor in page quality for a site. It may not be a direct influential factor, but more like one of a hundred factors that decide the quality of a page. I mean, if a page is slick and fast loading, it makes life easier for both bots and users, right? So if there are two sites, one with a slower load time and one with a quicker load time, then it makes sense that Google might treat the faster one as more friendly.

I’m not saying that sites slower to load will be neglected, but they will surely miss an opportunity. Also, this doesn’t mean that sites on shared hosting servers will take a blow – no. Google is probably thinking of making it fair to all, but the crux is that faster loading sites, since they have taken the pains to make an easier user experience, will be merited overall. Let’s wait and watch.

Intra document/web page anchor placement : This is a new concept I’ve learned from Bill Slawski. He talks about a patent filing which states that search engines (Google) might match search queries with phrases/keywords inside a document/webpage and take you directly to the part of the webpage where the phrase/keyword is present, rather than to the webpage itself. This will be a time-saving exercise, but it also might take webmasters aback, as we were not prepared for it.

Social Media Influence on SERPs : Is it there yet? I don’t think so. What can be done? I’m not sure yet. But social media influence on SERPs is definitely going to be more important in the coming years, and search engines have to find methods to separate signal from noise and devise ways to feature socially popular stories/websites within SERPs as fairly as possible. This is really tricky, because there are websites that have not embraced social media yet, so it wouldn’t be fair to exclude them from the SERPs for that reason alone. But social media sure is an influencing factor, and you have to figure out how to decipher it.

Dynamic content getting more meaningful and SEO friendly : Dynamic content has been like quicksand for SEOs because of its complexities and the limitations in optimizing it. But it’s high time we recognized that we have to live with it. Not everyone wants to go “textual” and minimalistic, so search engines have to figure out what they can do to decipher dynamic content and make it search engine friendly. Parameter filter handling was a good initiative, but we need more. Article from www.dailyseoblog.com

Tips To Drive Traffic To Your Blog or Website

If you have started a blog and want to build a future with it, you need to concentrate on your traffic from the start, rather than getting your friends to click on the ads. But how do you make people come to your website, read it and come again? Analysing it in detail can take almost a year, so here are some quick, helpful tips for the new blogger.

Content : Even if you post just one post a day, it should have quality content. It’s better to have a single good post than 10 bad ones. Viewers always appreciate pages with good content, and good content is what brings them back to your site. Make your posts conversational, pithy and topical. Keep them short and stick to one topic per post. Write often and regularly so that both readers and search engines visit your blog more often.

Search Engines : one of the best sources of traffic if your site is well organised. Come up with good keywords and you will get a good page rank; the higher your page rank, the more advertisers will want to pay you. Make sure your blog URL has a good keyword in its title. Use submission sites to submit your URL to all the search engines, like Google and Yahoo, and don’t forget to submit your feed to RSS directories. There are sites that list submission directories in bulk; go there and submit your feed URL. Use feed stats to attract users, like the FeedBurner chicklet I have in my subscribe box.

Signature : put your blog address in your email signature so users know of your presence on the web. Go to every place where you find a lot of surfers, like forums, social networks and community discussions, and post your link there. Write your link in the comments of the most famous blogs, but remember that Google doesn’t count these as backlinks to your site.

Email subscription : Don’t forget to give a link on your blog to subscribe to updates through email.
Social bookmarking : Add “submit my site” links for Digg, Reddit, BlinkList, Buzz and so on at the end of each article you write, like I have on my blog. Submit your articles to these sites to get the best traffic.

Backlinking : Try to contact the authors of good blogs and request backlinks. It shouldn’t look to them like you are begging. Write a letter like this: "I have something interesting for you on my webpage. If you find my site or blog interesting, give me a link to my site on your webpage."

3 Effective Ways to block Google Crawling

Let’s see how we can, effectively, keep Google from crawling selected portions of a website.

Lock and Key Method : The most effective and surefire method is to use a lock and key mechanism: leave the pages you don’t want Google to access locked away. The recommended method is to put a username/password login gateway in front of the pages or parts of the website that you don’t want Google to crawl.

The Robots Meta Header tags : The next best thing to the lock and key method is the versatile meta header tag. Pick the pages you don’t want Google to index and add a robots meta tag with a "noindex" value to their headers. Technically this makes Google "skip" indexing the contents of the page. However, this method can be rendered ineffective if external websites link extensively to the URL: Google might still "see" the page somehow, while not indexing it. Also, when you have a large number of pages to be screened out of Google, adding the header tags can lead to complications.
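A sketch of that header tag (note that "noindex" is the value that blocks indexing; "nofollow" on its own only tells bots not to follow the page’s links):

```html
<head>
  <!-- Ask crawlers not to index this page or follow its links -->
  <meta name="robots" content="noindex, nofollow">
</head>
```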

Blocking the links with NoFollow tags : The third and easiest way to block Google from crawling certain pages/URLs is to use the "nofollow" attribute on the links that point to that particular URL. Using nofollow tells Google that its bots needn’t follow the link to crawl the contents of the page, as the content may not be useful. Even though this sounds technically okay, sometimes Google can "see through" nofollow tags.
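A sketch of such a link (the URL is a placeholder):

```html
<!-- rel="nofollow" hints that crawlers shouldn't follow this link -->
<a href="http://www.example.com/private-page.html" rel="nofollow">Private page</a>
```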

In my opinion, adding nofollow tags to links is like blocking entry to a room with a glass wall: Google’s bots won’t enter the room or index the contents there, but they can very well see through the glass wall and will have an idea of what’s in there. So there you have it. Three effective (and ineffective) ways to block Google from crawling parts/sections of your website. Each one works out well according to the conditions and circumstances, and sometimes you have to combine all three based on the nature of the pages you’re dealing with. Article from www.dailyseoblog.com

Create a SEO Friendly Blog | Important Factors for Creating Blogs

If you want to create a new blog, first think about what kind of blog it is and how it will help you gain more traffic and visitors, and therefore encourage more sales. So just what exactly is SEO, and how can you add it to your blog?

What is SEO ?

SEO stands for Search Engine Optimization; it is used to make your website rank higher in search engines. You do this by using certain keywords that surface your site on search engines such as Google and Yahoo. However, it can be difficult to get your website noticed on the major search engines, as they usually prefer a website that has been up for at least a year or two, and having it linked to from other websites is also sometimes needed.

Important SEO Factors For Blogs :

If you are looking to use SEO to promote your online home based business blog, you will need to take various factors into consideration. The first is that while it is good to use keywords to highlight your site, you should never use too many.

If you do use too many keywords, the search engines that you want to help you will actually ignore the site and consider it spam. They do not want you to overstuff your website with SEO content, because if you do, it is likely not to be of good quality. So you need content on your website that reads well and has a good number of keywords, but not too many.

The good news about a blog is that you can generally get away with using more keywords than you can in an article. This is because it makes more sense in a blog, which is shorter and is a summary of whatever it is that you want to say. However, saying that, it would still be a good idea to limit the keywords and ensure that the blog does make sense with them all included. Hence you can improve your internet business opportunity.

Finally, the main thing to keep in mind with Search Engine Optimization and blogs is that you need to make any external links that you may provide relevant to your website. If, for example, you have a website about cars, it would be pointless having a link to site about office furniture. The search engines will pick up on all of the links that you have and they will simply push your website down the list if they notice anything funny about your submission.

Twitter Making Its Search Better

Twitter is a free social networking and micro-blogging service that enables its users to send and read messages known as tweets. Tweets are text-based posts of up to 140 characters displayed on the author's profile page and delivered to the author's subscribers, who are known as followers.

Twitter is planning to make its search better to give a better user experience. Santosh Jayaram (Twitter VP of Operations) confirmed that some changes are going to happen in Twitter search. He earlier worked as a VP in Google's Search Quality team.

We should all be glad to hear about these changes, because Twitter usually stays away from major changes. Anyway, because of this, most third-party Twitter tools are making good money. Let them be happy.

They are mainly looking to change two things:
  • Indexing links in tweets
  • Reputation ranking system for tweets
Indexing Links in Tweets :

Currently, search engines are only able to crawl the text in tweets, but not the links. Twitter is planning to let search engines crawl the links along with the text in tweets, so that people can get real-time Twitter results in the search engines too. This especially gives a better search experience to users looking for timely content.

Reputation ranking system for tweets :

We all know that whenever we click on a trending topic, we see a lot of retweets. Twitter is planning to apply a reputation ranking system to these tweets. No one knows how the system is designed; even the Twitter engineers aren't aware of much, because they are still figuring out what works best. Hopefully this ranking system will give a great user experience.

The whole world is awaiting these changes.

Importance of Google Analytics

Google Analytics has major importance in online business. Tracking visitor history and doing ongoing research on analytics is a must. Through analytics we can learn about our visitors and their behaviour.

Analytics usage is increasing day by day, and analytics professionals are in higher demand in the market. Not everyone can become an analytics expert; it needs great thinking, visualisation and case study.


Every online business should take care of:

Bounce rate :

Bounce rate is something that helps you understand how visitors are leaving your site. A higher bounce rate means visitors aren't interested in your site; a lower one means visitors are happy to stay. It would be ideal to have a bounce rate of 0%, but that's not possible, so it's better to keep it low.
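As a sketch, bounce rate is usually computed as single-page visits over total visits (this little helper and its numbers are illustrative only):

```python
def bounce_rate(single_page_visits, total_visits):
    """Percentage of visits that viewed only one page before leaving."""
    if total_visits == 0:
        return 0.0
    return 100.0 * single_page_visits / total_visits

print(bounce_rate(40, 200))  # → 20.0
```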

Visit Length :

How long are visitors staying on your site on average? If most visitors stay a good amount of time, they are reading something on your site that is useful to them. So visit length always matters for gauging a viewer's interest in your site, and according to this metric you can optimise your site.

Visit Depth :

Visit depth asks: "Is the visitor showing interest in visiting the other pages of your site too?" If a visitor views most of the pages on your site and spends a good amount of time on each page, that says your site is awesome. Visiting all the pages but spending little time on them is not good, and not visiting other pages at all is bad. So a higher visit depth helps grow your business as well as your brand.

Returning Visitors :

More returning visits means you have loyal readers for your site. If a visitor comes again and again, he liked your site and found it useful; these kinds of readers are called loyal readers. The more loyal readers you have, the more popular your site is. It's very important to have more returning visits, so always try to increase the number of new visits as well as returning visits.

Most search engines consider all these factors when deciding whether to show your website in the top search results for a particular keyword. So it's important to take note of all these topics and keep working on and optimising them to succeed in your business. Comment your thoughts :)

Google Offers Proposal To Make AJAX Crawlable

It’s one of the most common pieces of SEO advice you’ll find: Don’t build your web site with AJAX if you want it to be crawled and indexed by search engines. AJAX-based web sites are essentially a locked and bolted door when a spider comes crawling.

At our SMX East conference today, Google announced a proposal to change that: a new standard that would make AJAX crawlable. If it comes to pass, and if the other major search engines go along with the idea, the proposal could serve as a green light for developers wanting to enjoy the rich features of AJAX while not sacrificing search engine visibility.

The details of Google’s proposal are, to this non-developer, highly technical and more than I care to recap here. (Read Google’s blog post, linked above, for the details.) Google’s goals in creating the new standard are summed up in less technical language:

  • Minimal changes are required as the website grows
  • Users and search engines see the same content (no cloaking)
  • Search engines can send users directly to the AJAX URL (not to a static copy)
  • Site owners have a way of verifying that their AJAX website is rendered correctly and thus that the crawler has access to all the content

Google estimates that about 70% of all web content is created dynamically, and that figure is likely to grow. “This hurts search,” Google says. “Not solving AJAX crawlability holds back progress on the web.” Those quotes are from a Google Docs presentation deck about the proposal, embedded below.

Read More Articles http://searchengineland.com/google-proposes-to-make-ajax-crawlable-27408

What are the Factors of Deep Linking

Deep linking is as important a consideration as back linking! It does not matter which page visitors use to enter our websites. If they like what they read on our internal pages, they are more likely to view other pages on our websites. If they view other pages on our website, they are likely to find our homepage, and we will get a chance to tell them why they should buy our products or services.

Deep links to our website help to ensure that the search engines will have good cause to show our internal webpages as well as our homepage. For every page in our website that gets great SERP, our chances of getting a sale are increased significantly.

We have 15 pages on our website, eight of which provide real content to our prospective clients. All eight of these pages have a significant number of back links pointing to them. 48% of our visitors land on our home page. 37% of our visitors land on our internal pages. As a result, 85% of our traffic lands on our website as a result of our back links, either directly or through our natural search placement in the search engines. The remaining 15% arrive on our website through bookmarks, personal referrals, and paid listings.

What Are Deep Links?

Deep links are links that go to specific pages within your website. For example, let’s say that you have a home improvement website that has a large number of pages and articles on it telling people how to do projects. If all of your back links are pointing only to your home page and you have none pointing to specific article pages, then you are not getting the full benefit of your linking activities.

Think about it this way, if I go to your website and find a piece of information that I find particularly helpful or interesting and I want to tell other people about it, how will I do it? When I tell all my friends on my blog about this great page of yours, am I going to link to your home page? No, I am going to copy and paste the actual webpage address out of my browser, into my blog. That is deep linking and what is considered to be natural linking by the search engines.

What Are Natural Links?

Natural links are those links that are created by people other than the website’s marketing team. Suppose I posted a link in my own blog that said that the "most easily understood tutorial, I have read, for creating a php-xml parser" was: http://www.sitepoint.com/article/php-xml-parsing-rss-1-0 , and I put my quoted text into the link. That is a natural link, because I created the link with no prompting from the management at SitePoint.com.

Difficulties In Creating Deep Links

There are a few problems that you will run into when trying to create deep links to your site. One problem is that if you ask a Webmaster of another site to link to you, they will most likely just link to your home page. When you submit to directories, the vast majority of them will only allow you a link to your home page, not a deep link. Even if they do allow you to submit a deep link, they will not allow you to submit 10 deep links.

Success Tips For Creating Deep Links

Deep linking is quite a bit easier when utilizing free reprint articles as a part of your link building campaign. This is because you can put whatever link you want to put in the "About The Author" box. The About The Author box is required to stay intact in all websites that are using your article. If you intend on writing a large number of articles to promote your domain, then you will want to optimize your results by putting a different deep link into the About The Author box for each of the articles that you write.

Another method of doing this is free and easy, but requires a bit of time. Take keywords in each page of the text on your website and make a hyperlink on that word or phrase to another page on your site. This is very easily done if you know how to do basic HTML. The ultimate goal here is to have every page of your website linked to, at least once, by another page on your site. You will want to spread these out among your domain’s webpages, instead of having just a couple of pages linking to the other 50 pages.
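A sketch of such an in-text internal link (page names and the keyword are placeholders):

```html
<!-- A keyword in running text hyperlinked to another page on the same site -->
<p>Our guide to <a href="/deck-building.html">deck building</a> covers
materials, tools and costs for the whole project.</p>
```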

Social Bookmarking Secrets | Effective way of Doing Social Bookmarking

In the Web 2.0 concept, social bookmarking has its own place on the internet, even as social media is on the rise. It's a process where people can find links, bookmark or favourite those links, and also vote for them. The more votes a link gets, the more popular the bookmark is.

Most people don't even know the right process for social bookmarking. It's also an art: if you know how to perform the art perfectly, your profile will get more visits and your links will get more votes.

I would like to explain the right way to perform social bookmarking, described step by step below:
  • Complete your entire profile with info that looks professional
  • Search for authoritative profiles, then vote and comment on their links
  • Add authoritative profiles to your friends
  • Build a professional relationship with them
  • Be the first to vote and comment on their links
  • As a result you will get good comments and votes on your links
  • Provide different kinds of good links that catch the public's interest.
List of Social Bookmarking sites :
  • Digg.com
  • Reddit.com
  • Hotklix.com
  • Zimbio.com
  • Wikio.com
  • Current.com
  • Shoutwire.com
  • StumbleUpon.com
  • Technorati.com
  • Del.icio.us
  • Propeller.com
  • Simpy.com
  • Slashdot.org .......... etc...
The main problem here is that most people create profiles and submit their own website links again and again in the same profile; this causes the account to be banned by the site administrator, and also gives you a bad reputation on the internet. I hope you are clear on the instructions; if not, let me know so I can correct myself. Feel free to ask questions. Looking forward to your comments.

Anchor Text Optimization | On Page SEO Techniques

Getting quality keyword anchored backlinks is the name of the game in SEO.

When obtaining backlinks to your site, it is important to include your keyword as the text for the link! Google uses the anchor text to make choices about the content of your page and how to rank you for certain keywords.
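For example, a keyword-anchored link versus a generic one (the URL and keyword are placeholders):

```html
<!-- Generic anchor: tells Google little about the target page -->
<a href="http://www.example.com/">click here</a>

<!-- Keyword anchor: associates the target page with "antique clocks" -->
<a href="http://www.example.com/">antique clocks</a>
```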

5 Ways to Anchor Your Links Like a Newbie

1. Use “here” / “click here” / “my website” when you link to your site. Congrats, you just told Google your site is about “here” or “click here”; not “clocks”, “clothing”, “video games”, “credit cards”, or “loans”.

2. Using your name as your primary anchor text. You’ll rank really well for your name, but you’re going to have one hell of a time making money from that. When was the last time you saw massive demand for “Frank”?

3. Using your business name. So you’ll rank really well for your business name, but this will do nothing but provide traffic from people who already know who you are. What about bringing in new people? Enough people will link to you with your site / business name, so you’ll earn the top spot for your name eventually by default, especially if your business name is your URL.

4. Anchoring your online nickname. This is even worse than your name. Who is ever going to search for CoolDuDz1978 or something crazy like that?

5. Anchoring the same thing over and over. This is the next step newbies take. You tell them “anchor your keyword” and they go get 1000 exact keyword anchored links over night. Google bomb much? Google will see this as spam and could actually hurt you. Mix it up.

5 Ways To Anchor Links Like a Pro

1. Anchor your keyword, related keywords, and long tail keywords.
2. Anchor deep pages with keywords.
3. Develop a “keyword” nick name. *cough* *cough* SEO Tips & Tricks / Basic seo Factors *cough* *cough*
4. Develop a keyword sitename. Court’s “Internet Marketing” School. <– f’ing great.
5. Get creative with your anchor text. My friend works for a company called “Paramore Redd”. They always anchor “Designed by: Paramore Redd”. Why not anchor something like “P|R Nashville Online Marketing”?
6. Use your keyword in your site title, like “How to Make Money Online“, that way people will use the keyword as the anchor text when they refer to you.

Looking at someone’s anchor text really shows how much they know about SEO. And by links, I’m talking about links pointing to your site from other sites organically. Don’t get this confused with the type of linking Court is talking about in his post How To Make Money Posting Links On Google. Those types of links aren’t real links at all, but contextual advertisements; those “Google links” are actually AdSense ads. But anyway, the role anchor text plays in your SERPs success is HUGE. Do not get caught anchoring like a newbie. Put some thought into creative ways to get the anchors you want. Of course, sometimes you have to go with the site name or your name, but try to get as many keyword-anchored links as you can.

Google Chrome 3 is Released | Download Google Chrome 3

Two weeks ago Google launched Google Chrome 2, and now Google Chrome 3. Yes, it seems that Google is in a hurry to compete on version numbers with other Internet browsers. There is not much difference in Chrome 3 compared to the last released version.

The published Chromium developer release notes for Chrome 3 describe 16 fixes, which means there are no major enhancements in the browser. Some of the fixes are multiple crash fixes, bookmarks extension fixes, tab enhancements, localhost problem fixes, etc. Google Chrome 3 has been released for MS Windows PCs only. As there is no prominent dividing line between the functions of Chrome 2 and Chrome 3, read the Chrome 3 release notes before moving to the new version.

A feature you won't probably use too often, at least for now, is the support for the HTML5 video and audio tags. Like Firefox 3.5, Chrome includes video codecs that allow you to embed videos without using slow and unreliable plug-ins like Adobe Flash. You can test this feature in TinyVid.com, an experimental Ogg video uploading site, or in YouTube's HTML5 demo page, which uses an H.264 video.

One year after the first release, the numbers are impressive: "51 developer, 21 beta and 15 stable updates and 3,505 bugfixes". Google Chrome's market share is 2.84%, according to Net Applications, but the browser's impact was even more significant: Chrome set a high standard for browsers by focusing on speed, a simplified user interface and by handling web pages as if they were applications. Safari 4, as well as the next versions of Firefox, are influenced by Google Chrome's simplicity.

In other Chrome news, the documentation for creating extensions is now available and the support for extensions is enabled by default in the dev channel. If you use the stable version of Chrome, you need to wait a little bit.

Download Google Chrome 3

Sitemap Autodiscovery | Submit XML Sitemaps with Robots.txt

Search engines and robots.txt:

Since the birth of internet search engines, the robots.txt file has been how webmasters could let search engines like Google know what content should get crawled and indexed. However, as part of Google Sitemaps, later named the XML Sitemaps Protocol, the usage was expanded with Sitemaps Autodiscovery. It is now possible for webmasters to direct search engines to the website's XML sitemap. The moment a search engine has found your website and the robots.txt file, it will also know where to find your XML sitemap.

Submit your XML sitemaps:
In the beginning, XML sitemap submission required you to create and verify a Google Webmaster Tools account, and you also had to submit your sitemap files manually. Now, instead of submitting XML sitemaps to each search engine individually, you can be done with them all in seconds.

To add XML sitemap autodiscovery to a robots.txt file, add the fully qualified XML sitemap file path like this: Sitemap: http://www.example.com/sitemap.xml.

Complete robots.txt example for XML sitemaps autodiscovery
User-agent: *
Disallow:
Sitemap: http://www.example.com/sitemap.xml

If you have created a sitemap index file, you can also reference that:
User-agent: *
Disallow:
Sitemap: http://www.example.com/sitemap-index.xml

Manual XML sitemap submission
There can be good reasons to submit your sitemaps manually the first time, e.g. to get acquainted with the different search engines and the webmaster tools available:

* Google Webmaster Tools
* Live Webmaster Tools (MSN)
* Yahoo SiteExplorer

Advanced manage and submit sitemaps
In the beginning, no search engine supported cross-submitting multiple websites in one XML sitemap file. However, most now include support for new ways of managing sitemaps across multiple sites. The requirement is that you verify ownership of all the websites:

* Sitemaps protocol: Cross sitemaps submit and manage using robots.txt.
* Google: More website verification methods than sitemaps protocol defines.

Read More Articles Visit http://www.xml-sitemaps-generator.com

Google Introduced | Recent Google Search Results

Google introduced new factors: you can now find your pages indexed in Google within seconds, minutes, an hour and so on. For example: pages indexed within 46 seconds, a minute, an hour, a week, a month. These are the factors Google introduced; please read this story.

Ran Geva noticed that Google's date range restrictions have been extended and you can now find web pages indexed by Google less than one minute ago or even less than 10 seconds ago.

(Update: Google doesn't necessarily index web pages as soon as they're published, but sites that use feeds or sitemaps are indexed pretty fast. With recent advancements like PubSubHubbub that provide real-time notifications for updates, the delay between publishing pages and finding them using Google will be further reduced.)

Click on "show options", select "past 24 hours" and tweak the URL by replacing "tbs=qdr:d" with "tbs=qdr:n" to find pages indexed in the past minute.

The date restriction feature is quite flexible, but you need to know the syntax used by Google's URLs:

tbs=qdr:[name][value]

where [name] can be one of these values: s (second), n (minute), h (hour), d (day), w (week), m (month), y (year), while [value] is a number.

To find the web pages indexed less than 45 seconds ago that include the word "flu", use this URL:

http://www.google.com/search?q=flu&tbs=qdr:s45
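Building such a URL can be sketched in a few lines; whether Google still honors the `tbs` parameter today is not guaranteed:

```python
# Sketch: build a Google search URL restricted to recently indexed pages,
# using the tbs=qdr:[name][value] syntax described above.
from urllib.parse import urlencode

UNITS = {"second": "s", "minute": "n", "hour": "h",
         "day": "d", "week": "w", "month": "m", "year": "y"}

def recent_search_url(query, unit, value=None):
    # Omitting value gives the bare form, e.g. tbs=qdr:d for "past day".
    qdr = "qdr:" + UNITS[unit] + ("" if value is None else str(value))
    return "http://www.google.com/search?" + urlencode({"q": query, "tbs": qdr})

print(recent_search_url("flu", "second", 45))
# http://www.google.com/search?q=flu&tbs=qdr%3As45  (the colon is URL-encoded)
```

Note that `urlencode` escapes the colon as `%3A`; the hand-written form with a literal `:` works the same way.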

Unfortunately, if you restrict the results to very recent web pages, Google shows a small sample and doesn't list all the results.

Example: a search for [Tiger Woods] restricted to almost real-time results.

2009 New Google Updates | Google Grows a Larger Search Box on Home Page

Google has introduced a larger search box on its home page, which also makes the keyword suggestions easier to read. In Google's words: "For us, search has always been our focus. And, starting today, you'll notice on our homepage and on our search results pages, our search box is growing in size."

Although this is a very simple idea and an even simpler change, we're excited about it — because it symbolizes our focus on search and because it makes our clean, minimalist homepage even easier and more fun to use.

The new, larger Google search box features larger text when you type, so you can see your query more clearly. It also uses a larger text size for the suggestions below the search box, making it easier to select one of the possible refinements.

Over the past 11 years we've made a number of changes to our homepage. Some are small and some are large. In this case, it's a small change that makes search more prominent.
See the old Google home page for comparison.

Google has always been first and foremost about search, and we're committed to building and powering the best search on the web — now available through a supersized search box.

Read more articles at http://googleblog.blogspot.com/

Google Caffeine New Search Index & Algorithm | Google Answer to Bing

Microsoft launched its new search engine Bing in June, positioning it as a "decision engine" which delivers more than just page results. Microsoft also partnered with Yahoo recently, gaining a license to Yahoo's search technology. This will allow Microsoft to extend its new search engine Bing to all of Yahoo's web properties. Microsoft has already started to take some search engine market share from Google. Facebook is also competing with Google by releasing a new real-time search engine.

Google is, however, not taking these developments lightly. For the last several months, Google has been quietly working on a new project, codenamed Caffeine: the next generation of the Google search engine. This is not going to be a minor upgrade, but an entirely new architecture for Google search. In short, it will be a completely new version of Google.


The next-generation search architecture dubbed Caffeine is still under development, but Google has released a public beta version. Though very little is known about Caffeine's algorithms and inner workings, its launch comes at the right time, just as Microsoft is making an all-out effort to grab a share of the search engine marketplace, where it is currently in third position.

According to a statement released by Google, the objective for the new version of Google Search is to improve its index size, indexing speed, accuracy, and comprehensiveness. There is no confirmation from Google about when it will formally launch the new Caffeine architecture.

We tested Caffeine for a few keyword phrases and found that it is much faster and has more indexed pages. Search results are, unsurprisingly, a little different from the ones you get in the current version of Google. Why don't you take a test drive yourself to see how the new Google search engine works?

2009 SEO New Updates | Google Changed New Tactics for SEO

Some of the SEO changes that took place in 2009 are:

Google improves Flash indexing:

Google has enhanced its search engine's ability to index Adobe Flash files, which are very popular on the Web but tricky for search engine spiders. Google and Adobe created a new algorithm that indexes the text content in Flash. As a result, Googlebot now indexes the textual content of .SWF files of all kinds and extracts URLs embedded in Flash.

Canonical Link Element:

In short, the canonical element is a line of code that you add to pages that may be duplicates. The canonical link element rel="canonical" is added within the <head> section of a page, typically by developers.
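A minimal sketch of the element; the URL here is a placeholder for the page's preferred address:

```python
# Sketch: emit the canonical link element for a page's <head> section.
# The URL below is a placeholder, not from the original article.
def canonical_tag(url):
    return '<link rel="canonical" href="%s" />' % url

print(canonical_tag("http://www.example.com/product"))
# <link rel="canonical" href="http://www.example.com/product" />
```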

In this code, you designate the “canonical,” or “proper,” URL. Engines, in turn, note this URL and apply link popularity and authority to the canonical version instead of applying them to duplicate URLs.

This provides a hint to the search engine about which page is the most important or original when pages are identical or similar in content but have different URLs.


Two major updates were :

i) Regular PR and Backlink Updates:

A Google PageRank update usually occurs every three months, so everyone was expecting a PageRank update in March. Rumors and gossip spread among webmasters, and the wait finally ended in the first week of April. This time Google updated both PageRank and backlink data. But this year Google gave two more surprises, with similar updates in June and July, which had a great impact on SEO working strategies.

ii) Search Refinements and Snippets:

There were three changes on the Google search result pages, visible to all users.

1. The snippet length for the search results has increased. Earlier, the snippet was two lines long. Google now displays an extended snippet of three lines for queries consisting of three or more keywords. The idea behind this change is that these multi-keyword queries are more targeted and complex, so a short snippet might not contain enough information.

2. Another change is that Google now shows more related searches at the bottom of the search results page. It is very important that you optimize the different pages of your website for different keywords.

3. The third change is that Google now shows local results based on the IP address of the searcher. That means you may see different results than people in another city.


i) Google Show Options

Google announced a new set of features called Search Options (the "Show options" link on results pages): a collection of tools that rearrange the results so they can be viewed from different users' perspectives.


ii) Microsoft New Search Engine BING Launched

On May 28th Microsoft publicly unveiled its soon-to-launch search engine Bing, which became publicly available on June 3. Bing is Microsoft's new decision engine. The home page features a rotation of stunning photography, which can be clicked to produce related image search results. In search presentation, Bing wins. It uses technology from Powerset (a search technology company Microsoft acquired) to display refined versions of your query down the left side of the page. Bing also gives a menu of "related searches" that includes things like Walkthrough, News, and so on.

No More Page Rank Sculpting:

PageRank sculpting is an SEO tactic that involves adding the nofollow attribute to links through which PageRank flow is not wanted. Early in 2005, the main search engines introduced a 'nofollow' attribute for the HTML anchor (A HREF) element used to code hyperlinks.
PageRank sculpting used to be considered effective, but Matt Cutts has since announced on his blog that it is not an effective way to allocate your PageRank and is not recommended.
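For reference, the tactic's building block is a hyperlink carrying the nofollow attribute; the URL and anchor text below are placeholders:

```python
# Sketch: a hyperlink carrying rel="nofollow". Search engines that honor
# nofollow do not pass PageRank through such a link.
def nofollow_link(url, anchor_text):
    return '<a href="%s" rel="nofollow">%s</a>' % (url, anchor_text)

print(nofollow_link("http://www.example.com/page", "example anchor"))
# <a href="http://www.example.com/page" rel="nofollow">example anchor</a>
```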

Webmaster “Summer Shine”

In July 2009 the Google webmaster team named their new update "Summer Shine". It gave Webmaster Tools a new look and feel. Among the major updates are:
1. Site Selector update
2. Site Link update
3. URL Removal Request update
4. Home Page update

Next Generation Architecture for Web Search - Google's Caffeine:

On August 10th, Google asked people to help test its next-generation infrastructure. The change was made to improve the indexing capability, speed, and accuracy of the search results.

Google has provided a preview of the new infrastructure at www2.sandbox.google.com and invited web developers to try searches there.

Avoid 5 Black Hat SEO Techniques

Whether or not you think Black Hat SEO is bad, you should avoid it anyway, because it can get you banned from the search engines, or at least reduce your ranking. Google has been known to remove sites it felt weren't playing fair. Granted, this isn't likely, but why take the risk? Also, much Black Hat SEO involves fairly technical work. If this article is your introduction to SEO, you likely don't have the skills to be a successful Black Hatter anyway -- at least not one who doesn't get caught.

If you want to stay on Google's good side, here are some things to avoid:

* Invisible text : Don't put white text on a white background. In fact, don't even put very light yellow on a white background. The engines aren't stupid; just because the colors aren't exactly the same doesn't mean they can't figure out there's no contrast. Yes, there are clever ways to try to fool Google about what the background color actually is, but Google is probably aware of most of them, and I won't cover them here.

* Cloaking : Google knows what's on your site because periodically its automated robot, called Googlebot, visits all the pages in its index and grabs the page content so it can analyze it later. Cloaking means showing one page to Googlebot and a completely different page to real human visitors. Google despises this practice.

* Keyword Stuffing : The engines want your pages to be natural. Finding every place to cram your keywords onto your pages -- or worse, including a "paragraph" of nothing but keywords, especially if they're repeated ad nauseam -- is a big no-no. Do you consider pages with lists of keywords to be high quality? Neither does Google.

* Doorway pages : A doorway page is a page built specifically for the purpose of ranking well in the search engines and without any real content of its own, and which then links to the "real" destination page, or automatically redirects there. Doorway pages are a popular choice of some SEO firms, although Google has cracked down on this and many webmasters saw their pages disappear from the index. Some SEO firms call their doorway pages something else, in an effort to fool potential customers who know enough to know that they should avoid doorway pages. But a doorway page is still a doorway page even if you call it something else. Some engines may decide that an orphaned page is a doorway page, and if so then the page or the site might suffer a penalty.

* Spam : Spam has a special meaning with regards to SEO: worthless pages with no content, created specifically for the purpose of ranking well in the engines. You think they have what you're looking for, but when you get there it's just a bunch of ads or listings of other sites. The webmaster is either getting paid by the advertisers, or the page is a doorway page, with the webmaster hoping that you'll click over to the page s/he really wants you to go to.


Read more articles at http://websitehelpers.com/seo/blackhat.html

15 Common SEO Mistakes

1) Focusing on the wrong keywords for the site
This is a major reason sites fail to rank well. Suppose your site only provides business card manufacturing services, but you target keywords like printing cards, delivering cards, and so on. Those keywords add no value to your website; focus on your specific keyword rather than on many unrelated business keywords.

2) Leaving the Title Tags Empty
I have seen websites that pay no attention to the title tags of their pages. The title tag is important: it is displayed in the search results as the title of the listing. It is therefore important that the title of each page be unique and present on the page.

3) Not Utilizing the Meta Data
Meta data, i.e., the meta title, the meta description, and the meta keywords, may be becoming a thing of the past, but having them in your web page not only increases the relevancy of the page but also helps the spider identify its theme.

4) Concentrating only on Meta Tags
This is another common SEO mistake. If you think that providing proper meta data alone can put you at the top of the search engine listings, my dear friend, think again.

5) JavaScript Menus
JavaScript makes menus very functional but is not considered SEO friendly. Some sites can't live without JavaScript; where you cannot replace the JavaScript menus with CSS, it's better to provide the menu links in the sitemap, or to create footer links for the menus.

6) Lack of Maintenance
SEO is an ongoing process. Once your business starts showing up for the keywords you optimized the site for, don't just stop. Think of other methods and processes, look at what your competitors are doing, and create new strategies to keep your listing at the top.

7) Using Images for Headings
Images make a site look more attractive and appealing to users, but using images in place of headings is a bad idea from an SEO point of view.

8) Not choosing proper URLs
This may seem too basic to mention, but choosing the proper keyword for your domain is a very important SEO factor. Likewise, when creating sub-pages for the website, it's better to put keywords in the URLs of the pages. All three major search engines give relevance to keywords in URLs, so there is no point in ignoring them.

9) More backlinks are not always better
This is another common SEO misconception. Backlinks are important for SEO, but "the more the better" does not apply here; quality matters more than quantity. So check the quality of the links your website is getting, not just their number.

10) Lack of Keywords in the Content
If the content of your page lacks your keywords, it will do the site no good. Putting important keywords on the page, particularly the one you are optimizing the page for, is very important and highly recommended from an SEO point of view.

11) Missing Robots Tags
Robots tags are as important as title tags. They tell the spider which sections of the website to crawl and which to leave alone.

12) Directory Listing in DMOZ & Yahoo
Google's directory draws on DMOZ, and Yahoo draws on the Yahoo Directory. As these feed the major search engines, it is always better to get listed in these directories for exposure. Even though the Yahoo Directory charges $299 for a listing, while DMOZ is free, it's worth submitting to both.

13) Improper Anchor Text
Use your keywords in the anchor text of links, and distribute such links naturally throughout the page.

14) Insufficient Content
The content of the site has to be informative and should provide enough information to the user. No content means no traffic.

15) Not Following the Guidelines Laid Down by Google, Yahoo & MSN
The major search engines have laid down guidelines for better indexing and crawling of websites; not following them can lead to negative results for your website.
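The robots meta tag mentioned in mistake 11 can be sketched as follows; the attribute values shown are the standard index/noindex and follow/nofollow directives:

```python
# Sketch: the robots meta tag controls indexing (index/noindex) and link
# following (follow/nofollow) for a single page; "index, follow" is the default.
def robots_meta(index=True, follow=True):
    content = ("index" if index else "noindex") + ", " + \
              ("follow" if follow else "nofollow")
    return '<meta name="robots" content="%s" />' % content

print(robots_meta(index=False, follow=True))
# <meta name="robots" content="noindex, follow" />
```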

Read more articles at http://tusharvickkie.blogspot.com

What is Keyword Cannibalization?

Keyword Cannibalization

Keyword cannibalization occurs when the pages of a website compete with each other for the same keywords or key phrases. This confuses the search engine spiders about which page to serve in the search results.

In simple terms, it's like repeating the same keywords across the website to gain an advantage in the search results. But the reverse happens: the spiders cannot decide which page to show the user, as every page looks the same to them. Most of the time this happens by mistake, or in an effort to repeat the keyword throughout the site.

The story behind Keyword Cannibalization

Search Engines consider both the "On Page factors" and the "Off Page Factors" to determine the rank of the web page. On Page factors like the Title Tags and the Content, and Off Page Factors like the number of Links to the web page, the Anchor Text to those links and the Authority of the linking sites, play a vital role in the rank of the web page.

But if you use the same keywords for multiple pages, it becomes difficult for the search engines to decide which page to show for the keywords typed into the search box. Moreover, search engines generally don't show multiple pages from the same website, as that is not useful to the user. So they end up showing the one page they judge best; hence, keyword cannibalization.


The Solution to Keyword Cannibalization

  • If you already have keyword cannibalization on your website, use 301 redirects from the duplicate pages to the main page.
  • Use a unique title tag for each page.
  • Treat every page as a distinct page, and write unique content that appeals to the users.

10 Factors to Promote your Business | Web 2.0 Techniques

Basic Factors Web 2.0 Techniques

Web 2.0 is a very effective platform which enables users to create, modify and share information via the world wide web. Web 2.0, or social media, has a lot of significance and can be used to promote your business very effectively.

1. Social Bookmarking - Social bookmarking and news websites like Digg and Technorati provide an excellent way to distribute your content on the world wide web, and a terrific way to promote your website.

2. Content Syndication - Create an informative article, and then distribute it via RSS. It provides an effective way to attract visitors every time you syndicate it.

3. Article Creation - Create informative articles and distribute them to websites that accept articles.

4. Press Release Submission - Submitting press releases to press release websites allows you to promote your business and get more visibility, traffic, and backlinks. There are a lot of press release websites which publish press releases at no charge.

5. Classifieds - Use free classified sites where you can submit your URL. Here is a list:

* Craigslist
* Google Base
* Yahoo Classifieds
* Classifieds for Free
* Text Link Exchange
* Recycler.com
* US Free Ads
* Kijiji


6. Join Social Communities - Social communities like MySpace and Facebook provide a lot of exposure for your business. Build a community by joining any of the social sites and making friends, i.e., your target audience.

7. Take Part in Yahoo Answers - Ask questions, or answer questions in the category relevant to your business, on Yahoo Answers. Provide a link to your website either in your profile or in your answers.

8. Forum Posting - Search for forums relevant to your business. Sign up and take part in the discussions, or answer the questions already on the forum. Create a signature with a link to your website in it.

9. Blog Posting - Creating a blog for your corporate website provides tremendous exposure for your business. Also, find blogs relevant to your website and comment on them.

10. An Article in Wikipedia - Create an article in Wikipedia about your business.


Read more articles at http://tusharvickkie.blogspot.com

SEO Tips & Factors | How to Create Sitemap for your Website

How to Create Sitemap for your Website

There are many third-party tools available on the internet for creating sitemaps. Even I use the tools when I am short of time. But it's always better to know how to create one manually. Here is the procedure for creating a sitemap for your website.

Basically there are three steps:

1. Creating the sitemap, as a .xml file
2. Saving the sitemap in the root directory of the server, and
3. Submitting the sitemap to the search engines

1) Creating the Sitemap

The sitemap creation is very simple; just follow these steps.

* Open any text editor, like Notepad, and save the file as sitemap.xml

* Copy and paste the sitemap template (shown in a figure that is not reproduced in this copy), then change the URL to your business domain or website.

* If your site consists of more than four pages, copy the URL section of the sitemap and change the URLs accordingly.

* There can be more attributes for each URL in the sitemap, other than the location, such as:

1. Change frequency (changefreq)
2. Last modified date (lastmod)
3. Priority (priority)

You can add these details as per your need. For more details, see the suggested reading section at the end of the post.
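Since the template figure is not reproduced here, the structure can be sketched with a short script that emits a minimal sitemap following the sitemaps.org protocol; the URL and values are placeholders:

```python
# Sketch: emit a minimal sitemap.xml following the sitemaps.org protocol.
# The URL, date, and values below are placeholders for your own pages.
from xml.sax.saxutils import escape

HEADER = ('<?xml version="1.0" encoding="UTF-8"?>\n'
          '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')

def sitemap(urls):
    entries = []
    for loc, lastmod, changefreq, priority in urls:
        entries.append(
            "  <url>\n"
            "    <loc>%s</loc>\n"
            "    <lastmod>%s</lastmod>\n"
            "    <changefreq>%s</changefreq>\n"
            "    <priority>%s</priority>\n"
            "  </url>\n" % (escape(loc), lastmod, changefreq, priority))
    return HEADER + "".join(entries) + "</urlset>\n"

print(sitemap([("http://www.example.com/", "2009-10-01", "weekly", "1.0")]))
```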


2) Saving the Sitemap in the root directory of the website
Save the XML Sitemap in the root directory of the website.


3) Submitting the Sitemap to the Search Engines
This is the third step of sitemap creation and is very important. Submit the sitemap to the search engines; by that I mean submit it to the three major ones: Google, Yahoo, and MSN.

* GOOGLE - Create a Google Webmaster Tools account if you don't have one, or else sign in. Submit the sitemap in the sitemap submission section.

* YAHOO - Create a Yahoo account, or sign in. Add the sitemap by clicking Submit Site Feed.

* MSN - Sign in to your account in MSN Webmaster Tools, or create one if you don't have one. Go to the section that says add sitemap, and add it there. And you are done.


Read more articles at http://tusharvickkie.blogspot.com

Google Search Engine | Crawling vs Indexing Factors

Crawling vs Indexing Factors

We do a lot of searches every month: we put a search term into the Google search box, and at the press of a button we are shown countless results based on the popularity of the searched term.

Have we ever thought about how this happens? Just by entering search terms, we are shown highly accurate information. Below, I outline the concept behind Google's search results by comparing the crawling and indexing processes of the Google search engine bot.

There are basically three processes involved in presenting search results to visitors:

* Crawling the Website
* Indexing the Website
* Serving the Results


Website Crawling

Crawling, in simple terms, is the process of finding new and updated pages and adding them to the Google index. Crawling is done by Google's software, called Googlebot (also known as a search engine spider, robot, or bot). Google's crawling algorithm determines which sites to crawl, how often, and how many pages from each site.

The crawl process starts with a list of URLs generated from previous crawls and from Google Sitemaps. Googlebot visits each of these websites, detects the links on each page, and adds them to its list of pages to crawl. New sites, changes to existing sites, and dead links are noted and used to update the Google index.

Crawling is an automated process; you cannot pay to have your website crawled more frequently.
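The crawl loop described above can be illustrated with a toy script; the pages and URLs here are invented stand-ins for the live web:

```python
# Toy illustration of the crawl loop: start from seed URLs, fetch each page,
# extract its links, and queue any new ones. An in-memory dict stands in for
# the live web so the sketch is self-contained.
from html.parser import HTMLParser

PAGES = {
    "http://a.example/": '<a href="http://b.example/">b</a>',
    "http://b.example/": '<a href="http://a.example/">a</a>',
}

class LinkParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links += [v for k, v in attrs if k == "href"]

def crawl(seeds):
    queue, seen = list(seeds), set()
    while queue:
        url = queue.pop(0)
        if url in seen or url not in PAGES:
            continue                # already visited, or a dead link
        seen.add(url)
        parser = LinkParser()
        parser.feed(PAGES[url])
        queue.extend(parser.links)  # newly discovered pages join the queue
    return seen

print(sorted(crawl(["http://a.example/"])))
# ['http://a.example/', 'http://b.example/']
```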


Website Indexing

Indexing, in simple terms, is maintaining a list of things. Here it means compiling an index of words and their locations on each page, gathered while Googlebot crawls the website. In addition, Googlebot processes information included in key content tags and attributes, such as title tags and ALT attributes. Googlebot can process many content types, but not all.
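The indexing step can be illustrated with a toy inverted index; the page names and text are invented examples:

```python
# Toy inverted index: map each word to the set of pages it appears on,
# mirroring the "index of words and their location" described above.
from collections import defaultdict

docs = {
    "page1.html": "seo tips and seo factors",
    "page2.html": "sitemap tips",
}

index = defaultdict(set)
for page, text in docs.items():
    for word in text.split():
        index[word].add(page)

print(sorted(index["tips"]))
# ['page1.html', 'page2.html']
```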


The Search Results
Serving results is the process by which Google returns results for the query entered by the user, based on matches with the indexed pages. The relevance of the search results depends on many factors, and based on those SEO factors Google ranks the results.

In order for your site to rank well in search results pages, it's important to make sure that Google can crawl and index your site correctly.


Read more articles at http://tusharvickkie.blogspot.com

Content Optimization Factors

Content Optimization Factors: (optimization factors for every page)

• Page Title: use your keywords in the page title.

• Use an h1 header tag: put your topic in the h1 header tag.

• Keywords meta tag

• Description meta tag: use your keywords in the description.

• Add image tags: use alternative (ALT) image text.

• Use h2 header tags for sub-headings.

• Keyword Density: the percentage of times a keyword appears on your page, compared to the total number of words on the page.
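The keyword density figure above can be computed with a short sketch; the example text is made up, and multi-word phrases would need a sliding-window count instead:

```python
# Sketch: keyword density = (keyword occurrences / total words) * 100,
# for a single-word keyword, with basic punctuation stripping.
def keyword_density(text, keyword):
    words = [w.strip(".,:;!?") for w in text.lower().split()]
    if not words:
        return 0.0
    return 100.0 * words.count(keyword.lower()) / len(words)

print(keyword_density("SEO tips: good SEO starts with keywords.", "seo"))
# roughly 28.57 (2 of 7 words)
```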

On page SEO Factors

On page Optimization, SEO Factors

Keyword in URL
Keyword In Title Tag
Always start with keyword selection, research and testing
Title tags
Meta tags
ALT tags
H1 tags
URL structure
Internal Linking
Content
Keyword density
Site maps
Usability
Track target keywords

Don't Make Common On-Page SEO Mistakes

Avoid common on-page mistakes such as:

Duplicate content
URL variants on the same pages
Off-site images and content on-site
Duplicate title tags

Don't do SEO spamming tactics such as:

Hidden text
Hidden links
Keyword repetition
Doorway pages
Mirror pages
Cloaking

Off-Page Search Engine Optimization Factors


1). Community Creation in Social Networking Sites
2). Blogging
3). Forum Postings
4). Search Engine Submission
5). Directory Submission
6). Social Bookmarking
7). Link Exchange
8). Link Baiting
9). Cross-Linking
10). Photo Sharing
11). Video Promotions
12). Business Reviews
13). Local Listings & Yellow Pages
14). Article Submission
15). Press Release Promotion
16). Classifieds Submission
17). Social Shopping Network
18). Answers
19). Document Sharing
20). CSS, W3C & RSS Directories Submission
21). Widget / Gadget Development
22). PPC Ad Campaign

Search Engine Optimization Vs Search Engine Crawlers

There is a lot to learn when it comes to online business and search engine optimization. That is why many people hire SEO consultants to worry about their internet marketing for them.

However, if you do not want to pay SEO experts to help you with the marketing of your website, then you may want to think about doing it yourself! It's not that hard; it just takes some reading on the internet from trusted sources. Really, most of the SEO services provided online are built around one thing, and that is search engine crawlers.

Today we are going to talk more about search engine crawlers and what they do. Soon you will find out that pretty much everything you do, as far as search engine optimization goes, is done to make these little crawlers happy. It is no longer as simple as a quick search engine submission to make your website popular.

As you may guess from the name, search engine crawlers are little bots that roam around the internet looking for new websites and ranking them based on an algorithm. When they come across a new website, they rank it in different ways, taking many things into consideration. These ranks help determine where the website will end up in a search done on a search engine.

The first thing you have to do is get noticed by these crawlers, and that is the hardest part of SEO, or at least that is what SEO consultants want you to think! The truth of the matter is, you can get your website noticed by simply visiting the various popular search engines and adding your URL to their submission forms.

Top 10 SEO Questions | Basic Search Engine Optimization | Learn SEO Techniques

SEO questions.

1. What is SEO?

SEO stands for search engine optimization. A search engine is a tool many internet users use to find sites that are relevant to their needs. The three biggies when it comes to search engines are Google, Yahoo and MSN. There are, however, hundreds of search engines available to internet users. Search engines work by sending out spiders to crawl through the World Wide Web and gather information. If you have the information they're looking for, in the places they are looking, they'll find you and place you in their results when a person is searching for your information.

The task of understanding what search engines are looking for, and putting it in the right places on your website and in your content, is the essence of search engine optimization. So now you might be asking: what do search engines look for, and where do they look for it? The answer is keywords and links. Keywords in your HTML coding, keywords in your webpage content, and the number of inbound links you have to your website.

2. How important is SEO?

Let's just put it this way. What's better: a few visitors who stumble upon your website, or hundreds of visitors who go to your website with the express intention of learning more or making a purchase?

With more and more people searching and shopping online, getting on the first page or two of the search engine results can mean the difference between keeping your day job and becoming an internet millionaire.

3. What are text links?

Links are just one of the tools you can use to improve your search engine optimization. The more quality links you have, the better your search engine ranking will be. Text links are links that contain only text. Wikipedia is a great place to examine internal text links. The links are contained within a body of text, and when a reader clicks on one, they are taken to a different page on the same website. The kind of text links you're looking for are ones that will take readers from your article, ebook, or web copy to your website.

4. What are link farms and link exchanges?

Search engines don't accept just any old link. The link has to be from a relevant, quality site. This means you don't want to participate in link farming. If a search engine suspects your links to be lacking, it will in fact penalize you. Link farming, or link exchanging, is basically the practice of exchanging reciprocal links with websites in order to increase your search engine ranking. A link farm is a web page that is nothing more than a page of links to other sites. Stay away from link farms. If you obtain a link from another site, it had better be relevant and come from a real web site.

5. What is duplicate content?

Duplicate content means web pages that contain substantially the same content. Search engines will penalize you for this. How do you avoid duplicate content? Don't publish the same article in several locations. There are many tools available online to help you re-write your content so that it is 30%, 40%, or even 50% different. However, the best way to avoid duplicate content is simply to write new content.

6. How do I find the right keywords?

There are several steps to finding the most profitable keywords. The first step is to do a bit of brainstorming and come up with a list of keywords you think people will use to find your products. The next step is to research supply and demand for those particular keywords. Supply means how many other websites are using those same keywords, and demand is how many people are searching for those particular keywords.

7. How do I optimize my web pages?

Placing your keywords in the right places is a good start to optimizing your web pages. Search engines look at the headings, subheadings, domain name, and title of your website. They also look at the content on your page and focus primarily on the first paragraph.

Try to get a domain name with your primary keyword included. Including your keyword in your URL tells the search engine spiders immediately what your website is about.

Title tag. Your title tag is the line of text that appears on search engine results pages and acts as a link to your site. This is a crucial element of your webpage, as it describes to your visitors what your page is about.

If you view your source code, your title tag will look something like this: <title>Search Engine Optimization Tips</title>

Keeping your title tags brief, descriptive, up to date, and keyword rich will help to increase the relevance of your website in the eyes of the search engines, as well as giving your potential visitors a good idea of what they can expect from your site.

Meta tags have lost their importance to the search engines; however, it is still worthwhile to place your keywords in your meta tags. In your source code they look something like this (the keywords are illustrative): <meta name="keywords" content="seo, basic seo tips">
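
Putting these elements together, a page head combining the title tag and the meta tags might look something like the sketch below. The keyword phrase and the description copy are purely illustrative:

```html
<head>
  <!-- Keyword phrase near the start of a short, descriptive title -->
  <title>Basic SEO Tips | On Page SEO Factors</title>
  <!-- Used in many engines' listings: keyword near the beginning, brief -->
  <meta name="description"
        content="Basic SEO tips covering the on page factors that help a page rank.">
  <!-- Meta keywords carry little weight today, but are still harmless to include -->
  <meta name="keywords" content="basic seo tips, on page seo factors">
</head>
```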

8. Do I need to submit my website to the search engines?

The simple answer is - no. Search engine spiders are constantly out there doing their job and gathering information. Every time you update your website, add content, or change your keywords, the search engines capture the information and record it. However, if you want to be listed in a directory, like the DMOZ Open Directory Project, then you will need to submit to those.

9. What are spiders?

Search engine spiders are also called web crawlers or bots. They're basically automated programs which scan websites to provide information to search engines, generally for the purpose of indexing or ranking them.

10. How does content help my SEO?

Content is one of the best tools to improve your search engine ranking. It is a great place to emphasize keywords, encourage linking to your site, and increase traffic. The key to content is to make sure you're offering quality content and that you're updating your website and your content frequently.

Basic SEO Tips | Make Your URLs SEO Friendly | SEO Friendly Sites

Many websites face the risk of not being fully indexed by search spiders, and therefore of not being ranked high enough (if at all) in the search engine results pages (SERPs). The resulting poor conversion rate makes the website a dead weight, demoralizes your staff, and could harm your business.

URL Rewriting

This situation is quite easy to avoid by performing some cosmetic operations on the site. One of these operations, URL rewriting, is considered by some to be rather difficult and a bit time-consuming, but it can be extremely effective and beneficial in the long run.

Why It Is Nice to Have Clean URLs

There are two very strong reasons for you to rewrite your URLs, the first of which is related to Search Engine Optimization. Search engines are much more at ease with URLs that don't contain long query strings.

A URL like http://www.example.com/4/basic.html can be indexed very easily, whereas its dynamic form, http://www.example.com/cgi-bin/gen.pl?id=4&view=basic, can potentially confuse search engines, causing them to miss important information contained in the URL and causing you to miss those coveted top rankings.

With clean URLs, the search engines can identify folder names and can establish direct links to keywords. Query string parameters continue to be an impediment in many search engines' attempts to fully index sites. Several SEO professionals agree that dynamic (or "dirty") URLs are not very appealing to web spiders, while static URLs have greater visibility in their electronic eyes.

The second strong reason for URL rewriting is the increased usability for web users and "maintainability" for webmasters. Clean URLs are much easier to remember. A regular web surfer will not remember a URL full of parameters, and would certainly be put off by the idea of typing out the entire URL. This is far less likely to happen with clean URLs. Easily remembered URLs help you create a more intuitive website and make it easier for your visitors to predict where they can find the information they need.

Webmasters tend to find that maintaining static URLs is a much easier task than working with dynamic ones. Static URLs are more abstract, and thus more difficult to hack. Dynamic URLs are more transparent, allowing potential hackers to see the technology used to build them and thus facilitating attacks.

Also, given the length of dynamic URLs, it is very easy for webmasters to make mistakes during maintenance sessions, resulting in broken links. Moreover, if static URLs are used, the links to the site's pages will remain valid should it be necessary to migrate the site from one programming language to another (e.g. from Perl to Java).

Dashes vs. Underscores

Websites that still use underscores in their URLs are becoming scarcer and scarcer. Some say that people who still use underscores are "old school", while dashes seem to be used far more often these days.

A usability-related reason for using dashes rather than underscores is the elimination of the confusion between a space and an underscore when the URL is viewed as a link, or when printing such a URL.

More to the point, the chances that a combination of keywords contained in your website will appear in the SERPs increase dramatically when you use dashes.

For example: a URL that contains "seo_techniques" will be shown by the search engine only if the user searches for "seo_techniques" (a kind of search that is rarely performed), whereas searches for "seo", "techniques", or "seo techniques" give a URL containing "seo-techniques" a better chance of being displayed in the SERPs. The dash will help you more than you can imagine, by greatly improving your visibility on the web.
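
The underlying reason is how words are tokenized. The sketch below uses a naive regex tokenizer to illustrate the point; it is not how any particular search engine actually parses URLs, but the `\w` word-character class shows the same behavior: an underscore glues words into one token, while a dash splits them.

```python
import re

# Naive word tokenizer: \w+ treats the underscore as a word character,
# but treats the dash as a separator between words.
def tokenize(url_fragment):
    return re.findall(r"\w+", url_fragment)

print(tokenize("seo-techniques"))  # dash yields two separately searchable terms
print(tokenize("seo_techniques"))  # underscore keeps one fused token
```

A page at /seo-techniques can therefore match searches for "seo", "techniques", or "seo techniques", while /seo_techniques matches only the fused phrase.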

How to Rewrite URLs

The principle of URL rewriting is actually setting up a "system" on the host server that allows it (the server) to know how to interpret the new URL format. What actually happens when one decides to rewrite the URLs of a website is called masking the dynamic URLs with static ones. This means that URLs that previously contained query strings with elements such as "?", "+", "&", "$", "=", or "%" will instead contain the more search-engine-friendly "/" (slash) element, presenting themselves in a simplified form.

To help you clean up your URLs, here are some rewriting tools and engines, some free of charge, others fee based.

Online / Open Source Tools

* http://www.seochat.com/seo-tools/url-rewriting/

* Open Source URL Rewriter for .NET / IIS / ASP.NET: http://urlrewriter.net/

* Open Source rewrite module for ASP.NET 2.0: http://www.urlrewriting.net/en/FAQ.aspx

* mod_rewrite: http://httpd.apache.org/docs/2.0/mod/mod_rewrite.html

This is the most popular non-fee-based rewriting engine. It is a module for the Apache HTTP Server that allows easy manipulation of URLs. Using this module requires enabling the RewriteEngine on your Apache server. Then, rewrite rules have to be defined (you can even set conditions for each rule), allowing the server to rewrite requests as they come in.

In terms of SEO, mod_rewrite can be helpful if you have complex URLs that contain more than two parameters. In other words, when one of your clean URLs is accessed, the mechanism behind mod_rewrite maps it back to the underlying dynamic URL, so visitors and spiders only ever see the shorter, friendlier, static-looking form.
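
As a sketch, a minimal mod_rewrite configuration in an .htaccess file might look like the following. The URL pattern is taken from the gen.pl example used in this article; a real rule would match your own site's parameters:

```apache
# Enable the rewrite engine (mod_rewrite must be loaded in Apache)
RewriteEngine On

# When a visitor requests the clean URL /4/basic.html, internally serve
# the underlying dynamic script /cgi-bin/gen.pl?id=4&view=basic instead.
RewriteRule ^([0-9]+)/([a-z]+)\.html$ /cgi-bin/gen.pl?id=$1&view=$2 [L]
```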

Fee-Based Tools

* ISAPI_Rewrite

The Internet Server Application Program Interface (ISAPI) is another URL manipulation engine that functions in a similar way to Apache's mod_rewrite, the difference being that it is designed specifically for Microsoft's IIS (Internet Information Server).

* IISRewrite

IISRewrite is a stripped-down implementation of Apache's mod_rewrite module for IIS. It is a rule-based rewriting engine that allows a webmaster to manipulate URLs on the fly in IIS.

URL Examples

Here are some examples of how URLs can look before and after rewriting:

Example 1:

* Dynamic URL: http://www.companyname.com/products/...el=y&variety=z (before rewriting)

* Static URL: http://www.companyname.com/x/y/z.html (after rewriting)

Example 2:

* Dynamic URL: http://www.example.com/cgi-bin/gen.pl?id=4&view=basic (before rewriting)

* Static URL: http://www.example.com/4/basic.html (after rewriting)

Conclusions

URL rewriting can put you on the right track in the quest for top organic rankings when combined with other SEO techniques. Be aware that rewritten (and, presumably, better looking and more effective in terms of search engine ranking) URLs cannot substitute or make up for a poorly designed website.

Don't expect miracles. Nevertheless, if you decide that your website needs a makeover and start rewriting your URLs, make sure that:

* You keep them as short as possible (to increase usability),

* You use dashes rather than underscores (to give your website a better chance of ranking as high as possible in the SERPs),

* You use lowercase letters rather than uppercase ones (to avoid those case-sensitive situations),

* The technology you have used cannot be detected in any of your URLs (to prevent possible hacker attacks).

SEO Tactics | SEO Tricks and Tips | Over 2000 Visitors Daily

Here are 10 SEO tactics that have worked, and are still working, for me at this moment in time.

1. Quality Content is, and always will be, your number one factor for getting top rankings and keeping them. You have to understand that search engines are simply businesses that supply a product like any other company. That product is information. They have to offer quality results to anyone using their service to solve a problem, answer a question, or buy a product. The more relevant and targeted the search solution they return, the higher the overall quality of their product and the more popular their search engine will become. Providing quality content is vital for SEO success.

2. Keywords are your number one tool for achieving top rankings. You have to understand keywords and how they work on the web. You have to know how many searches are made each day for your chosen keywords. Sites like Wordtracker and Seobook will give you a rough number of searches. Build your pages around your targeted keywords, and don't forget to do some deep linking to these pages on your site. Find and build backlinks to these internal keyword pages, and not just to your home page or domain URL. Picking keywords with medium to low competition has worked out well for me. Using the more targeted and higher-converting "long-tail" keywords has also been very beneficial.

3. On-page Factors and site architecture play a major role in the spidering and indexing of your site/content. Make sure all your pages are SEO friendly and can be reached from your homepage, with no page more than three levels away from it - keeping a sitemap listing all your major pages makes the search engines happy. Make sure all your meta tags such as title, description, keywords... are optimized (Title = about 65 characters, Description = about 160 characters).

Remember, your title and description should be keyword targeted and are the first contact/impression someone will have of your website - make sure you use them to draw and entice interested visitors to your site and content. Also make sure your title and URL are keyword matched for best effect. Having your major keyword in your domain name also helps; using a pipe | to separate different elements of your title has helped my rankings, and so has having your keyword in the first and last 25 words on your pages.

4. Google will send you the most qualified traffic, so concentrate the majority of your SEO efforts on Google. Don't ignore Yahoo! or MSN, but Google is the king of search, so give it the attention it deserves. With its new browser, Google's influence will only grow stronger, so you have to optimize your pages for Google. Use Google's Webmaster Tools and Google Analytics to fine-tune your pages/content. I also use Google Alerts to keep up on my niche keywords and for comment link-building on the newly created pages Google is indexing.

5. Link Building is still the most effective way to boost your search rankings. Make sure you get backlinks from relevant sites related to your niche market, and make sure the "anchor text" is related to your keywords, but don't ignore the text and overall quality of the content linking to you. The anchor text is the underlined/clickable portion of a link. Don't forget that linking is a two-way street: make sure you link out to high-quality, highly ranked relevant sites in your niche.

6. Article Marketing is a well-established method of getting quality backlinks, and it still works. Writing short 500 - 700 word informative, helpful articles with your backlinks in the resource box is still very effective for getting targeted traffic and backlinks. Longer articles have also worked for me, and I use an extensive distribution network, including SubmitYourArticle, Isnare, Thephantomwriters... plus other major online sites. Don't forget the whole aspect of blogging and RSS feeds in your article distribution. And always remember you're also using these articles to pre-sell your content or products. Don't forget to leverage sites like Squidoo, Hubpages... to increase your rankings and traffic.

7. Onsite Traffic Hubs have worked extremely well for me. These traffic hubs are entire sections of your site devoted to one sub-division of your major theme. For example, if you have a site on Gifts, then wedding gifts could be a separate section. This would be fully fleshed out with extensive pages covering everything dealing with wedding gifts - a self-contained, keyword-rich portion of your site on wedding gifts. It works similarly to a sub-domain, but I prefer using a directory to divide it up, such as yourdomain/wedding_gifts. (Most experts suggest always using a hyphen in your URLs, but underscores have worked fine for me.) Search engines love these keyword- and content-rich hubs, but keep in mind you're creating content first to please your visitors.

8. WordPress blog software is extremely effective for SEO purposes. WordPress is easy to install on your site, even if you have no experience installing server-side scripts. Besides, search engines love these highly SEO friendly blogs with their well-structured content and keyword tagging. I have at least one of these on all my sites to draw in the search engines and get my content indexed and ranked. I also use Blogger (owned by Google), Bloglines, and other free blogs to help distribute my content.

9. Social Bookmark/Media Sites are becoming very important on the web. These include a whole range of social sites like MySpace, FaceBook, Twitter... and media news sites like Digg, SlashDot, Technorati... you have to get your content into this whole mix if you want to take full SEO advantage of Web 2.0 sites. You should be joining these sites and using them. It's time consuming, but it will keep you in the thick of things. One simple thing you must do is put social bookmark buttons on all your pages so that your visitors can easily bookmark your content for you. You can use a WordPress plug-in. I like using the simple free site/service from Addthis.com, which gives me a simple button to put on all my content.

10. Masterplan! Many webmasters and site owners forget to develop or follow an overall masterplan/strategy when it comes to SEO. You have to have an understanding of what SEO is and what it can do for you and your site. More importantly, you don't just want SEO - you want effective SEO. In order to achieve effective SEO you have to have three things: Relevance, Authority, and Conversions.