Friday, November 7, 2008

A Good Site Map Contains Some Important Points

  • The site map should be the simplest page on your web site.
  • Do not give a fancy name to the site map link such as "Web Site Tree" - keep it as "Site map"; this way your visitors understand immediately what you mean.
  • You should always avoid "dynamic" site maps - those in which visitors have to "work" their way to the information. Remember, the reason visitors come to a site map page is that they are lost. Making them work again for something you can display as a simple static link defeats the purpose of having a site map.
  • If the site map is a list of text links, be sure to use the TITLE attribute of the anchor tag and include keywords inside it (see the example after this list).
  • It is a good idea to put a sentence describing the page contents below the link for that page on a site map.
  • A site map should not be the primary navigation on your web site; it should complement it.
  • A link to the site map page is very important and all pages should carry this link. The site map link can be included with other links in the main menu on your web site or placed in a section of the web page from which it is clearly visible.
  • Other important aspects of a web site should complement site maps. For example, visited links should be a different color from non-visited links so that visitors understand which pages they have already seen and thus save time.
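
As a minimal sketch of the TITLE attribute tip above (the page names and keywords here are hypothetical):

<a href="/seo-services.html" title="SEO services and website promotion">SEO Services</a>
<a href="/keyword-analysis.html" title="Keyword analysis and research">Keyword Analysis</a>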

Thursday, November 6, 2008

Site Map in SEO

Site maps are an easy way for webmasters to inform search engines like Google, Yahoo, MSN etc. about the web pages on their websites that are available for crawling.

A Sitemap is an XML file that lists URLs for a site along with additional metadata about each URL so that search engines can more intelligently crawl the site. Search engine crawlers usually discover web pages from links within the site and from other sites. The site map lays out all the linked pages of the website in one place, which helps the search engine crawl them; with a site map, a crawler can easily reach every page of the website.
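
For reference, a minimal Sitemap file following the sitemaps.org protocol might look like this (the URL and the metadata values below are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2008-11-06</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>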

Sitemaps supplement this data to allow crawlers that support Sitemaps to pick up all URLs in the Sitemap and learn about those URLs using the associated metadata. Using the Sitemap protocol does not guarantee that web pages are included in search engines, but provides hints for web crawlers to do a better job of crawling your site.

Site maps and site indexes run into the same issues: building and maintaining a site map or site index takes ongoing effort, just like on-site search. Site maps help users navigate through a web site that has more than one page by showing the user a diagram of the entire site's contents. Similar to a book's table of contents, the site map makes it easier for a user to find information on a site without having to navigate through the site's many pages.

I still recommend site maps because they're the only feature that gives users a true overview of everything on a site. One could argue that a site's navigation serves the same purpose. For example, some navigation offers drop-down menus that let users see the options available in each site section. But even with these menus, users can see only one section of content at a time.

For creating a site map online, use tools such as XML Sitemap and SitemapPro.

Tuesday, November 4, 2008

RSS Feeds: Showing Different Types of Information

1.Blogs Feed: Many blogs are catalogued in an RSS feed, with each blog entry summarized as a feed item. This makes it easy for visitors to scan blog posts for items of interest.


2.Article Feed: Articles are often placed into feeds to alert readers when new articles and content are available. The feed entry is typically an article summary or introduction. Readers can then ascertain if the article is of interest and read further.

3.Forum Feed: Many forums now have add-ons that allow participants to receive forum posts via RSS. The RSS feeds often will show the latest discussion topics; if users are interested they simply click to enter the forum to participate in the discussion. As the topic is updated they will see new entries in the RSS feed.

4.Schedule Feed: Schools, clubs and organizations will often use feeds to communicate meeting times, places and events that might be occurring. The RSS feeds are often used to publicize events, notify the community of schedule changes or meeting agendas.

5.Discounts / Specials Feed: Retail and online stores have begun using RSS feeds to deliver their latest specials and discounted offers. Some online retailers have taken this a step further, allowing users to create their own feeds based on keywords or phrases.

6.Industry-Specific Feed: Technical professionals in specific industries have also developed RSS feeds as a way to market, promote or communicate within their specific industries. In many cases, this has expanded their reach and increased communication with current and prospective customers and clients.

RSS feeds can be used by realtors to communicate the time and location for open houses, announce new property listings or promote decreased mortgage rates. Content feeds can also be used by universities to communicate sports scores or event schedules. Computer service professionals can create feeds to notify clients of potential security breaches, virus risks or outbreaks. Ultimately, RSS is molded to meet the communication needs of many sectors. Consider how RSS can benefit your business and supplement your communication needs. 

Article source: http://www.rss-specifications.com/creating-rss-feeds.htm


RSS Feeds: Warnings and Reminders

1. If you create the file using Dreamweaver or a similar tool, be careful that it does not strip out tags it feels are redundant. To be an RSS feed, your file needs at bare minimum the tags discussed above, and the file will not be valid if tags are stripped out.

2. Check your work! Once your file is complete and uploaded, enter it into a feed validator.

Syndication / Submission:

If you've made it this far you are in good shape - it is time to "syndicate" your content! Submit your RSS feed (the XML file you created) to sites just like you would submit a web page. Some of the more popular sites that accept RSS files can be found under "Post RSS Feed".

RSS Feeds: How to Create One

Every day more and more websites, news services and blogs are adding RSS content. RSS is a method of syndicating content. The concept of aggregating content in one central location or repository is very appealing. Consumers have become tired of push technology; RSS allows users the flexibility to regain control of their content. RSS feed creators provide content without forcing it on consumers. In fact, with RSS consumers are able to choose the content they wish to view.

RSS feeds contain what are referred to as "items". The items are usually connected in some way and contain a common theme or other similarity.

Each item contains:

* title
* description
* link

The title and description should be written to describe the content, and the link should reference the webpage that contains the actual content.

Like HTML, the XML file uses open and close tags to designate the title, description and link. Tags are enclosed in angle brackets <>, like standard HTML, and the close tag contains a forward slash /.
FeedForAll - will easily create feeds for you!

The following is what an item in an XML file looks like:

<title>The Title Goes Here</title>
<description>The description goes here</description>
<link>http://www.linkgoeshere.com</link>

As I mentioned earlier, an RSS feed contains items, and like the tags above, an open and close tag is used to distinguish between items.

<item>
<title>The Title Goes Here</title>
<description>The description goes here</description>
<link>http://www.linkgoeshere.com</link>
</item>

<item>
<title>Another Title Goes Here</title>
<description>Another description goes here</description>
<link>http://www.anotherlinkgoeshere.com</link>
</item>

Now, an RSS feed is a series of items; these items are chained together to create what is called a "channel".

The channel appears at the top of the file and tells people how the items relate to each other. Like items, channels use title, description and link tags to describe their content. The open <channel> tag occurs before the first item and the close tag </channel> occurs after the last item.

<channel>
<title>The Channel Title Goes Here</title>
<description>The explanation of how the items are related goes here</description>
<link>http://www.directoryoflinksgohere</link>

<item>
<title>The Title Goes Here</title>
<description>The description goes here</description>
<link>http://www.linkgoeshere.com</link>
</item>

<item>
<title>Another Title Goes Here</title>
<description>Another description goes here</description>
<link>http://www.anotherlinkgoeshere.com</link>
</item>

</channel>

Finally, you will need to designate the file as an XML file and an RSS feed by inserting the xml declaration and rss tags at the very beginning and end:

<?xml version="1.0"?>
<rss version="2.0">
<channel>

<title>The Channel Title Goes Here</title>
<description>The explanation of how the items are related goes here</description>
<link>http://www.directoryoflinksgohere</link>

<item>
<title>The Title Goes Here</title>
<description>The description goes here</description>
<link>http://www.linkgoeshere.com</link>
</item>

<item>
<title>Another Title Goes Here</title>
<description>Another description goes here</description>
<link>http://www.anotherlinkgoeshere.com</link>
</item>

</channel>
</rss>

When you save the file, be sure to save it with an .xml extension.
Article source: http://www.rss-specifications.com/creating-rss-feeds.htm

What is RSS? Why and How to Use It

RSS is a technology, a family of web feed formats, used to publish frequently updated works - like blog posts, news headlines, audio, and video - in a standardized format. RSS stands for 'Really Simple Syndication' (RSS 2.0) and is also called a 'news feed'. A 'feed' is simply an RSS document. The format of RSS is XML and the file extension is .rss or .xml.


RSS is a defined standard based on XML with the specific purpose of delivering updates to web-based content. Using this standard, webmasters provide headlines and fresh content in a succinct manner. Meanwhile, consumers use RSS readers and news aggregators to collect and monitor their favorite feeds in one centralized program or location. Content viewed in the RSS reader or news aggregator is known as an RSS feed. RSS can be read with feed reader or RSS reader software. The RSS feed logo is an orange square containing a small ball below two arcs.

Thursday, October 23, 2008

Search Engine Submission Services For Website Promotion

While talking about search engine optimization, search engine submission services are also a key factor. Through search engine submission a person comes to know how to promote their site, and how a website gets recognized and gains potential in the market. So when looking for a search engine submission service, it is better to obtain a general package for website promotion.

One cannot ignore the importance of search engine rankings in the present scenario. Search engines are used mostly by online shoppers to find products and services. People who shop offline also do research on the internet to learn about the market. So nowadays search engine rankings may be considered one of the most important attributes of a successful internet marketing campaign.

An online internet business cannot draw sufficient traffic without the right search engine strategies. In the rapidly changing world of internet marketing and advertising it has become difficult for a person to keep pace with the recent industry practices for professional submissions. Depending on how your website is submitted to the search engines, it is not certain whether your website gets included in the search databases or not.

Most people try to submit it by themselves, which in reality can cost them thousands of dollars. So it is worth investing in the services of internet marketing experts. Such an expert is capable of performing professional search engine registration into the databases of the top, higher-ranked, high-traffic web directories and search engines. You will soon recover the money you spend on hiring such services through increased traffic and sales in your online website internet business.

Research professional service provider companies before deciding which company to hire. Such professionals should be well versed in the techniques of professionally and manually linking your website into the primary databases of the major search engines and directories. They submit your website information and keywords to the best search engines. Since all major search engines draw information and results from the top directories and other search engines, your online internet business will be reflected in the results from other search engines also.

So quality submission and linking services are essential to achieve online success in internet marketing. You will get not only increased traffic but also targeted traffic to your website. The quality of traffic will also increase, which in turn will produce increased sales for you.

Thus it can save you precious time struggling with the search engines and directories. You can invest your time in other business activities such as providing customer support, developing new products and services, bringing in new ideas and enhancing your business operations. You can realize your online business goals and objectives by concentrating on these aspects of your business and leaving the work of search engine submissions to professionals and experts.

SEO Services: Necessary for Website Promotion

Website promotion includes these activities:

Directory submissions,
Link building,
Onpage Optimization,
Offpage Optimization,
Content writing and keyword selection.

All of these aim at achieving high search engine rankings, with visitor stats checked to measure progress.

Your website is a vital part of your business plan. It provides the first online view of your business to visitors. It is essential to update it regularly so that visitors keep revisiting your website and doing business with you. This has created the need for websites to be designed according to basic SEO guidelines. A growing number of companies nowadays design and develop websites according to the SEO guidelines described by Google and other major search engines. Without SEO services you are surely missing important web promotion opportunities in the present scenario of online marketing.

Nowadays SEO is catching the attention of everyone on the web, so more people are keen to learn about it. If your business has an online presence but does not rank highly in major search engines, it is lost in the crowd. As the majority of traffic comes from search engines, SEO services are required to make your website search engine friendly and place it at the top of the rankings. As the online industry catches everyone's attention, it has become essential for businesses to create their online identities. There are millions of internet users, which indicates that a huge amount of searching is being done on the internet. You need an expert to find the right kind of information for you.

SEO provides the services of on-page optimization, one-way link building, directory submissions, article submissions, content writing, press releases etc. These are termed the organic aspects of SEO, and from them you can expect search engine rankings. The main aims are higher site traffic, more visibility over competitors, an increase in sales and conversions, brand recognition, better performance feedback and lower promotion costs.

Another aspect of SEO services is that they are cost effective compared to other marketing services in the long run. They prove to be very effective in getting you results and meeting the targets you set. SEO gets you fast results compared with traditional types of advertisement. In terms of time and cost, SEO rankings also have more advantages than emails, banners and pay-per-click ads. SEO services also include different types of services such as keyword density help for your web pages, search engine submission, and search engine optimization targeting certain areas of the world. These consist of writing keyword-rich articles for you and providing you with a list of new keywords that can be added to your website pages to make them more search engine friendly.

Nowadays SEO also involves aspects of site design and content, which require dedicated research, good communication skills and other technical abilities. Just specifying a list of keywords for a website's Meta tags is not sufficient in the present scenario. As many factors affect relevance to search engines, it is necessary to carry out the task according to search engine behaviors. Even file naming conventions and site navigation are framed keeping in view the search engine rules.

SEO Strategy: Point by Point

Elements of SEO Strategy

# Book a relevant domain name for your website

# Define a suitable navigation structure

# Keyword analysis to search the best keywords that matches with the theme of your website

# Optimization of your website with the use of right keywords

# Develop a content strategy - it is rightly said that content is king. You can develop content by writing articles

# Submission of web site in search engines

# Directory submissions

# Linking Strategy - Reciprocal and one way link building campaigns

# Develop a social media optimisation strategy - news releases, forum postings, and article writing and submission

# Use natural and white hat SEO strategies to promote your website

SEO Strategy : Achieve Web Promotion Goals

You should have knowledge of the right SEO strategy before starting an online business - one which can assist you in achieving your online business goals and objectives. Follow the SEO tips carefully.


When you have an online business, the basic idea of an SEO strategy is to have a web presence so that all users of the World Wide Web have access to the information you are presenting. The other theme is to provide them with the products and services you offer online. Thus this is a way for you to get noticed on the web. Here are some useful tips for framing a successful SEO strategy.

First of all, determine the proper keywords for your website. Selection of the wrong keywords will lead you nowhere. You can target the right online customers and visitors on the basis of the right keywords for your products and services. Define those keywords in your website which are used most by online visitors. Carry out a complete analysis of keywords by looking at the competition and the trends in how visitors search for information on the web.

Pick the right domain name. Selection of the domain should be based on the theme of your website. It is a better option if your domain name matches a keyword, and alternate keywords can be derived from it. A keyword-rich domain becomes familiar very quickly on the World Wide Web. Due to its easy recognition it is searched for commonly by people, and you need to put in less promotion effort to get high search engine rankings.

It is said that meta tags have lost relevance, but you still cannot ignore them. Define short titles in your website. The title should be informative and should not contain all of your keywords; you can include the major keyword in specific Meta tags. Include keywords in the description section of the Meta tags.

Content is the major key of any website. Place key phrases in the first several lines of text.

Define your content on the basis of these keywords, domain and Meta tags. Always write fresh content for your webpage and keep on changing it from time to time. Search engine loves the fresh content.

Keep the coding of your site neat and clean and according to the w3c standards. Search engine spiders do not like the junk code. Design the website using latest CSS techniques.

Create sitemaps in your website and proper navigation structures.

Submit your website in directories and search engines.

Write articles and press releases for your website and submit it at regular intervals.

Use the linking strategies, reciprocal and one way linking campaigns for your website

Develop social media optimization by way of news releases, forum postings and blogs etc. Spread your message by way of viral marketing.

Use white hat SEO techniques to promote your website.

Thus you should frame a solid SEO strategy to make your presence felt on the World Wide Web. Take your time to pick the right domain, create a site, assemble its pages with the right keywords and a proper navigation structure, and place solid content in it.

Wednesday, October 22, 2008

Important Activities an SEO Specialist Performs

An SEO specialist performs SEO activities and looks after your website.

Firstly he analyzes your website from a search engine point of view. In the coding of the website he is concerned mainly with the Meta tags, title, description and alt image tags that appear on its pages. Then he pays attention to the rest of the site, comparing text and tags; the text and tags need to match your website's themes, so you need to focus on them. Then he revises the code, checks all the keywords, and replaces keywords where there are better substitutes, replacing those keywords in the text and placing alt keywords on images, before uploading the final layout and submitting the site to search engines and directories.

A search engine optimization specialist can assist you in the right way if you are able to explain your exact requirements to him. You can sit with him and explain what you want your site to look like, what content you require on the website and which keywords best describe your products or services. Before hiring his services, make sure that he is a capable person who can handle your search engine tasks efficiently.

SEO specialist performs following tasks-

- Use CSS and search engine friendly code for design.
- Define keywords for your website that meet your business requirements.
- Place only valuable content in your website that describes your product.
- Submit the site to search engines and directories.
- Bid on keywords if your company is into pay per click management.
- Analyze the client's web promotion requirements for a website.
- Prepare website reports such as visitor status, links accrued and page rank for the clients.

An SEO specialist also ensures that good and informative content is placed on the website. The content must provide a proper description of products and services and have the ability to attract visitors to the website and turn those visitors into prospective buyers. You can get the right online business only when you are able to describe your products and services properly to clients. An SEO specialist can help you develop your content on the basis of the ideas and products of your online business.

A good search engine optimization specialist has detailed knowledge of the different methods of search engine optimization. Bringing traffic to the website is his responsibility. He must understand the requirements of the website owner, whether web promotion or advertisement; he is responsible for handling all such spheres of the web.

SEO Activity

M1

* Perform a thorough study of each of the CLIENT websites that are being optimized in terms of page content, key words being used, competition of key words etc.
* Perform research on including additional related keywords other than the key words mentioned by each client with the overall objective of optimizing each website for all important key words that best describe the client business.
* Prepare a keyword ranking report before start of optimization work on three popular search engines - Google, Yahoo and MSN and provide the report to the client.
* Perform On page Optimization activity for the website homepage only that includes tailored TITLE, ALT Image Tags, META Tags, and Header Tags etc utilizing the prioritized keywords.
* Review content of the homepage to include the required number of key words that describe the client business.
* Partial submission of key words to popular directories relevant to client business

M2

* XML Site Map Creation for quick and easy recognition of content and key words by search engines.
* Robots Text File (robots.txt) which helps spiders to know what to index and what not to.
* Reciprocal linking to build your link popularity campaign. This will be a recurring activity done across the entire SEO activity on the website.
* Link Building to achieve more back links. This will again be a recurring activity.
* Internal Linking. This will be a one time activity.

M3

* Directory Submission to Related Directories
* Directory Submission (Regular Activity)
* Reciprocal linking to build your link popularity campaign (Regular Activity)
* Keyword ranking report at the end of the third month SEO activity.

M4

* SEO Tweaking to see if the performed SEO activity is achieving the desired results, and making the necessary changes to improve the ranking of the key words.
* Submit to Press Releases Sites
* Directory Submission (Regular Activity)
* Reciprocal linking to build your link popularity campaign (Regular Activity)

M5

* Directory Submission (Regular Activity)
* Reciprocal linking to build your link popularity campaign (Regular Activity)

M6

* Directory Submission (Regular Activity)
* Reciprocal linking to build your link popularity campaign (Regular Activity)
* Keyword ranking report (for the 5 selected keywords) at the end of 6 months SEO activity.
* Submission and Link reports

Sunday, August 31, 2008

Importance of the Robots.txt File: Its Working and Advantages with Respect to Search Engines

What is the robots.txt file?

The robots.txt file is an ASCII text file that has specific instructions for search engine robots about specific content that they are not allowed to index. These instructions are the deciding factor of how a search engine indexes your website's pages. The universal address of the robots.txt file is: www.example.com/robots.txt. This is the first file that a robot visits. It picks up instructions for indexing the site content and follows them. This file contains two text fields. Let's study this robots.txt example:

User-agent: *
Disallow:

The User-agent field is for specifying the robot name to which the access policy in the Disallow field applies. The Disallow field specifies URLs to which the specified robots have no access. An example:

User-agent: *
Disallow: /

Here "*" means all robots and "/" means all URLs. This is read as, "No access for any search engine to any URL". Since all URLs are preceded by "/", specifying "/" bans access to every URL. If partial access has to be given, only the banned URL is specified in the Disallow field. Let's consider this example:

# Research access for Googlebot.
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /concepts/new/

Here we see that both fields have been repeated; multiple commands can be given for different user agents on different lines. The above commands mean that all user agents are banned from /concepts/new/ except Googlebot, which has full access. Characters following # are ignored up to the line termination, as they are considered comments.

Working with the robots.txt file

1. The robots.txt file is always named in all lowercase (e.g. Robots.txt or robots.Txt is incorrect)

2. Wildcards are not supported in either field. Only * can be used in the User-agent field's command syntax because it is a special character denoting "all". Googlebot is the only robot that now supports some wildcard file extensions.
Ref: http://www.google.com/webmasters/faq.html#12

3. The robots.txt file is an exclusion file meant for search engine robot reference and not obligatory for a website to function. An empty or absent file simply means that all robots are welcome to index any part of the website.

4. Only one file can be maintained per domain.

5. Website owners who do not have administrative rights sometimes cannot make a robots.txt file. In such situations, the Robots Meta Tag (http://www.seorank.com/meta-tags-article.htm) can be configured, which serves the same purpose. Here we must keep in mind that lately, questions have been raised about robot behavior regarding the Robots Meta Tag. Some robots might skip it altogether. Protocol makes it obligatory for all robots to start with the robots.txt, thereby making it the default starting point for all robots.
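
For reference, a minimal sketch of such a Robots Meta Tag placed in a page's head section (noindex,nofollow is just one common combination of the recognized values):

<head>
<meta name="robots" content="noindex,nofollow">
</head>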

6. Separate lines are required for specifying access for different user agents, and the Disallow field should not carry more than one command per line in the robots.txt file. There is no limit to the number of lines, though; both the User-agent and Disallow fields can be repeated with different commands any number of times. Blank lines will also not work within a single record set of the two commands.

7. Use lower-case for all robots.txt file content. Please also note that filenames on Unix systems are case sensitive. Be careful about case sensitivity when defining directory or files for Unix hosted domains.

You can use the robots.txt validator from www.searchengineworld.com to check your robots.txt.
Please note that the full path to the robots.txt file must be entered in the field.

Advantages of the robots.txt file

Protocol demands that all search engine robots start with the robots.txt file. This is the default entry point for robots if the file is present. Specific instructions can be placed on this file to help index your site on the web. Major search engines will never violate the Standard for Robots Exclusion.

1. The robots.txt file can be used to keep out unwanted robots like email retrievers, image strippers etc.

2. The robots.txt file can be used to specify the directories on your server that you don’t want robots to access and/or index e.g. temporary, cgi, and private/back-end directories.

3. An absent robots.txt file could generate a 404 error and redirect the robot to your default 404 error page. It has been noticed, after careful research, that sites that do not have a robots.txt file present and have a customized 404 error page would serve that page to the robots. The robot is bound to treat it as the robots.txt file, which can confuse its indexing.

4. The robots.txt file is used to direct select robots to relevant pages to be indexed. This especially comes in handy where the site has multilingual content or where the robot is searching for only specific content.

5. The need for the robots.txt file was also felt to stop robots from deluging servers with rapid-fire requests or re-indexing the same files repeatedly. If you have duplicate content on your site for any reason, the same can be controlled from getting indexed. This will help you avoid any duplicate content penalties.

Disadvantages of the robots.txt file

Careless handling of directory and file names can lead hackers to snoop around your site by studying the robots.txt file, as you sometimes may also list file names and directories that have classified content. This is not a serious issue, as deploying some effective security checks on the content in question can take care of it. For example, if you have your traffic log on your site at a URL such as www.example.com/stats/index.htm which you do not want robots to index, then you would have to add a command to your robots.txt file. As an example:

User-agent: *
Disallow: /stats/

However, it is easy for a snooper to guess what you are trying to hide and simply typing the URL www.example.com/stats in his browser would enable access to the same. This calls for one of the following remedies -

1. Change file names:

Change the stats filename from index.htm to something different, such as stats-new.htm so that your stats URL now becomes www.example.com/stats/stats-new.htm

Place a simple text file containing the text, "Sorry, you are not authorized to view this page", and save it as index.htm in your /stats/ directory.
This way the snooper cannot guess your actual filename and get to your banned content.

2. Use login passwords:

Password-protect the sensitive content listed in your robots.txt file.

Optimization of the robots.txt file

The right commands : Use correct commands. The most common errors include putting the command meant for the "User-agent" field in the "Disallow" field and vice versa.
Please also note that there is no "Allow" command in the standard robots.txt protocol. Content not blocked in the "Disallow" field is considered allowed. Currently, only two fields are recognized: the "User-agent" field and the "Disallow" field. Experts are considering the addition of more robot-recognizable commands to make the robots.txt file more webmaster and robot friendly.

Note - Google is the only search engine, which is experimenting with certain new robots.txt commands. There are indications that Google now recognizes the "Allow" command. Please refer:
http://www.google.com/webmasters/faq.html#12

Bad Syntax: Do not put multiple file URLs in one Disallow line in the robots.txt file. Use a new Disallow line for every directory that you want to block access to. Incorrect example :

User-agent: *
Disallow: /concepts/ /links/ /images/

Correct example:

User-agent: *
Disallow: /concepts/
Disallow: /links/
Disallow: /images/

Files and directories: If a specific file has to be disallowed, end it with the file extension and without a forward slash at the end. Study the following example:

For file:

User-agent: *
Disallow: /hilltop.html

For Directory:

User-agent: *
Disallow: /concepts/

Remember, if you have to block access to all files in a directory, you don't have to specify each and every file in robots.txt. You can simply block the directory as shown above. Another common error is leaving out the slashes altogether. This would leave a very different message than intended, as the sketch below illustrates.
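
As an illustration of the slash issue (a sketch; /concepts is a placeholder path): under the standard, a Disallow value is a path prefix, so the two alternative files below behave differently.

# Blocks /concepts/, but also /concepts.html and /concepts-old/
User-agent: *
Disallow: /concepts

# Blocks only the /concepts/ directory and the files inside it
User-agent: *
Disallow: /concepts/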

The right location : No robot will access a badly placed robots.txt file. Make sure that the location is www.example.com/robots.txt.

Capitalization : Never capitalize your syntax commands. Directory and file names are case sensitive on Unix platforms. The only capitals used per the standard are: "User-agent" and "Disallow".

Correct Order : If you want to block access for all but one or more robots, then the specific ones should be mentioned first. Let's study this robots.txt example:

User-agent: *
Disallow: /

User-agent: MSNBot
Disallow:

In the above case, MSNBot would simply leave the site without indexing after reading the first command. The correct syntax is:

User-agent: MSNBot
Disallow:

User-agent: *
Disallow: /

The robots.txt file: Presence - Not having a robots.txt file at all could generate a 404 error for search engine robots, which could redirect the robot to the default 404 error page or your customized 404 error page. If this happens seamlessly, it is up to the robot to decide whether the target file is a robots.txt file or an HTML file. Typically this would not cause many problems, but you may not want to risk it. It's always a better idea to put a standard robots.txt file in the root directory than not to have one at all.

The standard robots.txt file for allowing all robots to index all pages is:

User-agent: *
Disallow:

Using # carefully in the robots.txt file: Adding "#" comments after syntax commands on the same line is not a good idea. Some robots might misinterpret the line, although it is acceptable per the robots exclusion standard. New lines are always preferred for comments.

Using the robots.txt file

1. Robots are configured to read text. Too much graphic content could render your pages invisible to the search engine. Use the robots.txt file to block irrelevant and graphic-only content.

2. Indiscriminate access to all files, it is believed, can dilute relevance to your site content after being indexed by robots. This could seriously affect your site’s ranking with search engines. Use the robots.txt file to direct robots to content relevant to your site’s theme by blocking the irrelevant files or directories.

3. The file can be used for multilingual websites to direct robots to relevant content for relevant topics for different languages. It ultimately helps the search engines to present relevant results for specific languages. It also helps the search engine in its advanced search options where language is a variable.

4. Some robots could cause severe server loading problems by rapid firing too many requests at peak hours. This could affect your business. By excluding some robots that might be irrelevant to your site, in the robots.txt file, this problem can be taken care of. It is really not a good idea to let malevolent robots use up precious bandwidth to harvest your emails, images etc.

5. Use the robots.txt file to block out folders with sensitive information, text content, demo areas or content yet to be approved by your editors before it goes live.

The robots.txt file is an effective tool to address certain issues regarding website ranking. Used in conjunction with other SEO strategies, it can significantly enhance a website’s presence on the net.

Article Source :http://www.seo1.net/thread1534.html

Sunday, August 24, 2008

Useful Analysis Tools for Keyphrase and Keyword Analysis

1. Google AdWords
2. Google Traffic Estimator
3. Google Trends Tool
4. Google Suggest
5. WordTracker
6. Overture
7. SEO Keyword Analysis
8. Keyword Analyzer

For more Information visit

Keyword Analysis Tool

Tuesday, August 12, 2008

Keyword Analysis: An Important Role in SEO Services

Keywords play an important role in search engine optimization and website promotion. Your website can be listed in search engines only when a visitor or user types those keywords into a search engine to find information about his requirement. The keywords should be present in your website's page content, so the selection of keywords is of great importance in SEO. Select those keywords that are most popular with end users and that match your website content. Good keyword analysis is the first step in an effective search engine marketing campaign. If you don't choose good keywords, all efforts to boost your ranking will be wasted.

Brainstorming is a group creativity technique, a process for generating the broadest possible list of keywords for your web site. Read your competitors' web sites and pick up alternate phrasings of common terms. Don't forget that your customers are a great source of keywords.

Simply targeting traffic is not enough; the main concern should be getting relevant web traffic and search engine rankings on the best keywords and phrases for the website, in order to gain online sales and enquiries. Wrong keywords mean wrong optimization: if you choose keywords that no one is searching for, your site will never be found, and if you choose keywords that are far too competitive, your site will have a difficult time making it to the top of the rankings.

Good Keyword Research = Better Web Traffic + Better Online Sales

For an example, use Analyze Keywords.

Thursday, July 31, 2008

Techniques Search Engines Hate

There are some tactics, considered spam and called spamming techniques, that are used by spammers:

Mirror Sites/duplicate content / Duplicate Sites
Javascript Redirects Doorway pages
Dynamic Real Time Page Generation
Keyword stuffing and Page Swapping
Link stuffing and Link Farming
DHTML layering and Hidden text
Links inside No Script Tags
Use irrelevant Keywords
Use duplicate Domains
HTML invisible cells
Use Domain Spam
Redirects

Black Hat SEO Techniques That Search Engines Hate

1. Hidden Content:

Hidden content is not visible to the end user of the site. Comment tags look like this:

<!-- Comment Tag -->

A comment tag being used incorrectly:

<!-- seo company india, seo india, seo services india, internet marketing india -->

Use of the noscript tag.

The noscript tag should be used to inform a user that a script is being used on the page but their browser either doesn't support the scripting language used or has scripting disabled.

Similar tricks include the noframes tag and hidden inputs in forms.

2.Meta Keyword Stuffing:

There are two Meta tags that are generally used to inform search engines about the content of a page. They reside inside the head tag of the page. The meta description should describe the content of your page in one or two sentences, three at most. Usually webmasters will put repeated keywords toward the bottom of the page where most visitors can't see them. They can also use invisible text - text with a color matching the page's background. Since search engine spiders read content through the page's HTML code, they detect text even if people can't see it. Some search engine spiders can identify and ignore text that matches the page's background color.
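
For illustration only, the invisible-text trick described above might look like this in HTML (a sketch of a pattern to avoid; the keywords are hypothetical):

<body bgcolor="#ffffff">
<p>Normal visible page content goes here.</p>
<!-- White-on-white text: invisible to visitors, readable by spiders, and penalized when detected -->
<font color="#ffffff">seo services seo company cheap seo best seo india</font>
</body>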

3. Meta Keywords:

Meta Keywords should be a short list of words that indicate the main focus of the page.

4.Doorway Pages:

Doorway or Gateway pages are pages designed for search engines and not for the user. They are basically fake pages stuffed with content and highly optimised for one or two keywords, which link to a target or landing page. The end user never sees these pages, because they are automatically redirected to the target page.

5.Page stuffing :

Webmasters first create a web page that appears high up in search results, and then make a duplicate of it - a different page with the same content - hoping it will also rank highly. Most search engine spiders are able to compare pages against each other and determine whether two different pages have the same content.

6. Link Farming:

Link farming is a popular black hat SEO technique. Because many search engines look at links to determine a web page's relevancy, some webmasters buy links from other sites to boost a page's rank. A link farm is a collection of web pages that all interlink with one another in order to increase each page's rank.

7.Cloaking:

In cloaking, the content presented to the search engine spider is different from that presented to the user's browser. This technology is commonly used to present differing content to each search engine. The purpose of cloaking is to mislead search engines so they display a page that would not otherwise be displayed. A website that uses the cloaking method is considered spam.

Monday, July 28, 2008

What is Web 2.0 Technology and What Features Does It Add

Web 2.0 is a trend in World Wide Web technology and web design. It is the second generation of web-based communities and hosted services (such as social-networking sites, wikis and blogs) which aim to facilitate creativity, collaboration, and sharing between users. It includes a social element where users generate and distribute content, often with the freedom to share and re-use it. This can result in a rise in the economic value of the web to businesses, as users can perform more activities online.
Web 2.0 techniques:
CSS to aid in the separation of presentation and content
Folksonomies (collaborative tagging, social classification, social indexing, and social tagging)
REST and/or XML
Rich Internet application techniques, often Ajax and Flash
Semantically valid XHTML and HTML
Syndication, aggregation and notification of data in RSS or Atom feeds
Merging content from different sources, client- and server-side
Weblog-publishing tools
Wiki or forum software to support user-generated content
Internet privacy features that let users manage their own privacy by cloaking or deleting their own user content or profiles
Syndication is a key Web 2.0 feature, involving as it does standardized protocols which permit end-users to make use of a site's data in another context (such as another website, a browser plugin, or a separate desktop application). Protocols which permit syndication include RSS (Really Simple Syndication - also known as "web syndication"), RDF (as in RSS 1.1), and Atom, all of them XML-based formats. Observers have started to refer to these technologies as "web feeds" as the usability of Web 2.0 evolves and the more user-friendly feeds icon supplants the RSS icon.
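
As a small illustrative sketch (the feed URL is hypothetical), a site typically advertises its feed for autodiscovery with a link tag in the page head:

<head>
<link rel="alternate" type="application/rss+xml" title="Example RSS Feed" href="http://www.example.com/feed.xml">
</head>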

Saturday, June 28, 2008

Relation Between Meta Tags and SEO

Meta tags are one of the very important tasks in SEO. Meta tags provide information about your website to the search engines and also to users. Unique and meaningful meta tags are useful for attracting users to your website. Meta tags are created after the completion of keyword research. Meta tags improve the position of your web site's pages in all the search engines; they are a tool that will improve your position in search engines. The reason is that search engines mostly look at the body of your web site's pages to rank them, so to get high search engine rankings, put the main keywords in the meta tags as well.
Do This Before Putting in the Meta Tags

  • Select the relevant keywords.
  • All site content based on these keywords.
  • Title tag using the same keywords.
  • Description tag based on these keywords.

The meta tags on every page of your website should be different. You should spend as much time and effort researching what visitors are typing into the search box at the search engines as you spend on designing your pages. On one page we carefully selected our META TAGS; on the other we did not.

1.META DESCRIPTION: The tag looks like:

<meta name="DESCRIPTION" content="The text that will appear for the website page.">

The description must be interesting and relevant. You should not go beyond about 500 characters, and keywords should not be repeated, otherwise it is taken as spamming.

2.META KEYWORDS: This tag looks like this:

<meta name="KEYWORDS" content="All keywords related to the web site">

Make sure your keywords are relevant to your web site page content and not repeated.
3.TITLE: The Title tag looks like:

<title>Secrets of META TAGS explained</title>
The most important tag is the TITLE. This is the heading people will see in the Search Engines and is what will make them click on your link or not. Be sure to make your Title as interesting and eye catching as possible.

4.Author: The name of the person who authored the page:

<meta name="author" content="Emory Rowland">

5.Revisit: How often to tell the spiders to come back:

<meta name="revisit-after" content="10 days">

6.Refresh: This field must contain a URL that refers the page to another link after a specified number of seconds:

<meta http-equiv="refresh" content="120; url=http://www.mywebsite.com">

7.Expires: The date when the content expires:

<meta http-equiv="expires" content="Wed, 15 Jan 2006 12:20:25 GMT">

8.Robots: This tag looks like this:

<meta name="robots" content="nofollow">

The attribute values for this tag are:

All – Robots may traverse and index the page

No Index – Robots may traverse but not index the page

No Follow – Robots may index the page but not follow the links on it

None – Robots may neither index nor traverse the page.

Friday, June 27, 2008

What is Off Page Optimization, and Factors for Getting One Way Links

Off Page Optimization consists of strategies for getting traffic to your web page and building one way links, to maximize the performance of your website in search engines. These techniques include the following:

  • One way link Exchange
  • Directory Submission
  • Article submission
  • Submission of classified ads
  • Press Release Submission
  • Social Network site submission
  • Rss submission
  • Atom submission of article
  • Forum Posting
  • Blog Posting

Wednesday, June 25, 2008

Factors Of the On Page Optimization

1. Domain Name : Choose a domain name that allows you to include your most important keyword. Always use a domain name containing a keyword.

2. TITLE Tag : Use different Title tags for all pages. The most important sentence on any web page is the title tag. The title tag gives the search engine a good indication of what your page is all about. It should not contain more than 10-12 words.

3. Header Tag: Use the h1 tag for headings and h2 for sub-headings.

4. Description Tag : The description tag is the most important paragraph on any page. It should contain not more than around 25-30 words.

5.Keyword Tag: This is a very important part of on page optimization. It contains all the keywords related to your web site that users type when searching for particular information. It should not exceed 10 to 20 words, and keywords should not be repeated, to avoid looking like spam.

6.Alt Tags : Image alternate-text tags are especially important when an image is part of a hyperlink. Use keywords in the image tag. A combined sketch of these on-page elements follows below.
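
Here is a hedged sketch pulling these on-page factors together; the domain, keywords, file names and text are all hypothetical:

<html>
<head>
<title>Keyword Analysis Services - Example SEO Company</title>
<meta name="description" content="Keyword analysis and on page optimization services for better search engine rankings.">
<meta name="keywords" content="keyword analysis, on page optimization, seo services">
</head>
<body>
<h1>Keyword Analysis Services</h1>
<h2>Why Keyword Research Matters</h2>
<p>Page content built around the selected keywords goes here.</p>
<a href="http://www.example-seo-company.com/services.html"><img src="keyword-analysis.gif" alt="keyword analysis report"></a>
</body>
</html>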

Tuesday, June 24, 2008

Type Of Web Page Optimization

1. On Page Optimization
2. Off page Optimization

On Page Optimization: On Page Optimization is a way of making your web site search engine friendly by making your code compliant with search engine standards. This process will not increase your PageRank, but it will make it easier for search engines to know what your webpage is about, thus making it possible for search engines to send targeted traffic to your webpage.

Friday, June 20, 2008

What is a Search Engine

A search engine is an online database containing different types of information from all over the world, where we find information using specified keywords and get back a list of documents. It has several components:
1. Search engine software
2. Spider software
3. Relevancy algorithm
There are a lot of search engines on the internet; some of those most used by users are:
1.Google
2.MSN
3.Yahoo
4.Altavista

Working Of Search Engine Optimization

The first basic truth you need to learn about SEO is that search engines are not humans. While this might be obvious for everybody, the differences between how humans and search engines view web pages aren't. Unlike humans, search engines are text-driven. Although technology advances rapidly, search engines are far from intelligent creatures that can feel the beauty of a cool design or enjoy the sounds and movement in movies. Instead, search engines crawl the Web, looking at particular site items (mainly text) to get an idea what a site is about. This brief explanation is not the most precise because as we will see next, search engines perform several activities in order to deliver search results – crawling, indexing, processing, calculating relevancy, and retrieving.

First, search engines crawl the Web to see what is there. This task is performed by a piece of software, called a crawler or a spider (or Googlebot, as is the case with Google). Spiders follow links from one page to another and index everything they find on their way. Having in mind the number of pages on the Web (over 20 billion), it is impossible for a spider to visit a site daily just to see if a new page has appeared or if an existing page has been modified. Sometimes crawlers will not visit your site for a month or two, so during this time your SEO efforts will not be rewarded. But there is nothing you can do about it, so just keep quiet.

What you can do is check what a crawler sees on your site. As already mentioned, crawlers are not humans and they do not see images, Flash movies, JavaScript, frames, password-protected pages and directories, so if you have tons of these on your site, you'd better run a spider simulator to see if these goodies are viewable by the spider. If they are not viewable, they will not be spidered, not indexed, not processed, etc. - in a word, they will be non-existent for search engines.

After a page is crawled, the next step is to index its content. The indexed page is stored in a giant database, from where it can later be retrieved. Essentially, the process of indexing is identifying the words and expressions that best describe the page and assigning the page to particular keywords. For a human it will not be possible to process such amounts of information but generally search engines deal just fine with this task. Sometimes they might not get the meaning of a page right but if you help them by optimizing it, it will be easier for them to classify your pages correctly and for you – to get higher rankings.

When a search request comes, the search engine processes it – i.e. it compares the search string in the search request with the indexed pages in the database. Since it is likely that more than one page (practically, millions of pages) contains the search string, the search engine starts calculating the relevancy of each of the pages in its index to the search string.

There are various algorithms to calculate relevancy. Each of these algorithms has different relative weights for common factors like keyword density, links, or metatags. That is why different search engines give different search results pages for the same search string. What is more, it is a known fact that all major search engines, like Yahoo!, Google, MSN, etc. periodically change their algorithms and if you want to keep at the top, you also need to adapt your pages to the latest changes. This is one reason (the other is your competitors) to devote permanent efforts to SEO, if you'd like to be at the top.

The last step in search engines' activity is retrieving the results. Basically, it is nothing more than simply displaying them in the browser – i.e. the endless pages of search results that are sorted from the most relevant to the least relevant sites.
Article Source:http://www.webconfs.com/seo-tutorial/introduction-to-seo.php

Search Engine Optimization

Search Engine Optimization (SEO) is often considered the more technical part of Web marketing. This is true because SEO does help in the promotion of sites and at the same time it requires some technical knowledge – at least familiarity with basic HTML. SEO is sometimes also called SEO copywriting because most of the techniques that are used to promote sites in search engines deal with text. Generally, SEO can be defined as the activity of optimizing Web pages or whole sites in order to make them more search engine-friendly, thus getting higher positions in search results.

One of the basic truths in SEO is that even if you do all the things that are necessary to do, this does not automatically guarantee you top ratings, but if you neglect basic rules, this certainly will not go unnoticed. Also, if you set realistic goals – i.e. to get into the top 30 results in Google for a particular keyword, rather than being number one for 10 keywords in 5 search engines – you will feel happier and more satisfied with your results.

Although SEO helps to increase the traffic to one's site, SEO is not advertising. Of course, you can be included in paid search results for given keywords but basically the idea behind the SEO techniques is to get top placement because your site is relevant to a particular search term, not because you pay.

SEO can be a 30-minute job or a permanent activity. Sometimes it is enough to do some generic SEO in order to get high in search engines – for instance, if you are a leader for rare keywords, then you do not have a lot to do in order to get decent placement. But in most cases, if you really want to be at the top, you need to pay special attention to SEO and devote significant amounts of time and effort to it. Even if you plan to do some basic SEO, it is essential that you understand how search engines work and which items are most important in SEO.
Article Source:http://www.webconfs.com/seo-tutorial/introduction-to-seo.php