Thursday, July 31, 2008

Techniques Search Engines Hate

There are some tactics, commonly called spamming techniques, that search engines treat as spam. Spammers use techniques such as:

Mirror sites / duplicate content / duplicate sites
JavaScript redirects and doorway pages
Dynamic real-time page generation
Keyword stuffing and page swapping
Link stuffing and link farming
DHTML layering and hidden text
Links inside noscript tags
Irrelevant keywords
Duplicate domains
Invisible HTML table cells
Domain spam
Redirects

Black Hat Techniques Used in SEO.

1. Hidden Content:

Hidden content is not visible to the end user of the site. One common hiding place is the HTML comment tag, which looks like this:

<!-- Comment Tag -->

Here is a comment tag being used incorrectly, stuffed with keywords:

<!-- seo company india, seo india, seo services india, internet marketing india -->

Use of the noscript tag.

The noscript tag should be used to inform a user that a script is running on the page but that their browser either does not support the scripting language or has scripting turned off. Spammers instead fill it with keywords and links that only search engine spiders will read.

Similar hiding places include the noframes tag and hidden inputs in forms.
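A sketch of these hiding places, using made-up keywords and a placeholder URL, might look like this:

<noscript>
  seo company india, seo india, seo services india
  <a href="http://example.com/landing-page.html">seo services india</a>
</noscript>
<form action="http://example.com/search" method="get">
  <!-- a hidden input stuffed with keywords the visitor never sees -->
  <input type="hidden" name="keywords" value="seo india, seo services india">
</form>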

2. Meta Keyword Stuffing:

Two meta tags are generally used to inform search engines about the content of a page; they reside inside the head tag. The meta description should describe the content of your page in one or two sentences, three at most. Webmasters often stuff repeated keywords toward the bottom of the page where most visitors can't see them, or use invisible text: text whose color matches the page's background. Since search engine spiders read content through the page's HTML code, they detect text even when people can't see it. Some spiders, however, can identify and ignore text that matches the page's background color.
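As a rough sketch (the keywords, colors, and text here are invented for illustration), a page abusing both tricks might look like this:

<head>
  <meta name="description" content="A short, honest summary of the page in one or two sentences.">
</head>
<body style="background-color: #ffffff">
  <p>Normal page content the visitor actually reads.</p>
  <!-- invisible text: the color matches the white background, so only spiders notice it -->
  <p style="color: #ffffff">seo india seo india seo services india internet marketing india</p>
</body>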

3. Meta Keywords:

Meta keywords should be a short list of words that indicate the main focus of the page.
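For instance, a sensible tag for a page about this topic might be (the terms below are only illustrative):

<!-- illustrative terms only, not a recommended list -->
<meta name="keywords" content="black hat seo, hidden text, doorway pages">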

4. Doorway Pages:

Doorway or gateway pages are pages designed for search engines and not for the user. They are basically fake pages, stuffed with content and highly optimised for one or two keywords, that link to a target or landing page. The end user never sees these pages because they are automatically redirected to the target page.
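A minimal sketch of a doorway page, assuming a meta refresh handles the redirect (JavaScript redirects are the other common route; the URL is a placeholder):

<html>
<head>
  <title>seo services india | seo company india</title>
  <!-- the visitor is bounced to the real landing page before seeing any of this -->
  <meta http-equiv="refresh" content="0; url=http://example.com/landing-page.html">
</head>
<body>
  <h1>seo services india</h1>
  <p>Keyword-stuffed copy written only for the spider, never for a human reader.</p>
</body>
</html>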

5. Page Stuffing:

Webmasters first create a Web page that appears high up in search results, then duplicate it, so that several different pages with the same content all rank well. Most search engine spiders are able to compare pages against each other and determine whether two different pages have the same content.

6. Link Farming:

Link farming is a popular black hat SEO technique. Because many search engines look at links to determine a Web page's relevancy, some webmasters buy links from other sites to boost a page's rank. A link farm is a collection of Web pages that all interlink with one another in order to increase each page's rank.
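A link farm page is little more than a wall of reciprocal links; a sketch with placeholder domains:

<!-- every site in the farm carries this same block, so each page links to all the others -->
<div>
  <a href="http://site-a.example/">site a</a>
  <a href="http://site-b.example/">site b</a>
  <a href="http://site-c.example/">site c</a>
  <!-- ...dozens more links, unrelated to the page's actual topic... -->
</div>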

7. Cloaking:

In cloaking, the content presented to the search engine spider is different from that presented to the user's browser. This is commonly used to present different content to each search engine. The purpose of cloaking is to mislead search engines into displaying a page that would otherwise not be shown. A website that uses cloaking is considered spam.
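The effect can be sketched as two versions of the same URL, with the server choosing which one to send based on the User-Agent header (both pages below are invented for illustration):

<!-- version served when the User-Agent looks like a search engine spider -->
<html>
<body>
  <h1>seo services india</h1>
  <p>Keyword-rich, well-structured copy written only for the spider.</p>
</body>
</html>

<!-- version served to a normal browser at the same URL -->
<html>
<body>
  <h1>Buy now!</h1>
  <p>The ad-heavy page the spider never sees.</p>
</body>
</html>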

Monday, July 28, 2008

What Is Web 2.0 Technology and Its New Features

Web 2.0 is a trend in World Wide Web technology and web design. It is the second generation of web-based communities and hosted services (such as social-networking sites, wikis, and blogs) which aim to facilitate creativity, collaboration, and sharing between users. It includes a social element where users generate and distribute content, often with the freedom to share and re-use it. This can raise the economic value of the web to businesses, as users can perform more activities online.
Web 2.0 techniques:
CSS to aid in the separation of presentation and content
Folksonomies (collaborative tagging, social classification, social indexing, and social tagging)
REST and/or XML web services
Rich Internet application techniques, often Ajax and Flash (see the sketch after this list)
Semantically valid XHTML and HTML
Syndication, aggregation, and notification of data in RSS or Atom feeds
Merging of content from different sources, client- and server-side
Weblog-publishing tools
Wiki or forum software to support user-generated content
Internet privacy controls that let users manage their own privacy by cloaking or deleting their own user content or profiles
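As promised in the list above, here is a minimal Ajax sketch: an HTML page that fetches a fragment from the server and updates itself without a full page reload (the /latest-comments URL is a placeholder):

<html>
<body>
  <div id="comments">Loading...</div>
  <script type="text/javascript">
    // fetch a fragment from the server and drop it into the page in place
    var xhr = new XMLHttpRequest();
    xhr.onreadystatechange = function () {
      if (xhr.readyState === 4 && xhr.status === 200) {
        document.getElementById("comments").innerHTML = xhr.responseText;
      }
    };
    xhr.open("GET", "/latest-comments", true); // placeholder URL
    xhr.send(null);
  </script>
</body>
</html>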
Syndication is a key Web 2.0 feature, involving as it does standardized protocols which permit end-users to make use of a site's data in another context (such as another website, a browser plugin, or a separate desktop application). Protocols which permit syndication include RSS (Really Simple Syndication, also known as "web syndication"), RDF (as in RSS 1.1), and Atom, all of them XML-based formats. Observers have started to refer to these technologies as "web feeds" as the usability of Web 2.0 evolves and the more user-friendly feed icon supplants the RSS icon.
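As a rough illustration, a minimal RSS 2.0 feed (all titles and URLs below are placeholders) looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <link>http://example.com/</link>
    <description>A placeholder feed for illustration.</description>
    <item>
      <title>First post</title>
      <link>http://example.com/first-post.html</link>
      <description>A short summary of the post.</description>
    </item>
  </channel>
</rss>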