
Here are some tips and tools on search engine optimization:

SEO is a specialized and vast topic that cannot be fully covered in such a small article, but a few tips are given below, organized into the following categories:
  1. Content quality.
  2. Web site structure.
  3. HTML guidelines.
  4. Search-friendly URLs.
  5. Off-page submissions, such as blog and forum submissions, press release submissions, article submissions, directory submissions, etc.
  6. Social bookmarking and link building.
  7. Meta tags, i.e. title, keywords and description optimization.
A. Content Quality

Good content is the key factor for search engine optimization (SEO): we have to ensure the quality of the content.

While designing an application and populating content, we have to think of users first and search engines second, and make sure the website has good-quality content, because most search engines evaluate pages the same way. Some tips for creating quality content are given below.
1. Identify original, unique, useful words and their synonyms as keyword(s) and phrases to describe each page, i.e. the selected words should be concise and specific to the page.

These keyword(s)/phrases can then be used in various HTML tags (for SEO). There are various tools which are helpful in finding the right keywords for pages, e.g. the Google AdWords Keyword Tool, Keyword Discovery, the SEO Book keyword tool, Keyword Box, the Yahoo keyword tool, and Wordtracker.

It is advisable to use phrases to describe the page instead of single-word descriptions, and there are a couple of reasons for this:
a. When performing searches, people nowadays search for phrases or sets of words instead of just one word.
b. Competition is too high for single words.
c. Phrases help differentiate our web page or site from competitors.
B. Web Site Structure

1. Make sure the site conforms to W3C standards; the W3C validator tools can help in achieving standardization.
2. Make sure the site hierarchy is flat and navigation is simple. Content pages should be no more than three clicks from the home page.

3. Categorize web pages – the better the structure of the site, the easier it is to target the market. The structure of the site always plays a key role in SEO, so before actually starting to build the site it is advisable to carefully plan its structure, i.e. how the web pages will be categorized. For e.g., if you are in the business of HR consultancy to different types of industries, try to create a separate structure for each industry describing your offerings and specialization related to that industry, and incorporate very specific keyword(s)/phrases for each.
4. Provide web feeds (a.k.a. syndicated feeds). A web feed is a document or communications channel which contains content items with web links to longer versions. It is a mechanism to share content (not visual representation) over the web. Websites or applications subscribe to these feeds and then render the content in the required layout. Two of the most widely used web feed formats are RSS and Atom.

a. RSS (Really Simple Syndication) – an XML-based format used to publish frequently updated content like news, blogs etc. RSS allows readers not just to link to a page but to subscribe to it, with notification every time that page changes, so the information is updated in an automated manner. A feed can contain a summary of content from an associated web site or the full text. For more, please read the RSS Wiki.
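As a sketch, a minimal RSS 2.0 feed for the kind of frequently updated content described above might look like this (the channel name, links and dates are placeholders, not taken from any real site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example News</title>
    <link>http://www.example.com/</link>
    <description>Frequently updated news items</description>
    <item>
      <title>First news item</title>
      <link>http://www.example.com/news/1</link>
      <description>Short summary of the item, or the full text.</description>
      <pubDate>Mon, 01 Jan 2007 12:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>
```

Subscribers poll this file and are notified of new `<item>` entries as they appear.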
b. Atom – also an XML-based content and metadata syndication format used for publishing frequently updated content like news, blogs etc. Atom was developed to overcome the many incompatible versions of the RSS syndication format, all of which had shortcomings, and their poor interoperability. For a list of the differences between the two formats, go through http://www.intertwingly.net/wiki/pie/Rss20AndAtom10Compared and the Atom Wiki.
There are lots of free online and downloadable RSS and Atom generators and converters available (e.g. rssfeedssbumit) that can be used. W3C also provides a validator tool for RSS and Atom: the W3C RSS/ATOM Validator.
5. Add a sitemap to your website. A sitemap is a file (.xml, .htm, .html, .txt) which contains a structured list of URLs for a site, allowing intelligent and smooth crawling by search engines. Sitemaps are a URL inclusion protocol and complement robots.txt (a URL exclusion protocol). In addition to the independent ROR XML format, most search engines support other formats like RSS and Atom, and Google, Yahoo and Microsoft also support the Sitemap protocol. It is advisable to keep the number of links in a sitemap within 100; if that is not feasible, break the sitemap into separate pages. A brief introduction to both ROR and the Sitemap protocol is given below.
a. ROR (Resources of a Resource) is an independent XML format for describing any object of your content in a generic fashion, so any search engine can better understand the content. Think of a ROR feed as a powerful structured feed for describing all objects to the search engines: sitemap, products, services, reviews, feeds, discounts, images, events, schedules, podcasts, archives and much more. Tools like the ROR Feed Generator, XML-Sitemaps and the ROR Sitemap Generator make it easy to create a ROR feed.

b. The Sitemap protocol is an XML format that lists the URLs for a site along with additional metadata about each URL (when it was last updated, how often it usually changes, and how important it is relative to other URLs in the site) so that search engines can crawl the site more intelligently. For more info on the Sitemap protocol, please visit http://www.sitemaps.org/, XML-Sitemaps, sitemap pals and Google Sitemaps.
Sitemap information is typically placed in the website's main directory; for e.g., the ROR feed is stored in ror.xml and the sitemap in sitemap.xml in the main directory. So it would be good if every website had ror.xml, sitemap.xml and any search-engine-specific sitemap files, for e.g. urllist.txt (for Yahoo).
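For illustration, a minimal sitemap.xml in the Sitemap protocol format could look like the following (the URL, date, change frequency and priority are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2007-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```

Each additional page gets its own `<url>` entry, keeping the total within the 100-link guideline mentioned above.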
6. Robot exclusion techniques – robots.txt is a file which permits or denies robots or crawlers access to different areas of a website. Under this protocol, every spider/crawler/bot first looks for this file in the main directory of the website and then proceeds based on the information available in it. Still, there may be some spiders/crawlers/bots that overlook this file and continue as they want. For more info on robots.txt please visit http://www.robotstxt.org/
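For example, a simple robots.txt (the directory names and bot name below are hypothetical) might allow all robots everywhere except a couple of private areas, while excluding one particular bot entirely:

```text
# Rules for all robots: keep them out of non-content areas.
User-agent: *
Disallow: /admin/
Disallow: /tmp/

# Exclude one specific bot from the whole site.
User-agent: BadBot
Disallow: /
```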
7. Appropriate handling of HTTP status codes – dealing correctly with HTTP status codes helps not only in preserving link equity but also in avoiding getting delisted from search engines.
a. 404 status code – As a site ages, chances are that some pages become unavailable over time for many reasons, say the content is no longer relevant or the offering is no longer available. In this case, instead of displaying a "404 Page Not Found" error, it is advisable to either redirect the request to a related page or display a customized message. Following this helps preserve link equity.
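As a sketch, assuming an Apache server, both options can be configured in a few lines (the file names are hypothetical):

```apache
# Serve a customized page instead of the server's default 404 message.
ErrorDocument 404 /page-not-found.html

# Permanently redirect a retired page to its closest replacement,
# preserving the link equity of existing inbound links.
Redirect 301 /old-offer.html /current-offers.html
```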

b. 301 and 302 status codes – Avoid chaining multiple redirections. By redirection we mean that when a page is requested by a user, it is redirected to a different page because of a change in URL (HTTP codes 301, 302); redirection can also happen automatically. The idea is to avoid a scenario which involves multiple redirections for one request, e.g. a request for page A is redirected to B, then to C, and so on, with the final page displayed only after several hops.
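Assuming Apache again, the difference between a redirect chain and single-hop redirects can be sketched as follows (the paths are hypothetical):

```apache
# Bad: a request for /a is redirected to /b, which redirects again to /c,
# so every visitor and spider pays for two hops.
Redirect 301 /a /b
Redirect 301 /b /c

# Better: point both old URLs straight at the final destination.
Redirect 301 /a /c
Redirect 301 /b /c
```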

c. 500 status code – In case of downtime or non-availability of resources or the site, it is always advisable to return an HTTP 500 status code with a relevant message instead of displaying a 404 page, a blank page, or a page full of database connection or resource access errors. With a proper 500 response, search engines will neither index nor re-index these pages in their broken state and will not delist the page or site.
C. HTML Guidelines

Developing search-friendly web pages is both an art and a science, and it requires conscious effort. To make a web page search friendly, there are certain things that should be done and certain things that should be avoided. In this section we discuss both aspects side by side, but first let us see what should be done.

1. Make sure that every page is well-formed HTML or XHTML. Validator tools like the W3C validator tool and the SEO Worker validator are available and can be used.

2. The size of a page should neither be too small (<5K) nor too large (>15K). Try to limit it to a reasonable size.

3. Make sure all key information and business pages, including the home page, load fast. Try to keep page loading time within or around 8 seconds. One way of improving page loading time is the use of optimized images.

4. Try to restrict the number of links on a given page to a reasonable number (say <=100).

5. Use keyword(s)/phrases in the site navigation structure.

6. Title tag – Define a title tag for every single page, as this is one of the most important SEO techniques. Use unique, original, strong and exact keyword(s)/phrases that best describe the web page. For a high page ranking, carefully choose the keywords/phrases that become part of the page title.
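For instance, a title built from the page's identified phrase (the product and site names below are made up for illustration) would look like:

```html
<head>
  <!-- Unique, page-specific phrase first; site name last -->
  <title>Mens Brown Leather Shorts | MyExample Store</title>
</head>
```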

7. Meta tag usage – The Author, Description and Keywords meta tags should be used on every page, and each of these tags has its own significance and role. (The description meta tag doesn't help much in improving page ranking.)
a. Author meta tag – Always define this tag with the name of the company that owns the site, as this helps to get the #1 position in search results for queries containing the company's name.
b. Description meta tag – Use the description tag to describe what the page is about. The content of this tag often appears as the text snippet below a search result listing.
c. Keywords meta tag – The keywords meta tag helps search engines categorize the web page and site so that users can easily and quickly find pages. Optimized usage of quality keyword(s) is advisable, as some search engines put limits on the number of meta keywords.
d. Robots meta tag – Indicates to visiting robots whether a page can be indexed or used to harvest more links. No server administrator action is required for this, though as of now only a few spiders/bots honor it.
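Putting the four tags together, a page head might look like this sketch (all names and text are placeholders):

```html
<head>
  <meta name="author" content="MyExample Company Ltd.">
  <meta name="description"
        content="Mens brown leather shorts in all sizes - the text search engines often show as the result snippet.">
  <meta name="keywords" content="mens shorts, brown leather shorts, summer wear">
  <!-- Allow this page to be indexed and its links followed -->
  <meta name="robots" content="index, follow">
</head>
```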

8. Heading and bold tags – Using relevant keyword(s)/phrases in heading tags also makes a good contribution to SEO. Use relevant keyword(s) in heading tags (h1, h2, h3, ...), and it is good practice to restrict the h1 tag to one per page. Headings should be sequenced so that h2 follows h1, and so on. Similarly, try to use all the relevant keyword(s)/phrases in bold tags (such as b and strong) at least once in a page.

9. ALT attributes – Use relevant keyword(s)/phrases at relevant places in ALT attributes, but don't overdo it, because that could result in the page being dropped from search results or, even worse, banned for life. The use of this attribute becomes more important when a link is created for an image.

10. Please make sure that the keyword(s)/phrases we want indexed are part of the image file name; if that is not possible, then try to use them outside the image, within the ALT attribute.
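For example (the file name and text are invented for illustration), the key phrase can appear both in the image file name and in the ALT attribute:

```html
<img src="/images/mens-brown-leather-shorts.jpg"
     alt="Mens brown leather shorts, front view">
```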

11. Use the identified keyword(s)/phrases of the linked page (the page to which you are linking) as your hyperlink text.

12. Anchor text – Use keyword(s)/phrases as anchor text when linking pages internally. This tells spiders what the linked-to page is about. Generic phrases like "click here", "view this" or "see this" don't add any value for SEO.
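The contrast can be illustrated with a hypothetical internal link:

```html
<!-- Generic anchor text: tells spiders nothing about the target page -->
<a href="/product/Mens_Short_brown">Click here</a>

<!-- Keyword-rich anchor text: describes the linked-to page -->
<a href="/product/Mens_Short_brown">mens brown leather shorts</a>
```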

D. Search-friendly URLs

SEO techniques are not limited to what information is placed, where it is placed or how it is placed; they also cover how it is accessed. Web pages are accessed using URLs, and every day more and more websites come up which have dynamic pages in addition to static HTML pages. As we know, dynamic pages are pages which get generated based on certain parameters passed to them. We should make these dynamic pages SEO friendly too, because they also contain vital information that should be searchable. So, in addition to the guidelines given above, we need to make dynamic-page URLs search friendly. Generally, spiders/bots don't crawl pages whose URLs contain certain characters like "?", i.e. they just don't crawl pages which accept query parameters, but up to a certain level we can still make a search-friendly URL, and tips for the same are given below.

1. Use identified keyword(s)/phrases in the URL of the page and in file names, and replace IDs with text names (keywords) in the URL. For e.g., the URL http://www.myexample.com/product/Mens_Short_brown is preferred over http://www.myexample.com/productPage?id=122, even when both point to the same page.
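One common way to serve such URLs, assuming an Apache server with mod_rewrite (the script name and parameter are assumptions based on the example above), is to rewrite the friendly URL back to the dynamic page internally:

```apache
RewriteEngine On
# Visitors and spiders request /product/Mens_Short_brown; the application
# still receives the id-based dynamic URL internally (no external redirect).
RewriteRule ^product/Mens_Short_brown$ /productPage?id=122 [L]
```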

2. Avoid using parameters like 'sessionid' and 'id' if possible, as some spiders/bots just ignore them.

3. Try to limit the number of parameters and the length of the URL, as some spiders/bots put certain restrictions on these.

4. Every page should be reachable from at least one static text link.

E. Off Page Submissions

Off-page submissions include various submissions like blog submissions, forum submissions, article submissions, press release submissions, etc.

Off-page work is a very crucial component of SEO and can be called the left hand of SEO.

Generally these submissions help a web site improve its rankings, so they are very important from the SEO perspective.

F. Social Bookmarking and Link Building

Social bookmarking and link building also help a website improve its rankings. Both create backlinks to the website, which is crucial for SEO and for lifting page rank.

G. Meta Tags

Meta tags like title, keywords and description are among the most crucial components of a website. They are some of the first things the spiders grab, so the right title, keywords and description make it much easier for a website to be ranked.

Different search engines have different criteria for meta tags. Google takes 69 to 71 characters (with spaces) in the title and 160 characters for the description.
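These limits can be checked mechanically; the short Python sketch below uses the 69-71 and 160 character figures quoted above as assumed cut-offs (they are taken from this article, not from any official Google documentation):

```python
# Assumed limits, based on the figures quoted in this article.
TITLE_LIMIT = 70         # Google shows roughly 69-71 characters of a title
DESCRIPTION_LIMIT = 160  # and about 160 characters of a description

def check_meta_lengths(title, description):
    """Return a list of warnings for any tag text exceeding the assumed limits."""
    warnings = []
    if len(title) > TITLE_LIMIT:
        warnings.append(f"title too long: {len(title)} chars (limit {TITLE_LIMIT})")
    if len(description) > DESCRIPTION_LIMIT:
        warnings.append(f"description too long: {len(description)} chars (limit {DESCRIPTION_LIMIT})")
    return warnings

print(check_meta_lengths("Mens Brown Leather Shorts | MyExample Store",
                         "Mens brown leather shorts in all sizes."))  # []
```

Such a check can run as part of a build or publishing step, so over-length titles and descriptions are caught before the spiders see them.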

Similarly, other search engines have their own norms.