Saturday, October 2, 2010

How to Optimize robots.txt for SEO?


    All search engines, or at least all the important ones, now look for a robots.txt file as soon as their spiders or bots arrive on your site. So, even if you currently do not need to exclude the spiders from any part of your site, having a robots.txt file is still a good idea; it can act as a sort of invitation into your site.

What is a robots.txt file?

    A robots.txt file provides restrictions to search engine robots (known as “bots”) that crawl the web. Robots are used to find content to index in the search engine’s database.
    These bots are automated, and before they access any sections of a site, they check to see if a robots.txt file exists that prevents them from indexing certain pages.
    The robots.txt file is a simple text file (no HTML) that must be placed in your root directory, for example: http://www.yourdomain.com/robots.txt

Why use a robots.txt file?
    
    There are 3 primary reasons for using a robots.txt file on your website:

  1. Information you don’t want made public through search
    In situations where you have content on your website which you don’t want accessible via search, the robots.txt file will prevent search engines from including it in their index.
  2. Duplicate Content
    Often similar content is presented on a website under various URLs (e.g. the same blog post might appear under various categories). Duplicate content can incur penalties from search engines, which is bad from an SEO point of view. The robots.txt file can help you control which version of the content the search engines include in their index.
  3. Manage bandwidth usage
    Some websites have limited bandwidth allowances (based on their hosting packages). As robots use up bandwidth when crawling your site, in some instances you might want to stop certain user agents from indexing parts of your site to conserve bandwidth.
How to create a robots.txt file?
    The robots.txt file is just a simple text file. To create your own robots.txt file, open a new document in a simple text editor (e.g. notepad).
    The content of a robots.txt file consists of “records” which tell the specific search engine robots what to index and what not to access.
    Each of these records consists of two fields – the User-agent line (which specifies the robot to control) and one or more Disallow lines. Here’s an example:
User-agent: googlebot
Disallow: /admin/
    This example record allows “googlebot”, which is Google’s spider, to access every page on the site except files in the “admin” directory. All files in the “admin” directory will be ignored.
    If you want only specific pages not indexed, then you need to specify the exact file. For example:
User-agent: googlebot
Disallow: /admin/login.html
    Should you want your entire site and all its content to be indexed, then simply leave the disallow line blank.
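    For example, this record leaves the Disallow line blank, which tells Google’s spider that nothing on the site is off limits:

User-agent: googlebot
Disallow: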
     If you want all the search engine robots to have access to the same content, you can use a generic user-agent record, which will control all of them in the same way:

User-agent: *
Disallow: /admin/
Disallow: /comments/

How to find which User Agents to control?
    The first place to look for a list of the robots currently indexing your website is in your log files.
    For SEO purposes, you’ll generally want all search engines indexing the same content, so using “User-agent: *” is the best strategy.

If you want to get specific with your user agents, you can find a comprehensive list at http://www.user-agents.org/
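    As a sketch, the record below blocks one bandwidth-hungry robot from the entire site while leaving the generic rules in place for everyone else (the name “SomeBot” is only a placeholder for whatever user agent shows up in your log files):

# Block one specific robot completely
User-agent: SomeBot
Disallow: /

# All other robots follow the generic rules
User-agent: *
Disallow: /admin/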

    At Levonsys, we are constantly researching and innovating with new Search Engine Optimization techniques to ensure we are giving clients the best possible SEO services, and the best possible results. Our Search Engine Optimization goals may be ambitious, but we also make sure they are achievable by using tried and tested ethical Search Engine Optimization techniques. 

Levonsys

Email: sales@levonsys.com | Website: www.levonsys.com


Friday, October 1, 2010

The Importance of Sitemaps


    According to Sitemaps.org, sitemaps are an easy way for webmasters to inform search engines about pages on their sites that are available for crawling. In its simplest form, a Sitemap is an XML file that lists URLs for a site along with additional meta data about each URL (when it was last updated, how often it usually changes, and how important it is, relative to other URLs in the site) so that search engines can more intelligently crawl the site.

    Sitemaps serve two major functions:
  • To help users navigate your site
  • To help search engine crawlers navigate your website

    Moreover, having an updated sitemap on your website helps not only your visitors but the search engines too. Consider sitemaps a direct way to communicate with the search engines and tell them where you'd like them to crawl on your website.

Sitemaps and Search Engines

     Sitemaps have been around for many years, but it's only after the search engines adopted them that they caught the eye of webmasters as part of their search engine optimization strategy. If you want to use sitemaps purely for search engine optimization purposes, conventional HTML sitemaps are not enough. For example, Google Sitemaps should be written in XML, which is very different from the regular HTML sitemaps that are built for website visitors.

     So does this mean you need two sitemaps for your website? The answer is yes: one sitemap is for Googlebot, the Google spider, while the other is for your visitors. You should also be aware that having two sitemaps is not considered duplicate content; Google has made this explicitly clear, stating that having two sitemaps will not lead to any penalty for your website.
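     To give a feel for the format, here is a minimal sketch of an XML sitemap following the sitemaps.org protocol (the URL, date and values are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2010-09-30</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>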

Generating and Submitting Sitemaps To Search Engines

    Once your XML sitemap is ready, you need to upload it to your server and then notify Google, through Webmaster Tools, where your sitemap lives. Typically, there are two ways of generating a sitemap:
  1. You can install a sitemap generator script on your website  
  2. You can use an on-line sitemap generation tool to create one
         Though the first option is more difficult, it offers you more control over the final result. Google Sitemap Generator is a good tool for generating a sitemap. Since the tool is written in Python, you need Python 2.2 installed on your web server to run it.

         The second method of generating a sitemap, while easier, doesn't give you as much flexibility. Google also suggests some third-party sitemap tools but warns users that it has not tested or verified them. Having created a sitemap, it's time to upload it to your website and let Google, Yahoo and MSN know about it.
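         One simple way to let them all know, besides submitting through their webmaster tools, is to reference the sitemap from your robots.txt file, a directive the major engines support (the URL below is a placeholder):

# Point crawlers at the sitemap from robots.txt
Sitemap: http://www.example.com/sitemap.xml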

        Our team of SEO professionals at Levonsys performs marketing research, suggests the best search engine marketing channels such as SEO, SMO and Google sponsored listings (paid listings), and advises on how to make an SEO-friendly website for your online business.

    Levonsys

    Email: sales@levonsys.com | Website: www.levonsys.com

    Wednesday, September 29, 2010

    Why the 301 Redirect is a must?



    What is 301 redirect?

        A 301 redirect is the best method to preserve your current search engine rankings when redirecting web pages or an entire website. The code "301" is interpreted as "moved permanently". After the code, the URL of the missing or renamed page is noted, followed by a space and then the new location or file name.
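        As a sketch, assuming an Apache server where you can edit the .htaccess file, a single-page 301 redirect would be written like this (both file names are placeholders):

# Permanently redirect the old page to its new location
Redirect 301 /old-page.html http://www.example.com/new-page.html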


    Get Robots to the Right Address
        One of the fundamental elements in search engine optimization is making the indexation of a website as easy as possible. That is why we create the robots.txt file and the XML sitemap. I think the 301 redirect of the non-www version to the www version, or vice versa, serves the same purpose of creating ease for the search engines.
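        For example, assuming an Apache server with mod_rewrite enabled (example.com is a placeholder for your own domain), the non-www to www redirect can be added to .htaccess like this:

# Send all non-www requests to the www version with a 301
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]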

    Links to the Right Address 

        There are many who worry about not getting the full link benefit of redirected anchor text links. Doing the redirect helps would-be linkers know which address is the proper one to use when linking. Getting the right link from the beginning is still ideal and should prove to be more beneficial.

     

    301 redirects are particularly useful in the following circumstances:

    • You've moved your site to a new domain, and you want to make the transition as seamless as possible.

    • People access your site through several different URLs. If, for example, your home page can be reached in multiple ways - for instance, http://example.com/home, http://home.example.com, or http://www.example.com - it's a good idea to pick one of those URLs as your preferred (canonical) destination, and use 301 redirects to send traffic from the other URLs to your preferred URL. 

    • You're merging two websites and want to make sure that links to outdated URLs are redirected to the correct pages.

        At Levonsys we use cutting edge, targeted Search Engine Optimization and Search Engine Marketing techniques to maximize relevant traffic to your web site. Our SEO strategies are dynamic and designed to produce long-term results.

     

    Levonsys

    Email: sales@levonsys.com | Website: www.levonsys.com

     

    Tuesday, September 28, 2010

    Optimizing the Title Tag for SEO



    What is a Title Tag?

        A title tag is a piece of HTML code that describes the content of a specific web page and is matched against the keyword queries people type into a search engine. Title tags are a very important guide for all search engines in determining what the content of a specific web page is about. Creating a relevant title tag is one of the most important variables in achieving high search engine positioning.

        For most search engines, the maximum length of a title tag to be displayed is between 60-70 characters. If your title tag is over 70 characters, your title will be cut off around 70 characters on the search results page. Search engine spiders use these title tags as the main source for determining the page topic. Spiders or crawlers examine the title and then translate the topic of the page. This is one reason why it is always best to use your keywords in the page title, and to place them as close to the beginning of the title as possible. Remember, the text included in the title tag is also the text that will appear in the SERPs (search engine results pages) as the linked title on which users will click to access your page. In fact, just fixing the title tags of your pages can often generate quick and significant improvements in your rankings.
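        As a simple sketch (the keywords and brand name are placeholders), the title tag sits inside the head section of your HTML page, with the main keywords placed first:

<head>
  <!-- Roughly 60-70 characters, keywords first, brand last -->
  <title>Blue Widgets - Custom Widget Design | Example Company</title>
</head>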


        We hope you have found this information a helpful tip in building an effective search engine optimization marketing strategy. Feel free to utilize any other information within our website to help you better understand and maximize your search engine performance.

        For any successful online business, a significant presence in search engines like Google, Yahoo and MSN is a must. At Levonsys we optimize your website and increase its traffic, making it appear in the top positions in the results of the major search engines.

    Levonsys

    Monday, September 27, 2010

    Importance of Anchor Text in SEO



        Google heavily weights its Search Engine Results Pages (SERPs) towards the anchor text of links to a page. This can be demonstrated by looking at extreme examples where a page's high ranking can only be attributed to anchor text and no other SEO factor.

        Anchor text should reflect the content of the page the link points to and ideally help the page it is linked from as well. This means that if you can link highly related pages together, they will tend to do much better in their respective SERPs.

        Search engines give considerable weight to the anchor text on your web pages. Anchor text, in a nutshell, is the text attached to a link that points to your site. Savvy website or blog owners will write articles to distribute among the different article directories and thus secure lots of backlinks to their site; external anchor text optimization is accomplished through link exchanges, blogs and forums, and article directories, among others. Link building is an important part of building a strong website presence and is needed to rank highly within the result pages of popular search engines like Google.

    Creating an Anchor Text Link

        It is important to note that not all online services will let you post anchor text links; Twitter, for example, will not. However, you should be able to post an anchor text link on any standard web page. The following is the proper way to create an anchor text link:

    <a href="http://www.websiteurl.com/">Anchor Text</a>

        Using that simple string of code you can make a link appear as any word you’d like, thus boosting your search engine ranking for targeted keywords.

        An anchor text link in an article is often underlined and usually colored differently from the rest of the text. Anchor tag optimization can be deployed throughout your site to enhance the relevance of most of your web pages, and search engines follow anchor text to other parts of the web. If you submit an article to twenty different article sites, for example, with the same anchor text, it may look like you are trying to manipulate the search engines; this is where your keyword strategy meets your link building strategy. An additional variable worth mentioning when using hyperlinked text is the nofollow status of the linking website. Anchor text should give your readers valuable information about the content of the page you're linking to. The best use of anchor text is in external links and within your own website's pages. Writing free reprint articles is a great way to drive traffic to your sites and increase your business sales.
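        As a sketch, a nofollowed link is marked with a rel attribute (the URL is a placeholder); search engines will generally not pass link credit or anchor text value through it:

<a href="http://www.example.com/" rel="nofollow">Anchor Text</a>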

        Levonsys offers Complete Marketing Solutions for your online business. We offer services like professional SEO, Link Building Services, SEM, and Social Media Optimization.

    Levonsys