
Friday, February 10, 2012

15 Ways to Leverage Off Page SEO for Your Search Engine Rankings

One of my favorite sites for SEO is SEOMoz.org. I read it every day, there are always awesome tips to be shared, and yesterday was no different. I have been learning about the concept of off-page SEO. What is that, you ask? It is exactly what it sounds like: doing things off of a web page to improve your website's search results.

We took a larger list from SEOMoz.org, 21 Ways to Leverage Off-Page SEO, and found that 15 of those ways really work best. Our value is in trying things and failing miserably so you don't have to. Below is a list of things most people may be familiar with, but I have also added a few advanced items that you may not know. Try these advanced off-page SEO strategies to market your website, get ranked in search engines, and build an online reputation (branding) for your company/website so that you can survive in this competitive SEO world.

1). Community Creation in Social Networking Sites

Also known as online reputation management, this is the first and foremost step with which to initiate your process. Become a member of the most popular social networking sites like Orkut, Myspace, Facebook, LinkedIn, Ecademy, etc., and create a profile of your own. By doing this you can extend your network online, connect with your friends, share things with each other, and promote your company/website to build an online reputation. This is essentially Web 2.0 (the participatory web) in action, which means you have to show active participation on a regular basis.

2). Blogging

This is one of the most powerful ways to promote your company/website online. Write a blog of your own for your company/website and fill it with unique content. Be precise in what you're trying to convey to the users in your blog entries, and promote your blog in blog directories and blog search engines. You can also promote your blog/website by posting comments on other service-related blogs that allow links in the comments section which are crawlable by the search engines (these blogs are commonly identified as Do-Follow blogs). If you're not very good at writing content for blog posts, hire a guest blogger for your blog and ask him/her to write precise and unique content so that your blog can gain more credit from a search engine point of view.

3). Forum Postings

Create a forum/online discussion board of your own and start a discussion or share topics with your friends. You can also post/reply to a thread in other service-related pre-existing forums that allow links in your signature which can be crawled by the search engines (aka “Do-Follow Forums”).

4). Search Engine Submission

Submit your website to the most popular search engines like Google, Yahoo, MSN, Altavista, Alexa, Alltheweb, Lycos, Excite, etc., to get listed for free.

5). Directory Submission

Many people may say that directory submission is dead. As far as I'm concerned, it is still alive. Its value depends purely on how effectively we select the directories and how carefully we choose the category for submission. Of course, I agree that it gives quite delayed results, but it is worth doing. Submit your websites to the topmost quality directories like DMOZ, Yahoo Directory, ZoomInfo, One Mission, Pegasus, etc. Nowadays many web directories offer paid listings, but don't go for them.

6). Social Bookmarking

Social bookmarking is yet another powerful way of promoting your website, but nowadays most people spam social bookmarking sites without knowing how to use them. Since the content on these websites updates frequently, search engines like these types of sites and visit them often (this is commonly termed Tagsonomy & Folksonomy in Web 2.0). Do some social bookmarking on popular bookmarking sites like Digg, Delicious, StumbleUpon, Propeller, etc. You should be very careful while doing this, and you must handle the tags properly, as they are essential to broadcasting your news widely. This may increase your website traffic depending on how effectively you have participated.

7). Link Baiting

Suppose you have copied/published another website’s news or content in your blog/website. Don’t forget to place their website link as a reference. Do it for others and, if your content is trustworthy, let others do it for you. This is another way to increase your link popularity.

8). Cross-Linking

Link to internal pages within your site wherever necessary (this is commonly termed internal linking). This increases your internal link popularity, which is another major factor in Google's PageRank algorithm. The best-known example of successful internal linking is Wikipedia. Also try to get a contextual link from websites/blogs that are related to your site's theme. Try getting a link from within their site content using a targeted keyword as anchor text (much like Wikipedia does). This strategy can often be hard to implement, but these types of links carry more weight from a search engine point of view.
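As a quick illustration (the page URL and keyword here are hypothetical), an internal link that uses a targeted keyword as its anchor text looks like this:

```html
<!-- Hypothetical internal link: the href points to another page on the
     same site, and the anchor text is that page's target keyword -->
<p>Learn more in our guide to
  <a href="/guides/off-page-seo">off-page SEO strategies</a>.</p>
```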

9). Video Promotions

Just as you can share photos, you can publish/share your product videos, expert opinions, and reviews of your product and make them public on YouTube, Metacafe, Dailymotion, etc.

10). Business Reviews

Write reviews about other businesses, or ask your friends/clients to write a review of your business on major business review sites like RateitAll, Shvoong, Kaboodle, Stylefeeder, etc.

11). Local Listings & Yellow Pages

Instead of going global and facing huge competition, make your website local so that search engines can easily view your website and fetch the content. This will help you to reach a targeted audience. Submit your website to Google Local, Maps, Yahoo Local, Yellow Pages, Superpages, Hotfrog, etc.

12). Article Submission

Write articles of your own and submit them to popular article sites like Ezine, Go Articles, Now Public, Buzzle, etc. This will help you to attain some deep links for your website (though it’s usually a slower process).

13). Press Release Promotion

If you are a business/service provider then go for PR submission in popular PR websites like 1888pressrelease, Open PR, PR Leap, etc. This will help you to publish your site in Google News.

14). Answers

Participate in answer sites (Yahoo Answers, Cha-Cha, Answer Bag, etc.) by asking and answering relevant questions, placing a link to your website in the source section where appropriate. As long as you don't spam, this is another great way to increase your link popularity.

15). PPC Ad Campaign

When none of the above strategies work for you, go for a PPC ad campaign with your targeted keywords. Remember that you have to pay to drive more traffic towards your website through PPC.
--------------------------------------------------------------
Source:http://blog.networksolutions.com
--------------------------------------------------------------

What is Robots.txt


It is great when search engines frequently visit your site and index your content, but there are often cases when indexing parts of your online content is not what you want. For instance, if you have two versions of a page (one for viewing in the browser and one for printing), you'd rather have the printing version excluded from crawling; otherwise you risk incurring a duplicate content penalty. Also, if you have sensitive data on your site that you do not want the world to see, you will prefer that search engines not index those pages (although in this case the only sure way to keep sensitive data from being indexed is to keep it offline on a separate machine). Additionally, if you want to save some bandwidth by excluding images, stylesheets, and JavaScript from indexing, you need a way to tell spiders to keep away from these items.

One way to tell search engines which files and folders on your website to avoid is with the Robots metatag. But since not all search engines read metatags, the Robots metatag can simply go unnoticed. A better way to inform search engines of your wishes is to use a robots.txt file.
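For comparison, the Robots metatag goes in the head of each individual page and applies only to that page, for example:

```html
<!-- Placed in the <head> of a single page; asks robots not to index
     this page or follow its links -->
<meta name="robots" content="noindex, nofollow">
```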

What Is Robots.txt?

Robots.txt is a text (not HTML) file you put on your site to tell search robots which pages you would like them not to visit. Robots.txt is by no means mandatory for search engines, but generally search engines obey what they are asked not to do. It is important to clarify that robots.txt is not a way of preventing search engines from crawling your site (i.e. it is not a firewall or a kind of password protection); putting up a robots.txt file is something like putting a note “Please do not enter” on an unlocked door: you cannot prevent thieves from coming in, but the good guys will not open the door and enter. That is why we say that if you have really sensitive data, it is naïve to rely on robots.txt to protect it from being indexed and displayed in search results.

The location of robots.txt is very important. It must be in the main directory, because otherwise user agents (search engines) will not be able to find it: they do not search the whole site for a file named robots.txt. Instead, they look first in the main directory (i.e. http://mydomain.com/robots.txt) and if they don't find it there, they simply assume that the site does not have a robots.txt file, and therefore they index everything they find along the way. So if you don't put robots.txt in the right place, do not be surprised if search engines index your whole site.

The concept and structure of robots.txt were developed more than a decade ago. If you are interested in learning more about it, visit http://www.robotstxt.org/ or go straight to the Standard for Robot Exclusion, because in this article we will deal only with the most important aspects of a robots.txt file. Next we will continue with the structure of a robots.txt file.

Structure of a Robots.txt File

The structure of a robots.txt file is pretty simple (and barely flexible): it is a list of user agents and disallowed files and directories. Basically, the syntax is as follows:

User-agent:

Disallow:

“User-agent:” names the search engine crawler a record applies to, and “Disallow:” lists the files and directories to be excluded from indexing. In addition to “User-agent:” and “Disallow:” entries, you can include comment lines – just put the # sign at the beginning of the line:

# All user agents are disallowed to see the /temp directory.

User-agent: *

Disallow: /temp/
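Note that an empty Disallow: value means nothing is excluded, so the following file explicitly permits every crawler to index the whole site:

```
# All user agents may crawl the entire site.
User-agent: *
Disallow:
```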

The Traps of a Robots.txt File

When you start making complicated files – i.e. you decide to allow different user agents access to different directories – problems can start, if you do not pay special attention to the traps of a robots.txt file. Common mistakes include typos and contradicting directives. Typos are misspelled user-agents, directories, missing colons after User-agent and Disallow, etc. Typos can be tricky to find but in some cases validation tools help.

The more serious problem is with logical errors. For instance:

User-agent: *

Disallow: /temp/

User-agent: Googlebot

Disallow: /images/

Disallow: /temp/

Disallow: /cgi-bin/

The above robots.txt is meant to allow all agents to access everything on the site except the /temp directory, with more restrictive terms for Googlebot. The trap is that a crawler obeys only the single record that best matches its user agent, and the rules of different records are never combined: “User-agent: *” is a fallback for robots that do not match any named record. So Googlebot follows only its own record, and every other crawler only the “*” record. The logical error to watch for is forgetting to repeat a shared rule in the more specific record: if you left “Disallow: /temp/” out of the Googlebot record, Googlebot would crawl /temp/ even though the “*” record disallows it, because it never reads that record. You see, the structure of a robots.txt file is simple, but serious mistakes can still be made easily.
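One way to sanity-check how a given crawler will interpret a robots.txt file is Python's built-in urllib.robotparser module, which implements the robot exclusion standard. A minimal sketch, using the example file above (the “SomeBot” user agent is made up):

```python
from urllib import robotparser

ROBOTS_TXT = """\
User-agent: *
Disallow: /temp/

User-agent: Googlebot
Disallow: /images/
Disallow: /temp/
Disallow: /cgi-bin/
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Googlebot matches its own record, so all three directories are off limits.
print(parser.can_fetch("Googlebot", "/images/"))  # False
print(parser.can_fetch("Googlebot", "/temp/"))    # False

# Any other crawler falls back to the "*" record: only /temp/ is blocked.
print(parser.can_fetch("SomeBot", "/images/"))    # True
print(parser.can_fetch("SomeBot", "/temp/"))      # False
```

Running a quick check like this before uploading a robots.txt file catches both typos and the record-matching surprises described above.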
----------------------------------------------------------------------------------
Article Resource:http://www.webconfs.com/what-is-robots-txt-article-12.php
----------------------------------------------------------------------------------