I checked the link quality and I can say it's top quality; all of the comments I came across are well targeted to the blog niche as well. Apart from the very affordable and high quality blog commenting service Joe offered, I'm also very impressed with Joe's customer support. I made a mistake with the keyword and he helped me solve it within 4-6 hours. Strongly recommended for people who want link diversification. Note: My result for the keyword "article writing" has jumped from position 4 to position 1 during my last check thanks to you. This is a very competitive keyword for which I have combined the use of Joe's services. Link diversification is the secret ingredient to position 1.
- Winson Yeung
Just received my report, random checked about 30 and found my link with an *intelligent comment* on all. Very pleased with Joe's work. This was just a "test" order of 250 links to see what kind of job he would do and now placing 3000 link order for other sites.
- Gary Fritts
I ordered 500 links from these guys and within days of receiving my completed report several links were showing up in Yahoo site explorer. Meaning they got indexed in just a few short days. It took a bit longer than expected, however they completely made up for it by giving me extra links. Their customer service is awesome and I highly recommend them to anyone. I just ordered another 500 links for another site of mine. I will continue using them in the future. This is the first blog commenting service I've used that actually gets results in days, not months.
- Mandy
I have been a customer for a while, and I also have a recurring posting in place. With a particular domain, I couldn't get into the top 10 for a very key term (super competitive in my market). I used one of Joe's top packages (with PR page links) and within 30 days the site was in the top 10, where it has stayed for several months. I use him as my "backup" plan for competitive marketing for my clients.
- Paul
I am IMPRESSED!!!! I have just received over $670.00 worth of commissions this month from my website that was promoted by you last month. I am sure this was due to your SEO work because this site was sitting there for the last 2 years generating ZERO profits. I am glad that I decided to resurrect that website from the dead using your services.
Could not have made it without you. Thanks a bunch.
- Alejandro
Wanted to shout out that UltraSEOSolutions is top notch! I ordered 100 links and got 150, just in case some of them dropped out. It's been about a week and several backlinks have already been indexed and verified. The service is also very fast. They even mailed me first just to make sure I received everything well. I recommend this to everyone. Very trustworthy and quality service!! Will definitely be buying new links soon.
- David Bell 
Well, I have to say, amazing service. This company just keeps on delivering order after order and never ceases to amaze. I highly recommend this company for link building efforts. A thumbs up from me again.
- Sean Dyke 
I can see that lots of links got cached, which is pretty awesome! I have now climbed from page 8 to the bottom of page 1 for my medical website. I can see that my earnings have multiplied by 4 today, which is pretty exciting, as I guess this website will have so much potential once it gets updated with new products.
- Bill Keegan 
UltraSEOSolutions definitely know what they are doing here. I must say, I have tried lots of other providers that were actually reselling UltraSEOSolutions's services at double the price. I am glad I finally tracked you guys down; you are a life saviour. My profits have multiplied since the first week I implemented your services on my websites. I have SEO crossed off my list, and I feel right at home with the prompt service and friendly communication from Joe Fares.
- Mary K 
Keyword Research as Easy as 1-2-3

June 22, 2012 Bloggies by Administrator

Google's AdWords Keyword Tool is a common starting point for SEO keyword research. Not only does this free tool from Google suggest keywords, but it also provides estimated search volume for each keyword, which is essential for targeting the relevant organic traffic you are looking for to generate sales, conversions, leads or clicks.

1) To determine volume for a particular keyword, be sure to set the Match Type to [Exact] and look under Local Monthly Searches. Remember that these represent total searches.



2) Now enter some keywords within the "Word or Phrase" field then click Search.


3) That is not the end. You also need to check the competition you will be dealing with! While there are literally hundreds of tools out there, I will focus on the easiest free solution, just to give you a rough value. Do a search on Google for your keyword within quotation marks. The number of pages your search yields is a good estimate of the competition.
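One rough way to combine the search volume from step 1 with the quoted-result count from this step is a simple searches-per-competing-page ratio. The helper below is a hypothetical sketch, not a standard SEO metric; the numbers you feed it come from the tools above.

```python
def competition_ratio(monthly_searches, competing_pages):
    """Rough opportunity score: searches per competing page.

    Higher values suggest more demand relative to competition.
    """
    if competing_pages == 0:
        return float("inf")  # no competing pages at all
    return monthly_searches / competing_pages

# Example: 8,100 exact-match searches against 45,000 quoted results.
print(round(competition_ratio(8100, 45000), 3))
```

Compare ratios across candidate keywords rather than reading the absolute number; where to draw the cut-off is a judgment call.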

4) Found your perfect keyword(s)? You are almost ready to get started.

You will need to focus on some basic on-page SEO optimization by working these keywords into your website's content (specifically, the page you will be promoting for that keyword, whether it is your homepage or an internal page).

Make sure the keyword is included within your content at roughly a 2% ratio, and include the keyword in your page title and h1 tags.
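The ~2% ratio can be checked mechanically. The following is a minimal sketch in plain Python (no SEO tooling required); the tokenization is deliberately naive and is only meant to illustrate the calculation.

```python
import re

def keyword_density(text, keyword):
    """Percent of the words in `text` taken up by occurrences of `keyword`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    kw = keyword.lower().split()
    if not words or not kw:
        return 0.0
    # Count phrase occurrences by sliding a window over the word list.
    hits = sum(
        1 for i in range(len(words) - len(kw) + 1)
        if words[i:i + len(kw)] == kw
    )
    return 100.0 * hits * len(kw) / len(words)

# One occurrence of a one-word keyword in 50 words of copy -> 2%.
sample = "seo " + "filler " * 49
print(keyword_density(sample, "seo"))
```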

I tried to make this as short and easy to implement as possible, without requiring you to pay a dime for any SEO tools or software. While there are hundreds of finer points when it comes to keyword research, the above is more than enough to put you on the right track.

Good luck :)

How to Get Links from Wikipedia

January 05, 2012 Bloggies by Administrator

Wiki sites might not be as powerful as they once were, but they are still an outstanding way to gain valuable backlinks to your site.

In a web ruled by links it is essential that your domain has a range of topic related sources pointing back to your domain. One quick and relatively easy way to do this is to use Wiki sites to stream relevant traffic and build links (with appropriate anchor text).


What are Wiki Sites?

A Wiki is similar to a free encyclopaedia where you can create and edit pages on any given topic. Most Wikis allow communities to write documents collaboratively. The idea of a Wiki page is to involve users in an ongoing review of the content.


Top Sites

There is a whole range of Wiki sites floating around the web. If you're really lucky you may be able to find some industry-related Wikis to create pages on. Below is a list of tried and tested sites that can help build links back to your domain.



So how do you go about getting a link?


Become a Trusted Editor

If you’re going to make the smallest tweaks to any content you need to build up an account with a history of high quality edits. If you offer value to the community, you are more likely to get the privilege of linking to your own site.


Adding Links

Once you have built up an account with a reputable history you can look at adding some links (this doesn’t mean spamming a chosen category with tons of links!).

One of the best ways to get a link is to create a page on a topic related to your business. The page should include valuable, unique and high quality information. The most common way to include a link is behind a citation. For example, you may be writing about mountain bikes and mention a particular brand; this can then be linked to your appropriate page.


Internal Links

Another way to build up the quality of your backlinks is to implement a range of internal links from other appropriate Wiki pages. For example, if the page linking back to your site has five links from other Wiki pages, this will increase the authority of the source.


Are they worth the hassle?

Many Wiki links are nofollowed (not that this matters much, in my opinion); the links are still picked up in most analysis kits, including Google Webmaster Tools. As with most quality links, they have to be hard earned with good quality content.



Getting links from Wiki sites can be fairly straightforward if you provide valuable unique content. The more you can build up the trust of sites like Wikipedia the more you are likely to succeed. The other sites I mentioned are more lenient and are definitely worth incorporating into your link building strategy.

Source: How to get links from Wikipedia and other Wiki’s? by Oliver

2012 SEO Resolutions

January 01, 2012 Bloggies by Administrator

Your direction for 2012 is set (or not) by your ability to set clear, actionable goals. I hope I'm not bursting your SEO bubble when I tell you that "increasing conversions" is no more of a clear goal than promising yourself to "lose weight" in 2012. Clear goals not only identify the end result, they also break down the action steps you'll need to achieve it. It's for this reason that, if you're serious about bettering your website and your business in 2012, I recommend you start the year with a full SEO analysis to help you understand what's happening on your site, so that you know WHAT needs to change and where you'll need to invest in your site to make it happen. If you've never performed an SEO analysis on your site, make it a bright 2012, with all the best business endeavours, by focusing on SEO to kick these conversions into gear.

Promote the WWW version or the non-WWW version?

August 08, 2011 Bloggies by Administrator

During the past few months, I have been asked by quite a few clients:

"Do I build links to the WWW version or to the non-WWW version of my domain?"

It is EXTREMELY important to have a proper redirect between your WWW version and non-WWW version of your website to guarantee:

1) You are not losing any link juice.

2) Google does not see your website as TWO different versions. Yes, Google will pick one version and ignore the other, thus sending all your link building efforts on the un-indexed version to waste. Besides, you do not want to be facing duplicate content issues against your own website.

 To overcome this issue:

1) Log in to Google Webmaster Tools and choose the preferred version you would like to go with.

2) Set up a permanent 301 redirect via your .htaccess file (usually located in the ROOT folder of your hosting account).


Redirect www to non-www:

# example.com below is a placeholder - substitute your own domain
RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [L,R=301]

Redirect non-www to www:

# example.com below is a placeholder - substitute your own domain
RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [L,R=301]


Is there any preference when choosing the WWW over the non-WWW version, or vice versa?

Actually, there is no preference whatsoever: if you have set up a solid 301 redirect, you will get all the link juice whether the WWW or non-WWW version is promoted.


This might sound like a hassle for newbies, but the changes are extremely easy and only require a few minutes. 
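To see exactly what host canonicalisation does to a URL, here is the same rewrite expressed as a small Python function. This is an illustration only; the actual redirect is handled by the .htaccess rules, and example.com stands in for your own domain.

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url, prefer_www=False):
    """Rewrite `url` to the preferred host form, mirroring a 301 redirect."""
    parts = urlsplit(url)
    host = parts.netloc
    if prefer_www and not host.startswith("www."):
        host = "www." + host
    elif not prefer_www and host.startswith("www."):
        host = host[len("www."):]
    return urlunsplit((parts.scheme, host, parts.path, parts.query, parts.fragment))

# Both host forms collapse to a single canonical address.
print(canonical_url("http://www.example.com/page?ref=1"))         # non-WWW preferred
print(canonical_url("http://example.com/page", prefer_www=True))  # WWW preferred
```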

40 Most Important Factors in SEO

June 25, 2011 Bloggies by Administrator

Here is a list of the most influential factors in SEO, sorted by priority:

1 * HIGH * Title Tag - Description of Website
2 * HIGH * Domain Name
3 * HIGH * H1 tag or first headline of document content
4 * MID * Description Meta Tag
5 * MID * Keyword Meta Tag (Not emphasized by Google, but needed for Yahoo and Bing)
6 * MID * Body of Text Italic
7 * MID * Body Of Text Bolded
8 * MID * Body of text generic
9 * LOW * Keyword Density 5-20%
10 * HIGH * Latent Semantic Indexing - Related words on topic
11 * MID * Sub headlines H2, H3 etc
12 * MID * Phrase order within the page
13 * MID * Proximity of keywords to each other
14 * MID * Font size +2 for sub-topics
15 * MID * Keywords within your alt text of image descriptions
16 * HIGH * Keyword early within the page
17 * HIGH * Keyword in links to other pages (on or off site)
18 * HIGH * Quality of other sites you link to
19 * HIGH * Topic of other sites you link to
20 * HIGH * Tree like structure of navigation
21 * MID * Internal links valid?
22 * HIGH * Number of links on the page itself (less is better)
23 * MID * Domain names you link to (Gov is best, then .edu, .org etc..)
24 * LOW * Web page size (IE: under 100k)
25 * HIGH * Hyphens in domain or file names (more than 4 is bad)
26 * MID * Page freshness - more often updated pages preferred
27 * HIGH * Domain age
28 * MID * Age of the page itself
29 * MID * Sites with more internal pages (IE: over 100 internal pages)
30 * MID * Page Theme
31 * MID * Frequency of updates themselves
32 * HIGH * Interesting title tag - Gets more SERP clicks than another
33 * MID * Appropriate links between itself and other pages
34 * LOW * Having a robots.txt
35 * LOW * Stating a physical address (Trust)
36 * LOW * Stating a support email address
37 * LOW * Describing every image
38 * LOW * Naming the images themselves thematically
39 * MID * Keyword in name of page itself
40 * MID * Document within a related folder or subdomain

Google's PANDA Update Explained

May 26, 2011 Bloggies by Administrator

On Friday, Google officially wrote about its recent algorithm changes (commonly known as the Panda or Farmer update) for the first time, at least in terms of what it hopes to achieve and how webmasters can better align their sites with the search engine's latest goals and philosophies.

The following is taken directly from a post by Google maven Amit Singhal, as posted to the Google Webmaster Central blog last Friday morning:

 In recent months we’ve been especially focused on helping people find high-quality sites in Google’s search results. The “Panda” algorithm change has improved rankings for a large number of high-quality websites, so most of you reading have nothing to be concerned about. However, for the sites that may have been affected by Panda we wanted to provide additional guidance on how Google searches for high-quality sites.

Our advice for publishers continues to be to focus on delivering the best possible user experience on your websites and not to focus too much on what they think are Google’s current ranking algorithms or signals. Some publishers have fixated on our prior Panda algorithm change, but Panda was just one of roughly 500 search improvements we expect to roll out to search this year. In fact, since we launched Panda, we've rolled out over a dozen additional tweaks to our ranking algorithms, and some sites have incorrectly assumed that changes in their rankings were related to Panda. Search is a complicated and evolving art and science, so rather than focusing on specific algorithmic tweaks, we encourage you to focus on delivering the best possible experience for users.

What counts as a high-quality site?
Our site quality algorithms are aimed at helping people find "high-quality" sites by reducing the rankings of low-quality content. The recent "Panda" change tackles the difficult task of algorithmically assessing website quality. Taking a step back, we wanted to explain some of the ideas and research that drive the development of our algorithms.

Below are some questions that one could use to assess the "quality" of a page or an article. These are the kinds of questions we ask ourselves as we write algorithms that attempt to assess site quality. Think of it as our take at encoding what we think our users want.

Of course, we aren't disclosing the actual ranking signals used in our algorithms because we don't want folks to game our search results; but if you want to step into Google's mindset, the questions below provide some guidance on how we've been looking at the issue:

• Would you trust the information presented in this article?
• Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
• Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
• Would you be comfortable giving your credit card information to this site?
• Does this article have spelling, stylistic, or factual errors?
• Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
• Does the article provide original content or information, original reporting, original research, or original analysis?
• Does the page provide substantial value when compared to other pages in search results?
• How much quality control is done on content?
• Does the article describe both sides of a story?
• Is the site a recognized authority on its topic?
• Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care? 
• Was the article edited well, or does it appear sloppy or hastily produced?
• For a health related query, would you trust information from this site?
• Would you recognize this site as an authoritative source when mentioned by name?
• Does this article provide a complete or comprehensive description of the topic?
• Does this article contain insightful analysis or interesting information that is beyond obvious?
• Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
• Does this article have an excessive amount of ads that distract from or interfere with the main content?
• Would you expect to see this article in a printed magazine, encyclopedia or book?
• Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?
• Are the pages produced with great care and attention to detail vs. less attention to detail?
• Would users complain when they see pages from this site?

Writing an algorithm to assess page or site quality is a much harder task, but we hope the questions above give some insight into how we try to write algorithms that distinguish higher-quality sites from lower-quality sites.

What you can do
We've been hearing from many of you that you want more guidance on what you can do to improve your rankings on Google, particularly if you think you've been impacted by the Panda update. We encourage you to keep questions like the ones above in mind as you focus on developing high-quality content rather than trying to optimize for any particular Google algorithm.

One other specific piece of guidance we've offered is that low-quality content on some parts of a website can impact the whole site’s rankings, and thus removing low quality pages, merging or improving the content of individual shallow pages into more useful pages, or moving low quality pages to a different domain could eventually help the rankings of your higher-quality content.

We're continuing to work on additional algorithmic iterations to help webmasters operating high-quality sites get more traffic from search. As you continue to improve your sites, rather than focusing on one particular algorithmic tweak, we encourage you to ask yourself the same sorts of questions we ask when looking at the big picture. This way your site will be more likely to rank well for the long-term.

In the meantime, if you have feedback, please tell us through our Webmaster Forum. We continue to monitor threads on the forum and pass site info on to the search quality team as we work on future iterations of our ranking algorithms.

Google Algorithm Update Boosts Value of Quality Content

March 25, 2011 Bloggies by Administrator

The value of original, high quality web content continues to rise as Google makes new moves to decrease the visibility of low-quality websites. The search engine giant recently updated its algorithm to suppress the presence of link farms, which generate endless streams of poorly written, regurgitated articles. It’s all in a bid to cater to users who have been complaining about spammy sites appearing in top search results.

While Google reports it makes approximately 500 changes to its algorithm a year, this one’s significant, and SEO types, business owners and users will likely notice changes.

“This update is designed to reduce rankings for low-quality sites — sites which are low-value add for users, copy content from other websites or sites that are just not very useful,” Amit Singhal, a Google employee, and Matt Cutts, who leads Google’s anti-spam squad, wrote in a Google blog post. 

“At the same time, it will provide better rankings for high-quality sites — sites with original content and information such as research, in-depth reports, thoughtful analysis and so on.”

Black Hat SEO Versus White Hat SEO

The updated algorithm, which is initially being unleashed in the US, reinforces the need for marketers and businesses to avoid black hat SEO techniques — tactics that violate search engine guidelines and can get websites penalized or de-listed from search engines altogether.

Just days ago, The New York Times reported on how J.C. Penney was caught with thousands of unrelated websites in the retail industry linking to its site — a common black hat SEO maneuver. While the US-based retailer enjoyed a brief boost in online presence and sales, it's now in Google's bad books.

The short-term gains of black hat SEO do not make sense for a business seeking long-term success. Businesses are better off developing links from reputable websites and communities through quality web content, including articles and press releases, as well as videos and social media.

White hat SEO involves hard work and dedication, which can reap enduring results. Black hat SEO is risky business, with a high potential for unhappy endings. J.C. Penney learned this the hard way.

SEO Defined - What Google won't tell you!

March 24, 2011 Bloggies by Administrator

Did you ever wonder how Google indexes things, and how it evaluates websites?

Here are a few facts that are not available to the public:

How does Google scrape the web?

Let's divide Google's spiders (the bots that index the content of the web) into 4 main groups.

1) Backlink Spiders - these creatures are the most basic ones, and date from Google's initial creation. They are, in a way, highway builders for the other spiders. They are built like viruses: they will follow a link, scrape the page for more links, multiply themselves, and follow all the links from that page. They love to scrape code and backlinks, and they send information about the backlinks they find back to the server (if they don't see any value in a page they will scrape some random content instead). If they catch you using a dead link, they will give you a penalty, as you will have killed one of them. Google hates when you kill its spiders.
These spiders have a limited ability to reproduce themselves as well (limited by server capacity), which explains why some websites get most pages indexed in one pass, and some just 1-2 pages. (It is possible there are 2 spiders - one that scrapes for backlinks, and one for meta data; impossible to determine.)

2) Content Spiders - a creation of the first Google evolution. They scrape all website content. There is some indication that they can actually understand and recognize text: they will know if an article is spun, and whether an article is about something related to your keywords/website. They are "heavy" spiders and require an outstanding amount of resources. They use the Backlink Spider highways to move around the web. They love constantly changing content, but they are so heavy that they don't want to be forced to scrape the same content every time they enter a website. If you delete one of the links after the Backlink Spider has indexed you, but before the Content Spider gets to you, expect huge damage to your SEO: they are so heavy to run that Google will issue a penalty when you kill them.

3) Picture Spiders - another effect of the Google Images evolution. They love pictures; they can actually somehow see some of them (not details, but colours and patterns, and they will crudely recognize adult content) and read picture-related tags to rank them. They move on the highways and use Content Spider information to find pictures.

4) Video Spiders - another experiment from Google. These smart fellas do some amazing things. As of now they have no ability to see the video itself, so they use Content Spider information to find video files and scrape the thumbnail, tags, rating and comments to decide whether it is a good video. They were created in the YouTube era (why does Google need YouTube? Test, test, test).

Enough about spiders. Only Google knows how many more crazy creatures they have.

About Google Dance:

What is Google dance?

Simply put - every spider in the Google family has the power to evaluate a website, and based on that your website will go up or down.
Let's say, on a maximum of 1,000,000 points, a new website starts with 0 points.
After a Backlink Spider visit, it has counted that you built a lot of authority backlinks and have good on-site SEO. So this important spider gives you 257,654 points. You rank on page 2 after this.
Now the Content Spider. It found no valuable content - all copy-paste. The Content Spider does not like this, so it gives you -57,567 points. After a few days you drop to page 17 because of that.
Now other spiders come in and re-evaluate your website. Some give you +, some give you -. After 2 weeks of dancing around (you deleted some link, got some crappy backlinks, or Xrumer spam from a bad list? -12,867 from the Backlink Spider) you end up on page 3. Does this make sense now?

What is Google Sandbox/Penalty?

Whenever you get more minuses than pluses, you get hit by the Sandbox (for example, Google found out that you have a duplicate content farm pointing to your site; the Backlink Spider couldn't tell, but the Content Spider could). Penalty? If you get a sudden drop in ranking (for example from 300,000 points to 150,000 points), it means that in Google's eyes you messed up = penalty. Remember that Google works using a script. You can say: it was just 100,000 backlinks from a free Xrumer list, that's not much! Google says: SPAM.

Why is there no One Spider To Rule Them All? The simple answer is cost. They can't afford to have 1 million servers, so they divide the work between spiders so the servers can handle it (yes, Google uses regular servers too). Then all the information is sent back to a super network - an amazingly huge network forming one big computer (computer "42"?). This network calculates your position based on the score you get.

How do you determine if a website has authority?

Simple. Position in Google is determined by these factors:

1) Authority
2) Backlinks
3) Content
4) Other less important stuff.

What is authority?

It's simple:

2 + 3 + 4 = 1 (Backlinks + Content + Other stuff = Authority)

Is Page Rank any important anymore?

Yes and no. PageRank was the original way to determine authority. It no longer is, as it uses old, un-updated methods: it uses the simplest formula of 2 + 3 = 1 to determine its value, without including 4 (and only partially including the newer updates to 3), so it is highly inaccurate.

Is nofollow any good?

It is, but not as good as dofollow: a nofollow link will not be scraped by the Content Spider, so it brings less value, and it won't create a highway for the other spiders (they are not allowed to enter it).

What is hot these days?

800+ word articles submitted to a single directory (Ezine is still alive - just don't spam articles to 100 directories, as Google now sees this as SPAM!), high AUTHORITY backlinks, social backlinks, viral marketing, news feeds, and backlinks from closed websites (ones requiring payment, or difficult to get into).

Is Google smart?

The people behind it - extremely. I worked for Google (a few days only) and had the chance to see the head programmers; I am pretty sure they are all freaking millionaires. Google as a search engine - no. It is just a script, and every script has a loophole. I found 2 exploits last week alone. Just think outside the box: how to please all the spiders.
I worked for Symantec a long time (Norton Anti-virus) and I know, for example, that they hired CIA guys (some hacking stuff) and young hackers from Prodigy to work for them. Google surely does too, and they will be pissed after seeing this.

Perfect Website?

Around 10 links on each page. External links to authority sites that fall under the same category. A few videos, and 2+ pictures for each article. 80% unique content. Some big citations. Only natural-looking backlinks. Some backlinks pointing to pictures. 50% of inner pages have relevant backlinks pointing to them. A maximum of 40% forum backlinks, 30% blog backlinks, 20% profile backlinks. A minimum of 30% authority backlinks. Articles/content of various word counts, from 300+ all the way up to 1,500+. A minimum of 30-40 words of unique content on every page.

How accurate is this info?

5 years of "no bullshit" tests. I have never read any SEO ebook. In my life I have read only 2 ebooks (one about HYIP and one about Mastercard - a waste of time, BTW), so my mind is clear. No guru bullshit, no Google blog. Just raw information, filtered and exposed.

Why Google wont give this info away?

Would you shoot yourself in the foot? It is OBVIOUS they will keep all the juicy info secret and feed people with lies.


Disclaimer: Only Google knows the real story behind its algorithm. This data is just a raw account to help you understand what's going on. We can't have accurate numbers of "points", or spider names, etc. Also, sorry for all the typos - it's late here. Enjoy!

Thanks goes to ExtraWinner from BHW for this great post. 

How to make Google Love your Articles

March 16, 2011 Bloggies by Administrator

The fundamental element of SEO article writing is keywords. Internet surfers type these words or phrases into search engines and are redirected to corresponding articles containing those relevant words or phrases. This article, for instance, is chiefly concerned with SEO writing. It contains keywords and phrases such as “SEO Articles” and “SEO Content”. These are examples of the phrases surfers may type into a search engine to find this article and thus the corresponding website of its location.

To take full advantage of keywords, however, it must be understood that each web surfer searches differently. To find the content in this article, for example, a surfer may type keywords which are entirely different from those which are anticipated. This problem is capably solved by the use of keyword tools. There are many such tools available, such as the Google Adwords Keyword Tool or Overture. These tools generate alternate words/phrases and rank them by popularity, competition, etc. The Google Adwords Keyword Tool, for example, generates alternatives such as “SEO Keywords”, “Web Content” and “Website Content”. To make the most of this, the article must be kept succinct and centered on relevant alternatives. It should be noted, however, that each of these tools uses its own unique terms. As a general rule, keywords that are popular and attract low competition are the best choices.

Keyword density refers to the number of times a keyword is used in an article compared to the total number of words in the article. It is important that keywords are adequately repeated; however, they should not ‘saturate’ the article. Articles laden purely with keywords rather than quality content are written solely for the purpose of driving traffic to a website. This does not represent true SEO writing and should be avoided at all costs. In practice, this means the keyword density should be somewhere around 4%.
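Put as arithmetic: at the ~4% ceiling suggested here, a 500-word article should use its keyword at most about 20 times. A throwaway check along those lines (hypothetical helper, with the threshold taken from this paragraph):

```python
def is_stuffed(total_words, keyword_hits, max_density=4.0):
    """True when the keyword exceeds the density ceiling (percent of all words)."""
    if total_words == 0:
        return False
    return 100.0 * keyword_hits / total_words > max_density

print(is_stuffed(500, 10))  # 2% density - fine
print(is_stuffed(500, 30))  # 6% density - saturated
```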

It is extremely important that SEO articles are kept simple and follow a coherent, easily understood format. The normal and recommended structure is:

Introduction (one paragraph)
Body (two to four paragraphs)
Conclusion (one paragraph)

These elements should be easily identifiable and coherent. Furthermore, the normal word count for SEO articles is 300-600 words. If the article is shorter, the point may not be effectively conveyed; if it is longer, it risks boring the reader. If the techniques above are applied when writing articles for the web, the ‘traffic-pull’ of any article will increase markedly, and the corresponding website will have a much greater chance of being found on the web.
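The word-count and paragraph guidelines above can be checked mechanically; a small sketch using the ranges just given (the toy draft is illustrative):

```python
def article_length_ok(text: str, low: int = 300, high: int = 600) -> bool:
    """True when the draft falls inside the recommended word-count range."""
    return low <= len(text.split()) <= high

def structure_ok(text: str) -> bool:
    """Intro + two-to-four body paragraphs + conclusion: 4 to 6 paragraphs."""
    paragraphs = [p for p in text.split("\n\n") if p.strip()]
    return 4 <= len(paragraphs) <= 6

# The toy draft has the right paragraph shape but is far too short.
draft = "\n\n".join(["Intro", "Body one", "Body two", "Conclusion"])
print(structure_ok(draft), article_length_ok(draft))  # → True False
```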

Read More

Keyword Research Made Easy

March 15, 2011 Bloggies by Administrator

Keyword research is a vital task that many new webmasters ignore. Done properly, it is what differentiates a successful website from a poor one.
The keyword research process can be broken down into the following phases:

Phase 0 - Demolishing Misconceptions
Phase 1 - Creating the list and checking it twice
Phase 2 - Befriending the keyword research tool
Phase 3 - Finalizing your list
Phase 4 - Plan your Attack
Phase 5 - Rinse, Wash, Repeat

Phase 0 - Demolishing Misconceptions

Over the years, we've had the opportunity to work with a wide array of wonderful clients. And as different and diverse as their sites and the individuals running them may have been, many had one thing in common: they were self-proclaimed keyword research mavens right out of the gate.

Or so they thought.

One of the most common misconceptions about conducting keyword research for a search engine optimization campaign is the belief that you already know which terms a customer would use to find your site. You don't. Not without first doing some research anyway. You may know what your site is about and how you, the site owner, would find it, but it's difficult to predict how a paying customer would go about looking for it.

This is due to site owners evaluating their site through too narrow of a lens, causing them to come up with words that read like industry jargon, not viable keywords. Remember, your customer probably doesn't work in the same industry that you do. If they did, they wouldn't need you. When describing your site or product, break away from industry speak. Your customers aren't searching that way and if you center your site on these terms, they'll never find you.

Another misconception is that generic or "big dollar" terms are the most important for rankings, even if the term you're going after has nothing to do with your site. Imagine a women's clothing store trying to rank for the term "google". Sure, thousands of searchers probably type that word into their search bar daily, but they're not doing it looking for you. They're looking for Google. Being ranked number one for a term no one would associate with your site is a waste of time and money (and it may get you in trouble!). Your site may see a lot of traffic, but customers won't stick around.

Phase 1 - Creating the list and checking it twice

The initial idea of keyword research can be daunting. Trying to come up with the perfect combination of words to drive customers to your site, rev up your conversion rate and allow the engines to see you as an expert would easily give anyone a tension headache.

The trick is to start slowly.

The first step in this process is to create a list of potential keywords. Brainstorm all the words you think a customer would type into their search box when trying to find you. This includes phrases that are broad and targeted, buying- and research-oriented, single-word and multi-word. What is your site hoping to do or promote? Come up with enough words to cover all the services your site offers, but avoid overly generic terms like 'shoes' or 'clothes'. These words are incredibly difficult to rank for and won't drive qualified traffic to your site. Focus on words that are relevant but not overused.
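One quick way to grow a seed list past generic single terms is to cross base terms with qualifiers and buying-intent modifiers; a sketch using made-up cowboy-boot terms (the word lists are illustrative assumptions, not output from any tool):

```python
from itertools import product

bases = ["cowboy boots"]
qualifiers = ["men's", "women's", "suede", "toddler"]
intents = ["buy", "best", ""]  # "" keeps the plain qualified phrase

# Build every intent + qualifier + base combination, skipping empty parts.
candidates = sorted(
    " ".join(part for part in (intent, qualifier, base) if part)
    for base, qualifier, intent in product(bases, qualifiers, intents)
)
for phrase in candidates:
    print(phrase)
```

The resulting phrases are raw candidates only; each still has to be vetted against a keyword tool in the next phase.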

If you need help brainstorming ideas, ask friends, colleagues or past customers for help. Sometimes they are able to see your site differently than you yourself see it. Also, don't be afraid to do some competitive research: what words are your competitors targeting? How can you expand on their keyword list to make yours better? It's okay to get a little sneaky here. All's fair in love and search engine rankings.

Phase 2 - Befriending the keyword research tool

Now that you have your list, your next step is to determine the activity for each of your proposed keywords. You want to narrow your list to only include highly attainable, sought-after phrases that will bring the most qualified traffic to your site.

In the early days of SEO, measuring the "popularity" of your search terms meant performing a search for the phrase in one of the various engines and seeing how many results it turned up. As you can imagine, this was a tedious and ineffective method of keyword research. Luckily, times have changed and we now have tools to do the hard part for us.

By inputting your proposed keywords into a keyword research tool, you can quickly learn how many users are conducting searches for that term every day, how many of those searches actually converted, and other important analytical information. It may also tune you in to words you had previously forgotten or synonyms you weren't aware of.

There are lots of great tools out there to help you determine how much activity your keywords are receiving:

Wordtracker: Wordtracker lets you look up popular keyword phrases to determine their activity and popularity among competitors. Their top 1000 report lists the most frequently searched-for terms, while their Competition Search option provides valuable information for determining the competitiveness of each phrase. This is very useful for figuring out how difficult it will be to rank for a given term. It may also highlight hidden gems with low competition but high relevancy.

WordStream: WordStream offers a suite of keyword research tools for use in pay-per-click marketing and search engine optimization initiatives. They also provide powerful fee-based tools to help you organize your keywords and increase your profitability.

Google AdWords Keyword Tool: A free tool that should be part of everyone's arsenal.

Google Suggest: Google Suggest is a great way to find synonyms and related word suggestions that may help you expand your original list. Just start typing your search term and you'll see a drop down list of related terms. Again, another way to locate synonyms you may have forgotten.
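Those suggestions can also be collected programmatically: the unofficial suggest endpoint returns JSON shaped like `[query, [suggestions...]]`. A hedged sketch that only parses such a payload (the sample response is made up, and the endpoint is undocumented and may change):

```python
import json

def parse_suggestions(payload: str) -> list[str]:
    """Extract the suggestion list from a Google-Suggest-style JSON response."""
    query, suggestions, *rest = json.loads(payload)
    return suggestions

# e.g. urllib.request.urlopen(
#     "https://suggestqueries.google.com/complete/search?client=firefox&q=cowboy+boots"
# ).read() yields a payload of this shape (unofficial, subject to change).
sample = '["cowboy boots", ["cowboy boots for men", "cowboy boots women"]]'
print(parse_suggestions(sample))
```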

Keep in mind that you're not only checking to see if enough people are searching for a particular word, you're also trying to determine how competitive that phrase is in terms of rankings.

Understanding the competition tells you how much effort you will need to invest in order to rank well for that term. There are two things to consider when making this decision: how many other sites are competing for the same word, and how strong those sites' rankings are (i.e. how many other sites link to them, and how many pages they have indexed). Basically, is that word or phrase even worth your time? If it's not, move on.

While you're testing your new terms, you may want to do a little housekeeping and test the activity for keywords your site is already targeting. Keep the ones that are converting and drop the losers.

Phase 3 - Finalizing your list

Now that you have your initial list of words and have tested their activity, it's time to narrow down the field and decide which terms will make it into your coveted final keyword list.

We recommend creating a spreadsheet or some other visual that will allow you to easily see each word's conversion rate, search volume and competition rate (as given to you by the tools mentioned above). These three figures will allow you to calculate how viable that term is for your site and will be a great aid as you try and narrow down your focus.
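Such a spreadsheet can also be scripted; a minimal sketch with a toy scoring formula (the numbers and the weighting are illustrative assumptions, not a standard metric):

```python
# Each row: (keyword, monthly searches, conversion rate, competition 0-1).
rows = [
    ("cowboy boots", 40000, 0.01, 0.95),
    ("blue suede cowboy boots", 600, 0.06, 0.2),
    ("extra-wide women's cowboy boots", 350, 0.08, 0.1),
]

def viability(volume: int, conversion: float, competition: float) -> float:
    """Toy score: expected conversions, discounted by competition."""
    return volume * conversion * (1.0 - competition)

ranked = sorted(rows, key=lambda r: viability(*r[1:]), reverse=True)
for keyword, *metrics in ranked:
    print(f"{keyword}: {viability(*metrics):.1f}")
```

With these made-up figures the targeted phrases outscore the broad one, which mirrors the point made below: high-volume broad terms lose much of their value once competition and conversion are factored in.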

The first step in narrowing down your list is to go through and highlight the terms that most closely target the subject and theme of your web site. These are the terms you want to hold on to. Kill all words that are not relevant to your site or that you don't have sufficient content to support (unless you're willing to write some). You can't optimize for words that you don't have content for.

Create a mix of both broad and targeted keywords. You'll need both to rank well. Broad terms are important because they describe what your web site does; however, they won't increase the level of qualified traffic coming into your site.

For example, say you are a company that specializes in cowboy boots. It may be natural for your site to focus on the broad search terms "boots" and "cowboy boots". These words are important because they tell the search engines what you do and may increase your visitors, but the traffic you receive will be largely unqualified. Customers will arrive on your site still unsure of what kind of boots you sell. Do you offer traditional cowboy boots, stiletto cowboy boots, toddler cowboy boots, suede cowboy boots or women's cowboy boots? By only targeting broad terms, customers won't know what you offer until they land on your site.

Targeted terms are often easier to rank for and help bring qualified traffic. They also make you a subject matter expert to the search engines, since the targeted terms strengthen the theme created with the broader phrases. Sticking with our example, targeted terms for your cowboy boots site may be "men's cowboy boots", "blue suede cowboy boots", "extra-wide women's cowboy boots", etc. Broad search terms may bring you the higher levels of traffic, but it's targeted, buying-oriented terms like these that will maximize conversions.

Phase 4 - Plan your attack

So you made your list of about 10-20 highly focused keywords, now what do you do with them? You prepare them for launch!

Chances are, if you did your keyword research right, at least some of the words on your list already appear in your site content, but some of them may not. Start thinking about how many pages you'll need to create to support these new words, and how and where your keyword phrases will be used.

We typically recommend only going after three or four related keywords per page (five if you can balance them properly). Any more than that and you run the risk of diluting your page to the point where you rank for nothing. Make sure to naturally work the keywords into your content and avoid over-repetition that may be interpreted as spamming. Your content should never sound forced.

Your on-page content isn't the only place where you can insert keywords. Keywords should also be used in several other elements on your site:

Title Tag
Meta Description Tags
Alt text
Anchor Text/ Navigational Links

You've spent a lot of time molding your keywords; make sure you use them in all the appropriate fields to get the maximum benefit.
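Whether a keyword actually reaches those fields can be verified mechanically; a rough sketch using Python's standard `html.parser` (the sample page is invented):

```python
from html.parser import HTMLParser

class KeywordFieldChecker(HTMLParser):
    """Collects the on-page fields where keywords should appear."""
    def __init__(self):
        super().__init__()
        self.fields = {"title": "", "meta_description": "", "alt": "", "anchor": ""}
        self._in = None  # which text-bearing field we are currently inside

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in = "title"
        elif tag == "a":
            self._in = "anchor"
        elif tag == "meta" and attrs.get("name") == "description":
            self.fields["meta_description"] += attrs.get("content", "")
        elif tag == "img":
            self.fields["alt"] += attrs.get("alt", "") + " "

    def handle_endtag(self, tag):
        if tag in ("title", "a"):
            self._in = None

    def handle_data(self, data):
        if self._in:
            self.fields[self._in] += data

page = """<html><head><title>Cowboy Boots Store</title>
<meta name="description" content="Men's and women's cowboy boots"></head>
<body><img src="b.jpg" alt="suede cowboy boots">
<a href="/womens">women's cowboy boots</a></body></html>"""

checker = KeywordFieldChecker()
checker.feed(page)
for field, text in checker.fields.items():
    print(field, "cowboy boots" in text.lower())
```

A report like this makes it obvious at a glance which of the four fields still lacks the keyword.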

Phase 5 - Rinse, Wash, Repeat.

Congratulations. Your initial keyword research process is behind you. You've created your list, checked it twice, made friends with the keyword research tools and are now off to go plan your attack. You're done, right?

Unfortunately, no. As your customers' and your site's needs change over time, so will your keywords. It's important to keep monitoring your keywords and to make tweaks as necessary. Doing so will allow you to stay ahead of your competition and keep moving forward.

Read More
