Understanding Keyword Density In Web Marketing
Professional Webmasters Community :: Webmaster Speak :: General Marketing Forum :: Search Engine Optimization
Page 1 of 1
Keyword density refers to how often a keyword appears in an article relative to its total word count. When the Internet first became popular, it was fairly easy to get a high search ranking: webmasters simply stuffed paragraphs with selected keywords or hid invisible keywords on the home page. As the Internet grew, however, search engines became stricter. Those old methods are now blacklisted as “black hat” SEO, and using them will quickly disqualify a site from higher rankings.
The formula for keyword density is simple: divide the number of keyword occurrences by the total word count, then multiply by 100. A 300-word article that uses its keyword three times therefore has a keyword density of 1%. Some websites still attempt to keyword stuff their articles, pushing densities of 5-7%, and almost everyone agrees anything beyond 10% is just asking for banishment.
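As a rough sketch, the calculation above can be written as a small Python function. This only handles single-word keywords and uses a simple tokenizer; real SEO tools also count multi-word phrases and strip markup, so treat the details here as illustrative assumptions:

```python
import re

def keyword_density(text, keyword):
    """Return the keyword's density as a percentage of total words.

    Density = (keyword occurrences / total word count) * 100.
    """
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

# Example: a 300-word article containing the keyword "seo" three times
article = ("seo " + "filler " * 99) * 3  # 300 words total, 3 of them "seo"
print(round(keyword_density(article, "seo"), 2))  # -> 1.0, i.e. 1% density
```

Running this on the 300-word example reproduces the 1% figure from the article; the same function would flag a stuffed page, since a density above the 5-7% range mentioned above is easy to check programmatically.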
Opinions differ on how high is too high. Some companies say 4%-7% is acceptable for a general SEO article, while others hold 1%-3% as the better standard. Linguistically speaking, of course, repetition is only as good as the point the article is making. An educated reader can easily tell the difference between an article that uses repetition for emphasis and coherent connectives, and one that simply stuffs keywords into every sentence.
In the past, this was said to be the major problem: how could search robots tell the difference between legitimate repetition and keyword stuffing? The answer was simple. Major search engines employed human editors as well as search robots (who worked for scrap metal and ethanol) to prevent such injustices from taking place.
The newest development in keyword technology is the “Google Panda” update, an improvement to the company’s search algorithm. With it, the company is attempting to filter out websites and web pages that publish low-quality, over-duplicated, or just plain useless articles.
This new technology is certainly putting pressure on online directories, websites, and content producers to publish only their best work. It is undoubtedly an attempt by Google to raise the standard of Internet writing, bringing it a level nearer to print magazines and newspapers.
Now more than ever, the quality of the writing is what gets a company noticed in search rankings. Many websites with low keyword densities are now being featured in top-10 results by using effective website optimization strategies. While keyword density is still important, search engines (aided by human editors and enhanced search spider technology) are now less concerned with the number of times a word is used than with the way it is used in sentence structure. Rules and technology continue to change, but quality writing is always the goal.