Using too many keywords is not recommended, basically because:
- It is easily detectable, and you will sink in the SERPs.
- Over-optimized texts are unpleasant to read; you can kill your site's conversion rate, or even the CTR of your search results (meaning the public never even reaches your domain).
I refer you to it: the answer to your question is on page 15 of the PDF.
Search engines (notably Google) have had, for years, algorithms capable of detecting linguistic patterns, to which we must add an increasingly good cross-language capability.
With this, detecting the abuse of words that, statistically (and in natural language), do not occupy their usual share of a text is easy (and I mean easy for Google!).
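The idea of flagging words that statistically exceed their natural share can be sketched in a few lines of Python. This is a toy illustration only: the reference frequencies and the fallback rate below are invented for the example, and real search engines use far more sophisticated models.

```python
# Hypothetical sketch: flag terms whose observed frequency far exceeds
# their expected share in natural language. Reference values are
# illustrative assumptions, not real corpus statistics.
REFERENCE_FREQ = {"the": 0.06, "seo": 0.0001}  # expected fraction of all words

def overused_terms(word_counts: dict[str, int],
                   total_words: int,
                   ratio_threshold: float = 10.0) -> list[str]:
    """Return words whose observed rate exceeds the expected rate
    by at least `ratio_threshold` times."""
    flagged = []
    for word, count in word_counts.items():
        expected = REFERENCE_FREQ.get(word, 0.0005)  # fallback rate (assumption)
        observed = count / total_words
        if observed / expected >= ratio_threshold:
            flagged.append(word)
    return flagged

# "seo" appearing 20 times in 500 words (4%) is wildly above its
# expected ~0.01%, so it gets flagged; "the" at 6% is normal.
print(overused_terms({"seo": 20, "the": 30}, 500))  # → ['seo']
```

The point is only that an abnormal ratio between observed and expected frequency is trivial to compute once you have large-scale language statistics, which Google obviously does.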
There are specific algorithms for detecting poor-quality content, each specializing in particular issues, such as:
- Mayday: duplicate content (pre-2010).
- Panda: targets poor content in all its varieties; one of the most generic forms is the over-optimization of texts.
- Penguin: it has gone through many revisions until becoming part of the core of GoogleBrain, and it now works in real time at the page level, not at the domain level as before (and it was precisely this that made it so dangerous); it is responsible for catching link fraud, i.e. unnatural links used to boost websites at the webmaster's discretion. I mention it because one of the favorite places to stuff keywords aggressively is the anchor text of links.
One of the most persecuted and typical examples, given its great boom between 2009 and 2012, was ghost blogging, so called because the tactic in vogue was to create a network of themed satellite blogs that formed the basis of a link-building strategy to support the business site of the company itself or of SEO clients. Obviously, the main expense was the content, which usually consisted of spun articles and semi-plagiarism.
(Spinning: taking an original text and multiplying it into tens or hundreds of variants by substituting verbs, nouns, prepositions, tenses, and synonyms, and by changing singular/plural or gender.)
One of the main factors used to boost the SEO potential of these texts (which we assume to be poor) was working with a high keyword density, that is, the ratio (expressed as a %) between the number of occurrences of a given keyword and the total word count of the article.
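That ratio can be computed in a few lines of Python. This is a minimal sketch with naive word tokenization; the function name is mine, not from any SEO tool.

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Occurrences of `keyword` as a percentage of total words.

    Uses a naive tokenizer (any run of word characters counts as a word);
    real SEO tools handle phrases, stemming, and markup.
    """
    words = re.findall(r"\w+", text.lower())
    if not words:
        return 0.0
    count = words.count(keyword.lower())
    return 100.0 * count / len(words)

# 3 occurrences of "seo" out of 7 words ≈ 42.9% — an absurd,
# obviously spammy density.
print(keyword_density("SEO tips for SEO beginners learning SEO", "seo"))
```

Densities this high are exactly the kind of statistical anomaly described above; the stuffed texts of that era routinely pushed the ratio far beyond anything natural prose produces.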
Today, the concept of "keyword density" and its derivatives is seen as somewhat obsolete in its strict sense, given the semantic context in which web content is now processed.
It is no longer indispensable to repeat a specific word (or phrase) a certain number of times in certain places in the text.
Now we use semantically close concepts: words that mean the same as the specific keyword that interests us, or that complement it.
Of course, we still need the concept of the keyword, but not in terms of abuse and excess; rather, as a way to focus our landing pages and other content.
This question originally appeared on Quora, a place to acquire and share knowledge, empowering people to learn from each other and better understand the world.
The opinions expressed are the sole responsibility of their authors and are entirely independent of the position and editorial line of Forbes Mexico.