Why It’s Better if Your Website Ranks for Fewer Keywords
December 4, 2017 · 1,026 words
Wait, you’re telling me that I should try to rank for fewer keywords?
Isn’t that SEO blasphemy?
Maybe, but hang with me for a minute.
I propose that if your website ranks for fewer keywords, you will see an increase not only in your conversion rates, but your click-through rates and website traffic from organic results as well.
Think I can support my reasoning?
Read on …
Ranking for fewer keywords will improve how the user interacts with your website
Now, let me make one thing clear that will help with the rest of this post. When I say "fewer keywords" I simply mean having a strategy built around, say, 5-10 of the highest-quality keywords relevant to your website, rather than trying to rank for as many keywords as possible.
If you check your Google Search Console account on a regular basis, it can be tempting to review the Search Analytics screen and get excited about this number growing …
This particular site currently ranks for 999 keywords and key phrases.
However, upon further review, I can see that the keywords toward the end of the list are largely irrelevant to the site's topic and ranked so low that they hardly matter:
In fact, the rankings for these words are unlikely to improve much either.
Now, let’s focus on the user’s interactivity with your site, or what I like to refer to as the User Attention Rate.
If someone ventures to your website from one of these many keywords located towards the end of your list, what are the chances that they are going to find what they are looking for?
For example, if you have a quality site that Google has recognized, you could rank in the couple-hundreds for a word such as "basketball" just because you mentioned it once or twice on a page, even though your website is about a candy store.
Now, I’d bet that if someone actually clicked on that search result for basketball and visited your site, they would quickly leave.
Imagine how many times this could happen if you rank for 999+ keywords.
Your bounce rate would skyrocket and your time on page would plummet. Not a good combination for a strong User Attention Rate.
This is why I suggest (URGE) that you focus on a handful of keywords (depending on your business and content) and don’t get caught up in how many keywords you rank for altogether.
Ranking for fewer keywords will keep your webpages from competing and hurting each other
Let’s stick with the same case study.
After I discovered that this particular website ranked for 999 keywords, I continued my audit and checked for duplicate content (siteliner.com).
I was not surprised with what I found …
Out of fewer than 100 pages, 58% of the content was duplicate.
In other words, less than half of the content on this website was unique.
What’s the primary culprit? No optimization of the robots.txt file.
And by optimization I mean blocking search engines from crawling certain pages.
Now, these two things go hand in hand: not blocking highly duplicated pages, and ranking for many unrelated keywords.
Naturally, we want to rank for every word under the sun, and if we block a page that just means it’s one less page that we can be found for!
However, this is what I propose …
Keep those duplicated pages if they improve your user experience, but block search engines from finding them. They are only dragging you down!
In the case study, I blocked nearly 30 pages from search engines and lowered the duplicate content from 58% to less than 5%.
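As a rough sketch of what that blocking looks like, here is a robots.txt file placed at the site root. The paths shown are hypothetical examples, not the ones from the case study; substitute the duplicated sections of your own site:

```
# robots.txt — lives at the root of your domain, e.g. https://example.com/robots.txt

# Rules apply to all crawlers
User-agent: *

# Hypothetical duplicated sections — replace with your own paths
Disallow: /tag/
Disallow: /print/
Disallow: /archive/

# Everything not listed above remains crawlable
```

One caveat worth knowing: a Disallow rule stops search engines from crawling a page, but a page that is linked from elsewhere can still occasionally show up in results. For pages that must never appear in search at all, a noindex meta tag on the page itself (left crawlable so the tag can be seen) is the more direct tool.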
Ranking for fewer keywords will help you rank higher for the best keywords
And this is the result …
Your website will, in turn, be more focused on the important keywords and begin to rank higher for these words.
Your highly-similar pages won’t compete.
Users won't bounce off your website as frequently.
Google will understand better what your website is all about.
You will also have a clearer strategy which will help you in future optimization (which personally, I find to be very helpful).
Try it out. See where you stand. Go to your Google Search Console to see how many keywords you rank for and then check out your duplicate content percentage on Siteliner.
If they are both high, chances are that you might need to do some website cleaning.
Start in your robots.txt file.
I hope you enjoyed this post and found something useful to take with you. It was shorter than my other posts, but I trust it will give you something to work on.