All WordPress users are accustomed to the various SEO plugins like Yoast or All in One SEO Pack. They offer more or less the same set of functions that are supposedly meant to help users optimize their pages and blog posts for Google search. In most cases you'll see such on-page SEO recommendations as increasing or lowering the keyword density in your text, using your target keyword in the H1 and H2 titles, placing the main keyword at the beginning of the H1 title, etc.
But are all those manipulations still essential for your page to get higher in Google SERPs?
Does this machine-suggested optimization still work?
Recent studies show that there’s no strong correlation between, say, the number of exact-match keywords (or their presence in the first paragraph of the copy) and the position of the page in the SERP. This means you can’t be sure you’ll get high rankings even if you set up all these factors perfectly.
Just take a look at these top search results for the keyword “blogging tips”:
You must have noticed that the No. 1 result for this keyword doesn’t have its exact match in the title or the URL. Moreover, the entire copy doesn’t contain the keyword “blogging tips” at all. But Google somehow managed to understand the overall topic of the article and rated it above all the other results containing the exact-match keyword (including the website with the domain name “bloggingtips.com”).
Clearly, in 2017 Google doesn’t consider exact-match keywords a strong ranking factor. With the introduction of the Hummingbird algorithm, the focus shifted from technical optimization to the overall relevance of the topic to users’ search queries and their trust in the resource.
Trust is the basis of a strong relationship between Google (and thus users) and websites. It comprises many signals: domain authority, backlinks to the page, content relevance, etc. In this light, it doesn’t matter how many times you’ve used your keyword or whether you placed it at the very beginning of the title.
Let’s check out which of the factors we used to rely on heavily (and verify with SEO plugins) while creating content no longer seem to have a high impact on page ranking.
Traditionally, SEO plugins require you to properly optimize the page title and H1, write a META description, and place the keyword at the beginning of the title and within the first 100 words of the text. It is also recommended to put the keyword in the ALT tags of images and in more than one subheading.
Keyword in META-description
Let’s take a look at the Top-10 SERP results for the “blogging tips” keyword. Since Google often shows other text in snippets instead of the META description (e.g. the first 200 characters of the post itself or a piece of text where the keyword is mentioned), I manually checked the source code of each of the Top-10 results for their actual META descriptions.
Out of 10 results, only 4 pages have a META description with the exact-match keyword in it. The other 6, including the No. 1 result, either have no description at all or include semantically close keywords, e.g. “blogging advice.”
Keyword at the beginning of a title
Well, check out the screenshot below. I tried the keyword “write a guest post” for this search query (omitting Google Ads and Featured Snippet results). I think it’s obvious from what you see there that the placement of the keyword at the beginning of a title has nothing to do with the page’s position in the SERP.
Keyword in the first paragraph
I checked the first results on the “write a guest post” results page. The Top-5 pages (above the fold) didn’t contain the keyword in the first 100 words. Moreover, all the Top-5 results except No. 2 didn’t contain the exact-match keyword at all.
Keyword in the subheadings (H2)
Again, I scanned all the Top-10 SERP results for the “write a guest post” keyword. And guess how many of them included this keyword in at least one subheading? None! Not even the very top post contained it. It seems this practice is long gone, too.
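The manual checks described above (keyword in the title, META description, first 100 words, and H2 subheadings) are easy to automate. Here is a minimal sketch using Python’s standard-library `html.parser`; the `OnPageAudit` and `keyword_report` names are hypothetical, invented for this illustration, and real pages would need more robust parsing:

```python
from html.parser import HTMLParser

VOID_TAGS = {"meta", "link", "img", "br", "hr", "input"}

class OnPageAudit(HTMLParser):
    """Collects the title, META description, H2 headings and body
    text of a page so keyword placement can be checked."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = ""
        self.h2s = []
        self.body_text = []
        self._stack = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            if a.get("name", "").lower() == "description":
                self.meta_description = a.get("content", "")
        if tag not in VOID_TAGS:  # void tags never get a closing tag
            self._stack.append(tag)

    def handle_endtag(self, tag):
        if tag in self._stack:
            while self._stack.pop() != tag:
                pass

    def handle_data(self, data):
        tag = self._stack[-1] if self._stack else ""
        if tag == "title":
            self.title += data
        elif tag == "h2":
            self.h2s.append(data.strip())
        elif tag not in ("script", "style", ""):
            self.body_text.append(data)

def keyword_report(html, keyword):
    """Returns which of the classic plugin checkboxes the page ticks."""
    parser = OnPageAudit()
    parser.feed(html)
    kw = keyword.lower()
    first_100 = " ".join(" ".join(parser.body_text).split()[:100]).lower()
    return {
        "in_title": kw in parser.title.lower(),
        "in_meta_description": kw in parser.meta_description.lower(),
        "in_first_100_words": kw in first_100,
        "in_any_h2": any(kw in h.lower() for h in parser.h2s),
    }

# Toy page for illustration
page = """<html><head><title>Blogging Tips for Beginners</title>
<meta name="description" content="Practical blogging advice."></head>
<body><h1>How to Start</h1><p>Here are some blogging tips to try.</p>
<h2>Pick a niche</h2><p>Choose a topic you know well.</p></body></html>"""
report = keyword_report(page, "blogging tips")
print(report)
```

Running `keyword_report` against the actual Top-10 pages for your keyword shows, just as the manual checks did, how inconsistently the top results tick these boxes.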
Keyword in image ALT-tag
Googlebot (like all other search engines’ crawlers) understands words, not images. That’s why descriptive image titles and ALT tags are helpful: they allow crawlers to “see” what a photo relates to and thus include it in search results.
ALT tags are also displayed when an image on your page fails to load, to give the user an idea of what should have been shown there. But there is no clear evidence that an ALT tag stuffed with keywords can help your page fly up Google’s rankings.
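So the practical goal is not keyword stuffing but making sure every image has a short, descriptive ALT text at all. Finding the ones that don’t takes only a few lines of stdlib Python; `AltTagChecker` is a made-up name for this sketch:

```python
from html.parser import HTMLParser

class AltTagChecker(HTMLParser):
    """Flags <img> tags whose alt attribute is missing or empty,
    so a human can add a short description of what the image shows."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            a = dict(attrs)
            if not a.get("alt", "").strip():
                self.missing_alt.append(a.get("src", "(no src)"))

checker = AltTagChecker()
checker.feed('<p><img src="cat.jpg" alt="White Maine Coon kitten">'
             '<img src="banner.png"></p>')
print(checker.missing_alt)  # → ['banner.png']
```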
A few words about domain weight
Traditionally, Domain Authority (Moz’s term) or Domain Rating (Ahrefs’ term) was considered a vital ranking factor. Modern research (and my own observations) shows that DR doesn’t impact a page’s ranking, at least not directly. Moreover, there is no evidence that domain age has much to do with high ranking positions.
Thus, if we look at the SERPs for the same keyword a week or two later, we may see different websites in the first positions. And often websites with younger domains but stronger backlink profiles jump into the Top 3. Just check out the screenshot below:
Despite having the highest DR among all the Top-10 websites, the Huffington Post sits down at No. 7.
Ahrefs has another metric for weighing a website’s authority: URL Rating. It measures the strength of a specific URL in terms of its backlink profile. And it seems that Google pays more attention not to the domain itself but to each page’s URL. Perhaps each URL transfers some of its weight to the entire domain. In any case, Domain Rating (and even more so domain age) is not what helps you rank higher.
There are more factors your SEO plugins usually require you to set up, e.g. the number of exact-match keywords used across your copy (that’s right, the plugins won’t detect closely related ones!). Will this number strongly affect your page’s ranking? Absolutely not. Among the Top-10 results for the “write a guest post” keyword, hardly 3 contained this keyword at all, while the others included it no more than once in the entire copy.
Google ranking factors that still count
Then what are the ranking factors that can get you into Google’s Top 10 or even Top 3? With the Hummingbird algorithm focusing on semantics rather than exact-match automation, all of Google’s updates are aimed at pleasing users, not SEO specialists. Looking back, we can single out these major Google algorithm updates:
- Google Panda (introduced in 2011) – lowers the ranking of pages with poor content;
- Google Page Layout (introduced in 2012) – affects pages with low-quality design, lots of pop-ups, dazzling banners and ads;
- Google Penguin (introduced in 2012) – penalizes websites that purchase links and have lots of spammy links;
- Google Pigeon (introduced in 2014) – focuses on ranking relevant pages for local search;
- Google Mobile Friendly (introduced in 2015) – aimed at giving a boost to mobile-oriented websites.
Another significant element with a huge impact on page ranking, as Google revealed in one of its Q&A sessions, is RankBrain. This is a whole new approach to ranking that uses machine learning to evaluate pages and provide the most relevant results for a user’s search.
Quickly looking through these algorithm updates, you can notice that all of them are aimed at providing the best experience to users in terms of:
- the relevance and quality of the content;
- the usability of the website (UX).
And these can be considered the strongest ranking factors you should optimize your content for.
1. Quality and relevance of the content
When we speak of the quality and relevance of content, we usually consider the benefit it provides to users, its visual appeal, and its length. For years, SEO plugins recommended writing pieces longer than 300 words. And some time ago everyone was obsessed with so-called “long-form content.” But will the length of your content help you rank higher?
Longer pieces are better suited for providing users with high-quality articles and thus can rank much better than short 200-word copies. A study by Neil Patel revealed that most No. 1 results in Google are over 2,000 words long. Users share and link to such pieces more often, which signals to Google that people like and trust long-form content more than shorter forms.
However, a good old Slate article proved some years ago that people on the Internet don’t read before tweeting, and most of them never scroll to the end of articles “below the fold.” Rand Fishkin of Moz added to this opinion, arguing that long content doesn’t equal good (or high-ranking) content. I guess it’s a matter of that content’s relevance to users.
Relevance means you’ve created a piece that gives users exactly the content they searched for. And here’s where keywords come in handy: long-tail keywords describe the content more precisely. And pages with more relevant content beat those with a higher DR or multiple referring domains. Just take a look at this:
Could you ever imagine that a page with only 3 referring domains could outrank a Wikipedia page with almost 1K referring domains?! The answer is pretty simple: the Wikipedia page covers the Maine Coon breed in general, while the page above it is specifically about white Maine Coons. Google sees it as a more relevant page and places it higher in the SERP.
2. Backlinks
Although content relevance sometimes beats backlinks, they remain one of the strongest ranking factors, proving their power time and again. As I said earlier, Google considers users’ trust one of the main ranking factors for websites and pages. To put it simply, it’s like an election where users cast votes for pages by linking to them. The more users link back to a resource, the higher Google puts it in the SERP.
And of course, we’re talking about high-quality backlinks from “good” websites. Buying links in bulk is a slippery slope into grey-hat SEO. Many people still do it, but it won’t let your website reach the top and stay there: Google is strongly against such practices, and it has its Penguin to watch over them.
3. Mobile-friendliness and page loading speed
These two ranking factors go hand in hand. Google pays close attention to websites geared toward mobile users: it announced that after January 10, 2017, pages that fail to present mobile-accessible content may see a significant drop in rankings. There have been no major shifts so far, but you can never be too careful when it comes to Google’s algorithms.
You should make the entire website fast-loading and responsive, not just particular pages. Still, it’s wise to check each new page in Google Webmaster Tools to see how it loads and looks on smartphone screens. Most pages can be optimized for fast loading with cleaner code, “lightweight” images, a responsive page structure, etc.
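Oversized images are the most common culprit behind slow pages. A rough way to find them is to scan your site’s static files for anything heavier than some threshold; this stdlib Python sketch is illustrative only, and the `heavy_assets` name and the 200 KB limit are arbitrary choices:

```python
import os

def heavy_assets(root, limit_kb=200):
    """Walk a site's static directory and report images larger than
    limit_kb: prime candidates for compression or resizing."""
    heavy = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if name.lower().endswith((".png", ".jpg", ".jpeg", ".gif")):
                path = os.path.join(dirpath, name)
                size_kb = os.path.getsize(path) / 1024
                if size_kb > limit_kb:
                    heavy.append((path, round(size_kb)))
    # Heaviest files first, so you know where to start optimizing
    return sorted(heavy, key=lambda item: item[1], reverse=True)

# Example usage (assumes such a directory exists on your server):
# for path, kb in heavy_assets("static/images"):
#     print(f"{path}: {kb} KB")
```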
Responsive mobile pages should also avoid pop-ups and standalone interstitials that force users to watch advertising or that occupy most of the “above-the-fold” space.
4. A few more technical SEO factors
Aside from the factors mentioned above, there are a few technical aspects that can affect page rankings and should therefore be considered.
- HTTP to HTTPS: Back in 2014, Google announced that it uses HTTPS as a ranking signal. Since you can still see many HTTP websites on the first pages of SERPs, there is only a small chance that an immediate switch to HTTPS will help your website rank higher. But it will increase users’ and Google’s trust in your website.
- URL structure: As the examples above show, an exact-match keyword in the URL has little to do with high rankings. But that doesn’t mean you should leave a generic “yourwebsite.com/blog/45658.html” URL. Google’s bots crawl all URLs equally, but the search giant recommends using a simple URL structure that is clear to users, in the name of better UX. So make your URLs as descriptive as possible to help users understand where they lead, and avoid deeply nested multi-folder URLs for the same UX reasons.
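In practice, a descriptive URL usually just means deriving the slug from the post title instead of a numeric ID. A minimal sketch of such a helper (the `slugify` name is hypothetical, not taken from any particular plugin):

```python
import re
import unicodedata

def slugify(title):
    """Turn a post title into a short, descriptive URL slug."""
    # Strip accents, lowercase, and collapse every run of
    # non-alphanumeric characters into a single hyphen.
    text = unicodedata.normalize("NFKD", title)
    text = text.encode("ascii", "ignore").decode("ascii").lower()
    text = re.sub(r"[^a-z0-9]+", "-", text).strip("-")
    return text

print(slugify("How to Write a Guest Post (2017 Guide)"))
# → how-to-write-a-guest-post-2017-guide
```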