The recent controversy over the update of Moz’s DA (Domain Authority) indicator gives me the opportunity to recall a somewhat painful truth: beware of what SEO tools tell you.
We will look at why certain indicators are biased, to the point of no longer reflecting reality, and how to make use of the information these tools provide despite everything, imperfect as they are.
MOZ CHANGES THE CALCULATION OF ITS DOMAIN AUTHORITY
Moz recently changed the way it calculates its Domain Authority score: many site owners saw their DA change drastically from one day to the next. The problem is that since Google stopped publishing the Toolbar PageRank, only third-party tools (like Moz, Majestic, or Ahrefs) provide “scores” supposed to reflect the importance of sites.
But these scores are to be taken with a large grain of salt: the data collected by these tools is not the same as the data collected by Google. All these tools crawl the web, but not the way Google does, in particular not as exhaustively, and above all they do not process the data like Google does. For example, because these tools cannot know which links actually pass PageRank, the substitute “PageRanks” such as Moz’s DA or Majestic’s Citation Flow are very different from Google’s internal rating.
In the case of DA, Moz also uses a secret recipe to produce a composite score from query-independent criteria, supposed to reflect a site’s ability to rank ahead of its competitors, all other things being equal.
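To see why a formula change can reshuffle such scores, here is a toy sketch. Moz’s actual formula is secret, so the metrics, weights, and both “versions” below are entirely made up; the point is only that the same underlying link data can produce very different scores, and even a different ranking of sites, once the arbitrary weights change.

```python
# Illustrative only: Moz's real DA formula is secret. These two made-up
# composite formulas score the exact same link data very differently.

# Hypothetical link metrics for three sites
sites = {
    "site-a.com": {"linking_domains": 1200, "total_links": 45000, "spam_ratio": 0.05},
    "site-b.com": {"linking_domains": 300,  "total_links": 90000, "spam_ratio": 0.30},
    "site-c.com": {"linking_domains": 800,  "total_links": 20000, "spam_ratio": 0.10},
}

def score_v1(m):
    # "Old" made-up formula: rewards raw link volume
    return 0.4 * m["linking_domains"] ** 0.5 + 0.6 * m["total_links"] ** 0.4

def score_v2(m):
    # "New" made-up formula: same inputs, but weights linking domains
    # more heavily and penalizes spammy link profiles
    return (0.8 * m["linking_domains"] ** 0.5
            + 0.2 * m["total_links"] ** 0.4) * (1 - m["spam_ratio"])

for site, metrics in sites.items():
    print(site, round(score_v1(metrics), 1), round(score_v2(metrics), 1))
```

With these numbers, site-b.com ranks first under the old formula and last under the new one, even though nothing changed on the site itself: only the recipe changed.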
But since Moz’s “formula” is somewhat arbitrary, a simple change to that formula (as in the recent update) produces very different scores. So should we trust these indicators?
Yes and no.
Yes, because they provide information that lets you compare websites, track whether you are progressing on these criteria, and so on.
But no, because this data is extraordinarily biased, does not reflect what is really happening at Google, and must be viewed with a certain distance…
ANOTHER EXAMPLE: VISIBILITY INDICES AND TRAFFIC ESTIMATES
Other SEO tools give you information on your visibility, or estimates of the SEO traffic generated by your competitors and yourself. The methodology behind these scores is well known: take a sample of queries, retrieve the results pages, then build scores that account for the presence of the sites’ URLs in the results, their position, and the search volume of each query. You can even go as far as deducing the traffic generated, by factoring in the expected CTR at each position.
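The methodology described above can be sketched in a few lines. The CTR curve, the query sample, and the volumes below are hypothetical; real tools use much larger samples and their own, undisclosed CTR models.

```python
# A minimal sketch of the visibility / traffic-estimate methodology.
# All numbers (CTR curve, queries, volumes, positions) are made up.

# Assumed organic CTR by position (position 1 ~ 30%, decaying quickly)
CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
                   6: 0.04, 7: 0.03, 8: 0.025, 9: 0.02, 10: 0.018}

# Sample of tracked queries: monthly search volume and the site's
# position in the results (None if absent from the top 10)
sample = [
    {"query": "blue widgets",      "volume": 12000, "position": 3},
    {"query": "buy blue widgets",  "volume": 2400,  "position": 1},
    {"query": "widget comparison", "volume": 8000,  "position": None},
]

def estimated_traffic(keywords):
    """Estimated monthly SEO traffic: volume x expected CTR at position."""
    total = 0.0
    for kw in keywords:
        pos = kw["position"]
        if pos is not None and pos in CTR_BY_POSITION:
            total += kw["volume"] * CTR_BY_POSITION[pos]
    return total

def visibility_index(keywords):
    """Share of the sample's maximum possible clicks the site captures."""
    potential = sum(kw["volume"] * CTR_BY_POSITION[1] for kw in keywords)
    return estimated_traffic(keywords) / potential

print(round(estimated_traffic(sample)))       # → 1920
print(round(visibility_index(sample), 2))     # → 0.29
```

Note that every bias discussed below is visible right here: swap the query sample or the assumed CTR curve and both numbers change, with no change whatsoever on the site itself.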
These tools, again, are not without bias.
For example, the results depend heavily on which keywords are in the sample: it is not uncommon to see a site’s visibility decrease according to one tool and increase according to another! What’s more, if your site’s traffic comes mainly from “long tail” queries, these tools become irrelevant, because the keywords they monitor are generally short- or middle-tail keywords.
Some of these tools also have the bad habit of changing their samples without much care: if your visibility seems to have increased since the beginning of the year, it may be because your SEO has improved, but it may also be because the sample of tracked keywords has changed!
As for the quality of the traffic predictions: just compare them with the numbers reported by your favorite web analytics tool to see that there is a problem.
SHOULD WE THROW SEO TOOLS IN THE TRASH?
The tools that produce their own SEO indicators are still worth using, because they help with diagnostics and with drafting prioritized action plans. They also have an undeniable educational value for people who are new to SEO.
But an SEO specialist must absolutely remain lucid about the quality of the information produced and its potential biases, and keep in mind that these scores do not come from Google: they are reconstructed with formulas that are sometimes arbitrary and, on top of that, kept secret.
We can still use these tools, but on the condition of cross-checking their results to detect problems with data quality, and of confronting their predictions with reality so as not to be misled. Taking a close look at your Search Console data can also prevent you from making the wrong diagnosis.
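Confronting predictions with reality can be as simple as the following sanity check, sketched here with hypothetical numbers: compare a tool’s monthly traffic estimate against what Search Console or your analytics tool actually reports, and flag large divergences.

```python
# Sanity check with made-up numbers: how far is the SEO tool's traffic
# estimate from the traffic you actually measure?

def divergence(tool_estimate, measured):
    """Relative gap between the tool's estimate and measured traffic."""
    return abs(tool_estimate - measured) / measured

tool_estimate = 5200  # monthly organic visits predicted by the SEO tool
measured = 14800      # organic visits reported by your analytics

gap = divergence(tool_estimate, measured)
if gap > 0.5:
    print(f"Estimate off by {gap:.0%}: likely long-tail traffic "
          f"the tool's keyword sample does not cover")
```

A large, persistent gap like this one does not mean the tool is useless; it means its keyword sample does not represent your traffic, and its trends should be read accordingly.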
In short, all these tools are useful on a daily basis, as long as you use their information with all the necessary perspective.