On the occasion of the SEO Campus on March 28, Vincent Courson took part in a Q&A session alongside Olivier Andrieu, author of the blog Abondance. The objective: to revisit the latest Google announcements and clarify certain points about the search engine's behavior. Four questions caught our attention: the announced obsolescence of the rel=next/prev tags, the imminent changes to Search Console, the role of UX criteria in search results, and the usefulness of link detox.
HAVE REL NEXT / PREV TAGS BECOME UNNECESSARY?
Vincent Courson offered a welcome clarification after the wave of discontent that followed the Google Webmasters announcement:
Spring cleaning! As we evaluated our indexing signals, we decided to retire rel=prev/next. Studies show that users love single-page content, aim for that when possible, but multi-part is also fine for Google Search. Know and do what's best for *your* users! - Google Webmasters, March 21, 2019
The SEO community did not necessarily welcome the news:
Yesterday, Google announced that it doesn't use the pagination markup rel=prev/next at all and hasn't for years. Curious, because they've been advocating it until recently. Here's what it means for working with paginated content, including our view on it. - Yoast, March 22, 2019
Vincent Courson therefore apologized for the poor communication and then confirmed the news: improvements to the crawl have made the rel=next/prev tags obsolete for Google, which is now able to identify paginated content on its own. So, should we get rid of them? For Vincent Courson, it would be a shame to stop using them. They still help Googlebot explore and crawl a site, and rel=next/prev tags remain useful for other search engines, including Bing, as well as for browsers and accessibility.
Answer: we keep them!
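As a reminder, the markup in question consists of a pair of link elements in the head of each paginated page. A minimal sketch for the middle page of a hypothetical three-page listing (the URLs are placeholders) might look like this:

```html
<head>
  <!-- Page 2 of a hypothetical three-page listing; example.com is a placeholder -->
  <link rel="prev" href="https://example.com/articles?page=1">
  <link rel="next" href="https://example.com/articles?page=3">
</head>
```

The first page carries only a rel="next" link and the last page only a rel="prev" link, so crawlers and assistive tools can walk the series in either direction.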
SEARCH CONSOLE EVOLUTIONS
Search Console is about to change. Vincent Courson gave us a quick overview of the developments to expect.
CONSOLIDATION OF DOMAINS
Announced on March 25 and effective on April 10, the consolidation of domains will bring together all the variants (http, https, etc.) of a domain within a single interface. The key is simplified data monitoring.
With Domain Properties in the #SearchConsole, data can now easily be aggregated for ALL URLs in a domain, regardless of the protocol or the presence or absence of www. - Vincent Courson, March 25, 2019
This consolidation will take place automatically on April 10 if you have already set up DNS validation.
CONSOLIDATION OF THE PERFORMANCE TOOL
This update, announced at the end of March, is more problematic: traffic data is now reported against canonical URLs, whereas it was previously attributed to the exact URL shown to the user.
The #SearchConsole URL inspection tool can now tell what the canonical page URL is for any site, not just yours. As a result, the little-used "info:" search operator is taking its bow. - Vincent Courson, March 26, 2019
This change affects the monitoring of mobile pages as well as AMP pages, both of which are canonicalized. Their own data will still be accessible, but you will have to use the search dimension filters.
IS THE DISAVOWAL OF TOXIC LINKS AN OBSOLETE PRACTICE?
Vincent Courson states that disavowing toxic links no longer serves any purpose: Google's algorithm is now able to identify and discount them when calculating PageRank. So if a dubious site links to yours, that link is simply ignored and has no impact on the authority of the targeted site.
Shocked? Disappointed? This #penguin update is not necessarily bad news: in particular, it neutralizes #blackhat #negativeseo tactics. - Search Foresight, March 28, 2019
Interviewed by JDN, he specified that the only use he sees for a disavow file is in the context of a manual penalty:
"[In this specific case, the disavow file makes it possible] to give a webmaster who is trying to clean up their backlink profile a way to tell Google that certain links cannot be removed."
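For reference, the file uploaded to Google's Disavow Links tool is a plain-text file with one entry per line; lines starting with # are comments, and a domain: prefix disavows every link from that domain. A minimal sketch (the domains below are placeholders):

```
# A specific link we could not get removed
http://spam.example.com/directory/links.html
# Disavow all links from an entire domain
domain:shadyseo.example
```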
DOES UX IMPACT SEARCH RESULTS?
While SEO experts constantly wonder whether the ranking algorithm uses behavioral criteria such as the click-through rate in results pages, dwell time, or pogo-sticking, Vincent Courson reaffirmed that Google does not take this UX data into account at the level of an individual page. On the other hand, the algorithm does make use of behavioral data at the query level. As for how this data is used, Vincent Courson unfortunately remained vague… For more details, we invite you to read Philippe Yonnet's article on the use of CTR in Google's algorithm.
While this Q&A did not yield any earth-shattering revelations, the exercise illustrates Google's move toward greater transparency and better engagement with the community of SEO experts. This appearance at the SEO Campus helped clarify several points related to current events in the sector.