Case Study: How the Cookie Monster Ate 22% of Our Visibility


The author's views are entirely his or her own (excluding the unlikely event of hypnosis) and may not always reflect the views of Moz.

Last year, the team at Homeday, one of the leading prop tech companies in Germany, made the decision to migrate to a new content management system (CMS). The goals of the migration were, among other things, increased page speed and a state-of-the-art, future-proof website with all the necessary features. One of the main motivators for the migration was to enable content editors to work more freely in creating pages without the help of developers.

After evaluating several CMS options, we decided on Contentful for its modern technology stack, which offers a good experience for both editors and developers. From a technical perspective, Contentful, as a headless CMS, allows us to choose which rendering strategy we want to use.

We're now carrying out the migration in several phases, or waves, to reduce the risk of issues that have a large-scale negative impact. During the first wave, we encountered an issue with our cookie consent, which led to a visibility loss of almost 22% within five days. In this article, I'll describe the problems we faced during this first migration wave and how we resolved them.

Setting up the first test wave

For the first test wave we selected 10 SEO pages with high traffic but low conversion rates. We established an infrastructure for reporting and monitoring those 10 pages:

  • Rank tracking for the most relevant keywords

  • SEO dashboard (Data Studio, Moz Pro, Semrush, Search Console, Google Analytics)

  • Regular crawls (a rough sketch of such a check follows this list)
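
The regular crawls were run with a standard crawling tool, but the same kind of spot check can also be scripted. Below is a hedged, minimal sketch in TypeScript (assuming Node 18+ for the built-in fetch); the URLs and the checked elements are illustrative placeholders, not our actual configuration.

```ts
// Hedged sketch of a recurring spot check for the migrated URLs.
// Assumes Node 18+ (built-in fetch); URLs are placeholders.
const trackedUrls: string[] = [
  'https://www.example.com/seo-page-1',
  'https://www.example.com/seo-page-2',
];

async function checkPage(url: string): Promise<void> {
  const res = await fetch(url, { redirect: 'follow' });
  const html = await res.text();

  // Very rough HTML checks: title, meta description, noindex.
  const title = html.match(/<title[^>]*>(.*?)<\/title>/is)?.[1]?.trim() ?? '(missing)';
  const hasMetaDescription = /<meta[^>]+name=["']description["']/i.test(html);
  const hasNoindex = /<meta[^>]+content=["'][^"']*noindex/i.test(html);

  console.log(
    `${res.status} ${url} | title: "${title}" | meta description: ${hasMetaDescription} | noindex: ${hasNoindex}`
  );
}

async function main(): Promise<void> {
  for (const url of trackedUrls) {
    await checkPage(url);
  }
}

main().catch(console.error);
```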

After a thorough preparation and testing phase, we migrated the first 10 SEO pages to the new CMS in December 2021. Although several issues came up during the testing phase (longer loading times, a larger HTML Document Object Model, etc.), we decided to go live because we didn't see any major blockers and we wanted to migrate the first test wave before Christmas.

First performance review

Very excited about having completed the first step of the migration, we took a look at the performance of the migrated pages the following day.

What we saw next definitely didn't please us.

Overnight, the visibility of tracked keywords for the migrated pages decreased from 62.35% to 53.59%: we lost 8.76% of visibility in a single day.

As a result of this steep drop in rankings, we carried out another extensive round of testing. Among other things, we tested for coverage/indexing issues, whether all meta tags were included, structured data, internal links, page speed, and mobile friendliness.

Second performance review

All the articles had a cache date after the migration, and the content was fully indexed and being read by Google. Furthermore, we could exclude several migration risk factors (changes to URLs, content, meta tags, layout, etc.) as sources of error, as there hadn't been any changes.

Visibility of our tracked keywords suffered a further drop to 40.60% over the following days, making for a total drop of almost 22% within five days. This was also clearly visible in comparison to the competitors for the tracked keywords (shown here as "estimated traffic"), where the estimated traffic behaved analogously to the visibility.

Data from Semrush, for the defined keyword set of tracked keywords on the migrated pages

As the other migration risk factors as well as Google updates had been excluded as sources of error, it had to be a technical issue. Too much JavaScript, low Core Web Vitals scores, or a larger, more complex Document Object Model (DOM) could all be potential causes. The DOM represents a page as objects and nodes, so that programming languages like JavaScript can interact with the page and change, for example, its style, structure, and content.
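
As a tiny illustration of that interaction (browser console code, unrelated to our code base), JavaScript can look up a node and change its content, style, or structure:

```ts
// Minimal DOM illustration (run in a browser console), not project code.
const headline = document.querySelector('h1'); // look up a node
if (headline instanceof HTMLElement) {
  headline.textContent = 'A new headline';     // change content
  headline.style.color = 'rebeccapurple';      // change style
}
const note = document.createElement('p');      // change structure
note.textContent = 'Added via the DOM API';
document.body.appendChild(note);
```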

Following the cookie crumbs

We had to identify the issues as quickly as possible, fix bugs fast, and prevent further negative effects and traffic drops. We finally got the first real hint of which technical reason could be the cause when one of our tools showed us that the number of pages with high external linking, as well as the number of pages exceeding the maximum content size, went up. It is important that pages don't exceed the maximum content size, as pages with a very large amount of body content may not be fully indexed. Regarding the high external linking, it is important that all external links are trustworthy and relevant for users. It was suspicious that the number of external links went up just like that.

Increase of URLs with high external linking (more than 10)
Increase of URLs that exceed the specified maximum content size (51,200 bytes)
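
For later waves, both of these thresholds can also be flagged with a small script between full crawls. A hedged sketch follows; our own domain, the URL, and the exact limits are stand-ins, with 51,200 bytes mirroring the threshold reported by our crawler.

```ts
// Hedged sketch: flag pages whose HTML exceeds a size budget or that carry an
// unusually high number of external links. Domain and thresholds are illustrative.
const OWN_DOMAIN = 'example.com';
const MAX_CONTENT_BYTES = 51_200;
const MAX_EXTERNAL_LINKS = 10;

async function auditPage(url: string): Promise<void> {
  const html = await (await fetch(url)).text();
  const bytes = new TextEncoder().encode(html).length;

  // Rough heuristic: count absolute links that do not point to our own domain.
  const hrefs = [...html.matchAll(/href=["'](https?:\/\/[^"']+)["']/gi)].map((m) => m[1]);
  const externalLinks = hrefs.filter((href) => !href.includes(OWN_DOMAIN)).length;

  if (bytes > MAX_CONTENT_BYTES || externalLinks > MAX_EXTERNAL_LINKS) {
    console.warn(`${url}: ${bytes} bytes, ${externalLinks} external links`);
  }
}
```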

The two metrics were disproportionately high compared to the number of pages we migrated. But why?

After checking which external links had been added to the migrated pages, we saw that Google was reading and indexing the cookie consent form on all migrated pages. We performed a site search, checking for the content of the cookie consent, and found our theory confirmed:

A site search confirmed that the cookie consent was indexed by Google

This led to several problems:

  1. A lot of duplicated content was created for each page due to indexing of the cookie consent form.

  2. The content size of the migrated pages increased significantly. This is a problem because pages with a very large amount of body content may not be fully indexed.

  3. The number of external outgoing links increased significantly.

  4. Our snippets suddenly showed a date on the SERPs. This would suggest a blog or news article, although most pages on Homeday are evergreen content. In addition, because the date appeared, the meta description was cut off.

But why was this happening? According to our service provider, Cookiebot, search engine crawlers access websites simulating full consent. Therefore, they gain access to all content, and copy from the cookie consent banner is not indexed by the crawler.

So why wasn't this the case for the migrated pages? We crawled and rendered the pages with different user agents, but still couldn't find a trace of the Cookiebot in the source code.
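
The kind of check we ran can be sketched roughly like this: fetch the raw HTML with a Googlebot user agent and search it for a phrase from the banner. The marker string below is a placeholder; in our case, checks like this kept coming back empty, which is exactly what made the issue so confusing.

```ts
// Hedged sketch: look for cookie banner copy in the raw HTML as different
// user agents would receive it. The marker phrase is a placeholder.
const GOOGLEBOT_UA =
  'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)';

async function bannerInRawSource(url: string, marker = 'we use cookies'): Promise<boolean> {
  const res = await fetch(url, { headers: { 'User-Agent': GOOGLEBOT_UA } });
  const html = await res.text();
  return html.toLowerCase().includes(marker.toLowerCase());
}
```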

Investigating Google DOMs and looking for a solution

The migrated pages are rendered with dynamic data that comes from Contentful and plugins. The plugins contain just JavaScript code, and in some cases they come from a partner. One of these plugins was the cookie manager partner, which fetches the cookie consent HTML from outside our code base. That is why we didn't find a trace of the cookie consent HTML in the source files in the first place. We did see a larger DOM, but traced that back to Nuxt's default, more complex, larger DOM. Nuxt is the JavaScript framework we work with.

To validate that Google was reading the copy from the cookie consent banner, we used the URL Inspection tool in Google Search Console. We compared the DOM of a migrated page with the DOM of a non-migrated page. Within the DOM of a migrated page, we finally found the cookie consent content:

Within the DOM of a migrated page we found the cookie consent content
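
The same comparison can be approximated outside of Search Console by rendering both pages headlessly and searching the rendered DOM for the banner copy. Here is a hedged sketch using Puppeteer; the URLs and the marker phrase are placeholders, not our actual setup.

```ts
// Hedged sketch: render a page headlessly and check whether cookie banner copy
// ends up in the rendered DOM. URLs and the marker phrase are placeholders.
import puppeteer from 'puppeteer';

async function bannerInRenderedDom(url: string, marker = 'we use cookies'): Promise<boolean> {
  const browser = await puppeteer.launch();
  try {
    const page = await browser.newPage();
    await page.goto(url, { waitUntil: 'networkidle0' });
    const bodyText = await page.evaluate(() => document.body.innerText);
    return bodyText.toLowerCase().includes(marker.toLowerCase());
  } finally {
    await browser.close();
  }
}

// Compare a migrated page with a non-migrated one, e.g.:
// await bannerInRenderedDom('https://www.example.com/migrated-page');      // true in our scenario
// await bannerInRenderedDom('https://www.example.com/non-migrated-page');  // false
```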

Something else that caught our attention was the JavaScript files loaded on our old pages versus the files loaded on our migrated pages. Our website has two scripts for the cookie consent banner, provided by a third party: one to display the banner and capture the consent (uc), and one that imports the banner content (cd).

  • The only script loaded on our old pages was uc.js, which is responsible for the cookie consent banner. It is the one script we need on every page to manage user consent. It displays the cookie consent banner without indexing its content and saves the user's decision (whether they agree or disagree to the usage of cookies).

  • For the migrated pages, apart from uc.js, there was also a cd.js file loading. If we have a page where we want to show more information about our cookies to the user and have the cookie information indexed, then we have to use cd.js. We thought that the two files depended on each other, which is not correct. The uc.js can run alone. The cd.js file was the reason the content of the cookie banner got rendered and indexed.

It took a while to find because we thought the second file was just a prerequisite for the first one. We determined that simply removing the loaded cd.js file would be the solution.
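
For illustration, here is a minimal sketch of what the corrected setup could look like in a Nuxt 2 head configuration. The Cookiebot ID, attributes, and script URLs follow Cookiebot's publicly documented pattern but are placeholders, not Homeday's actual configuration.

```ts
// Hedged sketch of a Nuxt 2 head configuration after the fix (nuxt.config.ts).
// IDs and attributes are placeholders, not our production setup.
export default {
  head: {
    script: [
      {
        // uc.js renders the banner and records the user's consent choice.
        src: 'https://consent.cookiebot.com/uc.js',
        'data-cbid': '00000000-0000-0000-0000-000000000000', // placeholder consent ID
        async: true,
      },
      // Deliberately no cd.js entry: that script injects the full cookie
      // declaration into the DOM and belongs only on a dedicated cookie page
      // whose copy is actually meant to be indexed.
    ],
  },
};
```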

Performance review after implementing the solution

The day we deleted the file, our keyword visibility was at 41.70%, which was still 21% lower than pre-migration.

However, the day after deleting the file, our visibility increased to 50.77%, and the next day it was almost back to normal at 60.11%. The estimated traffic behaved similarly. What a relief!

Shortly after implementing the solution, organic traffic went back to pre-migration levels

Conclusion

I can imagine that many SEOs have dealt with tiny issues like this one. It seems trivial, but it led to a significant drop in visibility and traffic during the migration. This is why I recommend migrating in waves and blocking enough time to investigate technical errors before and after the migration. Furthermore, keeping a close eye on the site's performance in the months after the migration is crucial. These are definitely my key takeaways from this migration wave. We just completed the second migration wave at the beginning of May 2022, and I can say that so far no major bugs have appeared. We'll have two more waves and hope to complete the migration successfully by the end of June 2022.

The performance of the migrated pages is almost back to normal now, and we will continue with the next wave.