Process for Recovering Rankings After Google Algorithm Updates
Google algorithm updates are notorious for tanking website rankings. When these incoming bombs (the algorithm updates) dropped on our website, my legs used to tremble.
Back then, my paranoia ran so high that I checked my website's rankings daily to find out whether the update had hit us. Almost inevitably, I saw those infamous red arrows pointing down next to my rankings; not a pleasant experience, to say the least.
I started re-evaluating my website. Does it still have room for improvement? Is my content not expert enough? Did the content I published last week, by coinciding with the update, cause the drop? These were some of the countless questions swirling around in my head, day and night.
And that was just the first day after the update was announced. Imagine how quickly the chaos spread through the office.
It was a mental and professional nightmare.
However, after putting in hundreds of hours of research, it's safe to say I have found a cure for this ailment, credited to accumulated experience and a thoroughly organized assessment-and-recovery process.
It has worked wonders for me, so I consider it my duty to share it with you as well.
This post walks you through my systematic process for ranking recovery so you can deal with the fallout of an algorithm update methodically and restore your website's performance afterwards.
Here we go!
1. Keep Calm
First and foremost, never panic when Google rolls out a new update. Panicking clouds your judgment and leads to a bad start in dealing with the problem.
It helps to understand what these updates are for and what they are designed to do. Updates will always keep coming, so there is no point in loathing them. Instead, try to understand what Google is aiming to achieve with each one.
It is also necessary to realize that Google isn't your sworn enemy, dropping these updates just to rattle your website. In fact, Google commonly follows a major update with a series of smaller ones that smooth out its most radical changes.
So, it is crucial to wait out the storm. Most people make intensely reactive decisions, such as revamping the whole website after seeing rankings worsen. This is not recommended: to learn the effects of an update, and what caused them, you need to wait.
I would suggest letting the kettle finish boiling before taking it off the stove. Learning is a continuous process, and monitoring should be a big part of it. Examine the situation first, and only then make decisions.
2. Analyze Specific Keyword/Page Drops
Previously, we observed and analyzed websites against a wide range of sitewide metrics to gauge how badly an algorithm update had affected them. This meant running exhaustive sitewide audits to assess every factor touching a website.
We later learned, the hard way, that this is a flawed way to react. You do not burn the whole haystack to find the needle; you look for it carefully and thoroughly.
We now make a concentrated effort to identify the specific high-ranking keywords that dropped and start our analysis there. More often than not, a couple of pages account for the majority of the issue.
These problematic pages might contain information that is no longer relevant, i.e., outdated content. Or the new update might have flagged overlapping or duplicated content.
We believe Google is increasingly evaluating websites on a page-by-page basis when calculating rankings. So our first plan of action should be the page-level changes that may be required, rather than any sitewide action.
Sitewide changes may actually worsen the situation. Instead, we should fix the website's key components, its important pages and big keywords, based on a fresh analysis of the current situation.
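To make this page-level triage concrete, here is a minimal sketch of how dropped rankings can be rolled up by page rather than sitewide. The data format is hypothetical; any rank-tracker export with page, keyword, and position columns can be mapped onto it.

```python
# Page-level triage: given per-keyword rankings before and after an
# update, find the pages that account for most of the drop.
from collections import defaultdict

def pages_by_rank_loss(before, after):
    """before/after: {(page, keyword): rank}. Lower rank = better.
    Returns pages sorted by total positions lost (positive = dropped)."""
    loss = defaultdict(int)
    for (page, kw), old_rank in before.items():
        # Treat a keyword that vanished entirely as rank 101 (off page 10).
        new_rank = after.get((page, kw), 101)
        loss[page] += new_rank - old_rank
    return sorted(loss.items(), key=lambda item: -item[1])

# Illustrative data, not real export values.
before = {("/services", "criminal lawyer"): 4,
          ("/services", "defence lawyer"): 7,
          ("/blog/guide", "bail process"): 3}
after = {("/services", "criminal lawyer"): 18,
         ("/services", "defence lawyer"): 25,
         ("/blog/guide", "bail process"): 4}

print(pages_by_rank_loss(before, after))
# → [('/services', 32), ('/blog/guide', 1)]
```

Here a single page, `/services`, accounts for nearly all of the lost positions, which is exactly the pattern that justifies page-level fixes over a sitewide overhaul.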
3. Determine the Theme Among Drops
As we all know, not all keywords are equal; they do not carry uniform value. Relying on a generic, one-size-fits-all keyword metric is wasted effort. The value of each keyword, and its contribution to rankings, should be assessed individually.
As a marketing agency for law firms, we often see a page lose hundreds of keywords from the first page. Naturally, this alarms us. Our first step is to flag the affected pages and set up alerts until we can dig into the issue further.
At times, it's a relief to find that some of our pages are merely shedding lower-value, redundant, broad-based keywords. This is a blessing because it almost always improves the standing of our prized keywords. Typically, we see a drop in country-wide keywords such as 'criminal lawyer' accompanied by gains for a local version like 'criminal solicitor London.'
Although we don’t prefer losing such keywords, even though they are broader, it’s a far better scenario than losing our real targets.
Once we have accounted for the shifting value and relative importance of the keywords, we group them to formulate 'theme' theories.
These are the questions such a theory should answer:
- What do these keywords have in common?
- What are the similarities and differences between them?
- What do the affected pages share?
- Are there any other internal changes we've made that can be linked to everything affected?
- Do these keywords conflict with the update's new guidelines?
- Are they redundant after the update?
- Were the affected pages updated recently?
Once we have a better grip on the situation, having understood the importance of each keyword individually, we tweak the site based on these observations to adapt to the search algorithm.
Mind you, the theory won't necessarily reveal exactly what changed in the algorithm update. And correlation is not causation: there's a possibility that the changes in our rankings have nothing to do with Google's algorithm at all.
Changing trends, dynamic keywords, newer competition, et cetera, should all be accounted for.
However, the insights found here through individual keyword analysis can help in making informed decisions, and increase the rankings and organic traffic of websites.
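A crude but useful starting point for spotting a theme is simply counting which terms the dropped keywords share. The keyword list below is hypothetical; in practice you would feed in the actual dropped keywords from your rank tracker.

```python
# Sketch: surface a possible "theme" among dropped keywords by
# counting the terms they have in common.
from collections import Counter

def common_terms(dropped_keywords, top=3):
    """Count how many dropped keywords contain each term,
    ignoring repeats within a single keyword."""
    counts = Counter(term
                     for kw in dropped_keywords
                     for term in set(kw.lower().split()))
    return counts.most_common(top)

# Hypothetical dropped keywords for a law-firm site.
dropped = ["criminal lawyer", "criminal lawyer fees",
           "best criminal lawyer", "divorce lawyer cost"]

print(common_terms(dropped, top=2))
# → [('lawyer', 4), ('criminal', 3)]
```

A clustering like this only suggests where to look, broad 'lawyer' terms in this case; the human judgment about why that theme was hit still has to come from reviewing the pages themselves.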
4. Run Tests
At this point, we have a better understanding of the actual problem, having analyzed the exact pages and keywords the algorithm update rewarded or punished.
With those pages and keywords understood individually, we can move on to an in-depth analysis of the important website components.
Our process involves the following:
1. Monitor your competitors – We check competitors' keywords and pages to identify whether they benefited or lost ground after the algorithm update.
This comparison tells us exactly where we currently stand. If our competitors are better off, it is time to rethink our keywords and target the ones that are working.
2. Improve website keywords with Surfer's Content Audit feature – This is an extremely helpful, well-organized tool. Designed for SEO content analytics, it breaks a page into its constituent parts, making it easier to restructure the page and optimize for particular target keywords.
3. Run the page through GTmetrix and Pingdom tests – Since Core Web Vitals are now a ranking factor, it is imperative to take advantage of these tools. Use them to identify the metrics producing worryingly low results.
These tools also pinpoint the files and assets that slow down your pages; a crucial feature when working to recover traffic.
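When triaging those results, it helps to map each reading onto Google's published "good" / "needs improvement" / "poor" bands. The sketch below hard-codes the LCP, FID, and CLS thresholds Google documented for Core Web Vitals; the `report` values are made up for illustration.

```python
# Classify Core Web Vitals readings against Google's published
# thresholds, so low scores from GTmetrix/Pingdom-style reports
# can be triaged at a glance.
THRESHOLDS = {            # metric: (good_max, poor_min)
    "lcp_s": (2.5, 4.0),  # Largest Contentful Paint, seconds
    "fid_ms": (100, 300), # First Input Delay, milliseconds
    "cls": (0.1, 0.25),   # Cumulative Layout Shift, unitless
}

def classify(metric, value):
    good_max, poor_min = THRESHOLDS[metric]
    if value <= good_max:
        return "good"
    return "poor" if value > poor_min else "needs improvement"

# Hypothetical readings for one page.
report = {"lcp_s": 3.1, "fid_ms": 80, "cls": 0.31}
print({metric: classify(metric, value) for metric, value in report.items()})
# → {'lcp_s': 'needs improvement', 'fid_ms': 'good', 'cls': 'poor'}
```

Anything that lands in "poor" is a candidate for the file-and-asset cleanup these tools recommend; "needs improvement" metrics are worth tracking across subsequent crawls.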
4. Compare backlink changes – This is crucial for improving your website after an update. Sometimes pages pick up bad backlinks after an update, or they simply accumulate over time. Similarly, a page can just as easily lose good, authoritative do-follow links.
If your traffic has declined after the update, it is highly recommended that you check all of your backlinks.
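The comparison itself is a simple set difference between two backlink exports, one pulled before the update and one after. The URLs below are placeholders; in practice each list would come from a backlink tool's CSV export.

```python
# Sketch: diff two backlink exports to list links lost and gained
# across an algorithm update.
def backlink_diff(old_links, new_links):
    """Return referring URLs that disappeared and ones that appeared."""
    old, new = set(old_links), set(new_links)
    return {"lost": sorted(old - new), "gained": sorted(new - old)}

# Placeholder exports, e.g. from before/after CSV snapshots.
before_update = ["https://a.example/post", "https://b.example/guide",
                 "https://c.example/list"]
after_update = ["https://b.example/guide", "https://d.example/roundup"]

print(backlink_diff(before_update, after_update))
# → {'lost': ['https://a.example/post', 'https://c.example/list'],
#    'gained': ['https://d.example/roundup']}
```

The "lost" column tells you which authoritative links may need to be won back; the "gained" column is where to look for low-quality links worth reviewing or disavowing.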
5. If no major causes surface, we do an in-depth, detailed manual review – No matter how good these tools sound, they are not magic. They have limitations, and we do not treat them as the final word in our analysis.
If they pick up few issues or none at all, we jump in and do a thorough manual review of the site, combing through pages and keywords for anything the tools missed.
This is the basic outline of how we conduct our tests. Every situation is unique and may require other measures. We also provide advice and are completely transparent in all our work.
5. Design and Enact Changes
At this point, we have a shortlist of improvements to make to our clients' pages. Mind you, these ideas and their implications are tried and tested.
Equally important is documenting each change, which adds to the transparency mentioned earlier. We do this with the annotations feature in AccuRanker, which gives us a concrete way to see how Google receives each of our theories.
This process is all about baby steps; patience is the name of the game. If we have a handful of ideas expected to yield positive results, we always start with just one. This gives us a clean signal: either the change validates our theory, or we move on to the next measure without any confusion.
In Conclusion, Evolve with the Updates
As we've reached new heights as a company, we have stayed true to our roots. Having failed on many occasions, we have learned that taking serious measures without concrete evidence is asking for trouble.
Moreover, we have learned that patience is key. If you rush for results, making drastic decisions along the way, your website will end up worse off than its already update-stricken state.
Google algorithm updates are dynamic, so we need a dynamic set of solutions for their consequences. Relying on a rigid, definitive system is a mistake we, and many others, have made in the past.
Looking for solutions through simpler, more flexible, repeatable systems is the way to go.
This flexibility has allowed us to strengthen our ideas and our ways of countering an update's detrimental effects on website visibility.
Our idea and process might not be perfect or guaranteed to work, but it is sensible and sound. We hope this proves helpful for those who feel burdened when a new update is released.