After a long month of waiting since my last post, I am back at work with a clear mind. Since then, lots of updates have been rolling out from Google – from indexing issues down to duplicate content. In this roundup, we will cover some SEO gurus who have cracked these core updates from Google.
Glen Gabe of gsqi.com laid out some areas we can check in relation to the December 2020 algorithm update. If you are still wondering how to deal with it, you might want to check these areas.
Glen pointed out a website with strong E-A-T factors that was still hit hard in 2020. He recommended tackling problems common to news websites, such as:
– Indexing issues on pages with thin content
– UX issues that need prioritizing
– Unavailable videos and broken pages
– Sponsored content on the site
After the December 2020 update, the site's traffic cost grew by 140%.
Glen diagnosed another case: an affiliate website that lost more than half of its traffic. He suspects that low-quality content has been the main issue on the website, although he has not reached a firm conclusion yet. Affiliate sites were hit hard by this December update, and the May update had already destroyed more than 40% of this site's traffic. Glen noticed an increasing pile of thin content on the website, and flagged intrusive ads and mobile issues as well.
After the May update rolled out, nothing changed for the site until the December update. Several case studies were carried out and a lot of ideas have been gathered for sites that were hit in 2020. The next item in this roundup also covers the December update – the request indexing issue.
Nick Leroy provided us with a quick look at how many indexing requests Google Search Console (GSC) will accept at a time before you reach your daily quota.
This was also our issue back in October 2020, when the "request indexing" feature went completely missing for several months. Many SEOs were surely excited to see it come back, but most of them might not have noticed a slight change. Before it was taken away, the limit was understood to be about 50 URLs/day. In his case study, Nick found that:
– GSC will not process more than 12 submissions in a 24-hour period.
– After receiving a CAPTCHA, his request limit was 11 instead of 12.
– None of the submissions were indexed easily.
These limits may be concerning, especially when we are relying on fast indexing or a new site is about to launch. It is something for us to watch closely and to factor into the guides we follow right now.
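If you are batching manual submissions, a small client-side counter can keep you under whatever cap you believe applies. This is a hypothetical sketch, assuming Nick's observed limit of 12 submissions per day – which is his observation, not an official, documented quota:

```python
from datetime import date

class IndexingQuota:
    """Track manual 'Request Indexing' submissions against an assumed daily cap."""

    def __init__(self, daily_limit=12, today=None):
        # 12/day reflects Nick's observed limit; adjust if Google changes behavior.
        self.daily_limit = daily_limit
        self._day = today or date.today()
        self._used = 0

    def try_submit(self, url, today=None):
        """Return True if this URL still fits today's budget, else False."""
        today = today or date.today()
        if today != self._day:        # new day: reset the counter
            self._day, self._used = today, 0
        if self._used >= self.daily_limit:
            return False              # over budget: queue the URL for tomorrow
        self._used += 1
        return True
```

The `today` parameter just makes the reset logic testable; in real use you would call `try_submit(url)` and save any rejected URLs for the next day.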
Do you remember when Rank Club released their PBN best-practice guide? Now they bring us a data-backed look at how to use PBNs in 2021.
The guide places PBN links as Tier 1 and Tier 2. Tier 1 links are built directly to your website, while Tier 2 links support your incoming links and increase their authority.
You might want to visit rankclub.io for more FAQs on the PBN setup guide. I myself learned something new from these links. I sometimes use Moz's advice on finding keywords when I can't rely on any historical data, which leads us to the next item in this roundup.
Imogen Davies brings us an in-depth look at doing keyword research when historical data is missing. Google has confirmed that 15% of daily searches have never been searched before.
A lot of opportunities in those searches are being taken for granted. Ranking for them is also difficult when there is no reference point at all. Standard keyword research tools aren't going to help, since they are built around historical data analysis.
There are three alternative strategies that Imogen recommends:
– Mining People Also Ask boxes
– Scraping autosuggest
– Exploring related keyword themes
Imogen recommends following up on any of these strategies by grouping everything into topics and themes. This will help you plan your pillar content and get some engagement.
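Of the three strategies, scraping autosuggest is the easiest to automate. Here is a minimal Python sketch against the unofficial `suggestqueries.google.com` endpoint that browsers use for autocomplete; it is undocumented and may change or rate-limit at any time, so treat it as illustrative only:

```python
import json
import urllib.parse
import urllib.request

SUGGEST_URL = "https://suggestqueries.google.com/complete/search"

def build_suggest_url(seed, client="firefox"):
    # The "firefox" client returns plain JSON instead of a JSONP wrapper.
    params = urllib.parse.urlencode({"client": client, "q": seed})
    return f"{SUGGEST_URL}?{params}"

def parse_suggestions(raw):
    # Response shape: ["seed term", ["suggestion 1", "suggestion 2", ...]]
    payload = json.loads(raw)
    return payload[1]

def fetch_suggestions(seed):
    """Fetch live autosuggest completions for a seed keyword (requires network)."""
    with urllib.request.urlopen(build_suggest_url(seed), timeout=10) as resp:
        return parse_suggestions(resp.read().decode("utf-8"))

# Example (requires network):
# print(fetch_suggestions("keyword research"))
```

You can then group the returned phrases by shared modifier words to surface the topics and themes Imogen suggests building pillar content around.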
We know that the last core update has been completed. However, there are rumors of a slight update happening in January. Barry Schwartz tracked down SEOs experiencing strange fluctuations, along with reports from online forums.
RankRanger, SERPMetrics, SEMrush – these tools mostly identified spikes that occurred last January 26 and 27.
It is really hard to say what "duplicate content" actually means. John Mueller helped clarify the term in a presentation about content in different formats. He said that when content is published in different formats, Google does not consider it duplicate content.
For example, if you repurpose a video as an article or vice versa, Google does not count it as a duplicate. The same goes for two word-for-word identical pieces in different formats. It sounds like good news for SEOs who are trying to develop their content from videos into a new format.
As a final note, SEO is still evolving – from content down to voice search – so innovate! If you're looking to boost your business through SEO, maybe we can work together. Send me a message today!