Deindexed by Google
Being deindexed by Google is bad news, but it is not the end of the world. If you take action quickly, your site can bounce back.
Why Did Google Deindex Your Site?
A website is typically deindexed for one of two reasons. Either:
- Google has taken a manual action against your site, or
- someone accidentally made a mistake in the website’s code that caused the deindexing.
If a manual action has been applied, you will have received a notification in Search Console detailing the violation. The most common reason for such an action is that your site has done something that breaks Google’s webmaster quality guidelines.
1. Unnatural links to/from your website
This could mean several things:
- Low-quality guest posts.
- Too many inbound links acquired in a short period, which can look to Google like buying links.
- Spammy blog comments.
- Participation in link exchanges or link farms.
Solution:
- If you received a manual action for unnatural links, the first thing to do is audit your backlink profile.
- During this process, identify which links are natural and which are unnatural or spammy.
- By the end of the analysis, you should have a list of links that need to be disavowed (see the example file below this list).
- After submitting your disavow file, submit a reconsideration request that explains what happened and the steps you have taken to rectify the violation.
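For reference, Google’s disavow file is a plain text (.txt) file with one domain or URL per line; lines beginning with # are treated as comments. The domains and URLs below are placeholders for illustration, not entries from any real audit:

```text
# Placeholder entries for illustration only
# Disavow every link from an entire referring domain
domain:spammy-link-farm.example

# Disavow individual URLs
https://blog.example.org/low-quality-guest-post/
https://forum.example.net/thread-123#comment-456
```

Once the file is ready, it is uploaded through Search Console’s disavow links tool before you file the reconsideration request.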
2. Cloaking
If search engines and users are shown two different sets of content or URLs, this is considered cloaking and can result in a manual action against a website.
Solution:
- Sometimes this can occur through no fault of your own.
- For example, if your website has content behind a paywall for subscribers, it might look like cloaking to Google.
- In this situation, you need to mark up your website with JSON-LD to identify the paywalled content; Google has provided detailed instructions for this type of website (a rough sketch of the markup follows this list).
- Another cause that is not your fault is that your website has been hacked. Hackers often use cloaking to direct users to spammy websites.
- This can usually be fixed quickly by running a scan on your website to identify and repair the compromised pages. Another option is to use a service like Sucuri and let them clean up the malware on your behalf.
- After the clean-up, you must file a reconsideration request notifying Google of the steps you took to resolve the issues.
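Google’s paywalled-content documentation spells out the exact properties to use; very roughly, the markup looks something like the sketch below, where the article type, headline, and the `.paywall` CSS class are placeholder assumptions for illustration:

```html
<!-- Illustrative sketch only: the headline and .paywall class are placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "headline": "Example subscriber-only article",
  "isAccessibleForFree": false,
  "hasPart": {
    "@type": "WebPageElement",
    "isAccessibleForFree": false,
    "cssSelector": ".paywall"
  }
}
</script>

<!-- The paywalled section of the page carries the matching class -->
<div class="paywall">
  Subscriber-only content goes here.
</div>
```

The key signals are `isAccessibleForFree: false` and the `cssSelector` pointing at the paywalled section, which tell Google the hidden content is a paywall rather than cloaking.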
3. Spammy Structured Markup
As with the other issues mentioned, general guidelines apply to structured data.
Failure to follow those guidelines can result in a manual action and possibly the deindexing of a website.
Solution:
- If you received a manual action, your first step should be to look at the common causes of structured data manual actions.
- Then, based on the message provided in Search Console, work out where the issues lie.
- Another way of identifying potential issues is to use the Structured Data Testing Tool to see what errors show up (a hedged markup example follows this list).
- After fixing the items in question, a reconsideration request must be filed, as with any manual action.
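The underlying rule is that structured data must describe content users can actually see on the page. As a hypothetical illustration, product markup like the snippet below would only be acceptable if the product name, rating, and review count mirror what is visibly displayed; the values here are placeholders:

```html
<!-- Hypothetical example: values must match the visible page content -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Acme Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.4",
    "reviewCount": "89"
  }
}
</script>
```

Marking up invisible content, or attaching ratings and reviews to pages that do not actually show them, is exactly the kind of pattern that triggers a spammy structured markup action.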