If you encounter the "Excluded by 'noindex' tag" status in Google Search Console, it means that Google is not indexing certain pages on your website because they carry a "noindex" directive, either in their HTML code or in an HTTP response header. This directive explicitly tells search engines not to include the page in search results. Note that this status is only a problem if you actually want the page indexed; for pages you deliberately exclude, it is expected behavior. To fix unwanted exclusions, you need to either remove the "noindex" directive or change it to allow indexing. Here's how to do it:
Identify the Pages with the Error: In Google Search Console, open the indexing report (under "Indexing" > "Pages" in the current interface; older versions called this "Index" > "Coverage") and look for the pages listed under "Excluded by 'noindex' tag." This shows you which pages are affected.
Inspect the HTML Code and HTTP Headers: For each of the pages listed, you'll need to find where the "noindex" directive is applied. There are two places to check:
- In the HTML <head> section, look for a <meta name="robots" content="noindex"> tag.
- In the HTTP response headers, look for an X-Robots-Tag: noindex header.
Modify or Remove the 'noindex' Tag: Depending on where you find the "noindex" directive, you can take the following actions:
- In the HTML <head> section (meta tag): Remove the <meta name="robots" content="noindex"> tag, or change it to <meta name="robots" content="index, follow"> to allow indexing and following.
- In the HTTP header (X-Robots-Tag): Remove the X-Robots-Tag: noindex header or change its value to index. This header is typically set in your server configuration or by your CMS or an SEO plugin.
- In the robots.txt file: robots.txt cannot set a noindex directive (Google stopped supporting noindex rules in robots.txt in 2019), but check that the affected pages are not blocked by a Disallow rule. If Googlebot cannot crawl a page, it will never see your updated tags.
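The two checks above can be automated. Here is a minimal sketch in Python using only the standard library; the helper name has_noindex and the sample inputs are illustrative, not part of any Search Console API:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content values of <meta name="robots"> tags."""
    def __init__(self):
        super().__init__()
        self.robots_values = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            if a.get("name", "").lower() == "robots":
                self.robots_values.append((a.get("content") or "").lower())

def has_noindex(html: str, headers: dict) -> bool:
    """Return True if a page is blocked from indexing by either a
    robots meta tag or an X-Robots-Tag response header."""
    # Check the X-Robots-Tag HTTP header (header names are case-insensitive).
    for name, value in headers.items():
        if name.lower() == "x-robots-tag" and "noindex" in value.lower():
            return True
    # Check <meta name="robots"> tags in the HTML.
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in v for v in parser.robots_values)

# Example usage with illustrative inputs:
blocked = has_noindex('<head><meta name="robots" content="noindex"></head>', {})
allowed = has_noindex('<head><meta name="robots" content="index, follow"></head>',
                      {"Content-Type": "text/html"})
print(blocked, allowed)  # True False
```

Checking the header as well as the markup matters because a page with a clean <head> can still be excluded by a server-level X-Robots-Tag.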
Crawl and Index the Updated Pages: Once you have made the necessary changes to the "noindex" directives or robots.txt file, you can ask Google to re-crawl and re-index the updated pages. To do this, go back to Google Search Console, paste the page's URL into the URL Inspection tool, and click "Request Indexing."
Monitor Google Search Console: Keep an eye on Google Search Console to check if the "Excluded by 'noindex' tag" error is resolved for the affected pages. It may take some time for Google to re-crawl and re-index the pages.
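While waiting for Google, you can re-check the affected pages yourself to confirm your fix is live. A hedged sketch, again standard-library Python; the URL list is hypothetical, and the regex check is deliberately crude (it assumes the name attribute appears before content):

```python
import re
import urllib.request

def body_has_noindex(body: str) -> bool:
    """Crude check for a robots meta tag containing 'noindex'."""
    pattern = r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex'
    return re.search(pattern, body, re.IGNORECASE) is not None

def still_noindex(url: str) -> bool:
    """Fetch a URL and report whether it still serves a noindex directive
    in either the X-Robots-Tag header or a robots meta tag."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        header = resp.headers.get("X-Robots-Tag", "")
        body = resp.read().decode("utf-8", errors="replace")
    return "noindex" in header.lower() or body_has_noindex(body)

# Hypothetical list of pages flagged in the indexing report:
affected = ["https://example.com/page-1", "https://example.com/page-2"]
for url in affected:
    try:
        print(url, "still noindexed" if still_noindex(url) else "clear")
    except OSError as exc:
        print(url, "fetch failed:", exc)
```

Running a check like this before and after deploying your fix catches cases where a CMS or caching layer is still serving the old markup.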
Check for Other SEO Best Practices: While you're working on this issue, make sure your website follows other SEO best practices, such as having unique and valuable content, optimizing your meta tags, and ensuring a good user experience.