8 Reasons Your Site Might Not Get Indexed on Google
Indexation is the cornerstone of good search engine optimization. If your website, or specific pages of it, are not being indexed, you need to find out why.
Here are eight reasons your site might not get indexed on Google:
- Sitemap Issues
All websites need a sitemap.xml file: a list of the URLs on your site that you want Google to crawl and index. If your sitemap is broken or malformed, it's likely that Google won't be able to index your site properly. If you are having indexation problems on any part of your website, revise and resubmit your sitemap.
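For reference, a minimal valid sitemap.xml looks like this (the domain and URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled and indexed -->
  <url>
    <loc>https://www.example.com/</loc>
  </url>
  <url>
    <loc>https://www.example.com/about</loc>
  </url>
</urlset>
```

If the file doesn't parse as valid XML, or lists URLs that redirect or return errors, Google may ignore it.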
- Crawl Errors
Sometimes, Google won’t index some pages because it cannot crawl them – even though it can still see them. To identify these, go to Google Webmaster Tools and click on “Crawl Errors,” where you will see the pages Google tried and failed to crawl.
- Meta Tags
Your indexing issue could be the result of a noindex directive in your meta tags. This directive tells search engine robots not to index the page. Check to confirm that the page does not have a meta tag like this: `<meta name="robots" content="noindex, nofollow">`.
- Robots.txt
It is also possible for your editor or developer to block the site using robots.txt. Unfortunately, this will result in your page or site not being indexed. Simply remove the offending entry from robots.txt to fix the issue.
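To illustrate, a robots.txt entry like the following (the path is a placeholder) blocks crawlers from a directory, and a broader rule can block the entire site:

```text
# Blocks all crawlers from the /private/ directory
User-agent: *
Disallow: /private/

# A sitewide block would look like this instead -- remove it to allow indexing:
# User-agent: *
# Disallow: /
```

The file lives at the root of your domain (e.g. example.com/robots.txt), so it is easy to check directly in a browser.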
- Blocked .htaccess
The .htaccess file is a configuration file that controls how your site is served by the web server (Apache, in most cases). And while it plays a crucial role, it can also be used to prevent Google from crawling your site and indexing it altogether.
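As an illustration, a rule like this in .htaccess (Apache 2.4 syntax) would block every visitor, Googlebot included; if you find such a rule, remove or loosen it:

```apache
# Denies all requests to the site -- search engine crawlers included
Require all denied
```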
- URL Parameters
Webmaster Tools allows you to set URL parameters to tell Google which links you don’t want indexed. But if you misconfigure these parameters, pages from your website can be dropped from Google’s index.
- Hosting Down Times
If Google’s spiders can’t reach your server when they try to crawl it, they won’t index it, for obvious reasons. Check your connectivity: it could be that your hosting company has an outage, or that you’ve recently switched hosts.
- Duplicate Content
Duplicate content is a big problem with Google today: when Google sees similar content on different URLs, it can’t tell which one to index, and as a result it may not index any of them. You can fix this by using selective robots.txt rules or 301 redirects to make sure Google only crawls one instance of the page.
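A 301 redirect for a duplicate URL can be set up in .htaccess with a single line (the paths here are placeholders):

```apache
# Permanently redirect the duplicate URL to the one canonical version
Redirect 301 /old-duplicate-page https://www.example.com/canonical-page
```

After the redirect is in place, Google will consolidate the two URLs and index only the destination.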