What could possibly be preventing your site from displaying on search engines? You may wonder about this. Even when you copy an excerpt from your web page and paste it directly into a search engine, your page is nowhere in the results.
Before going further, do not mistake ‘blocked or not indexed’ for ‘low ranking result.’ Some pages are indexed but do not appear among the best-ranked pages; that does not mean your site is blocked or not indexed on search engines.
To confirm that your site is truly blocked or not indexed, type site:mydomain.com into the search bar. If your site is indexed, the search engine will list its pages. If not, the query will return no results.
If ‘no results’ is returned, it is probably due to one or more of the 6 reasons below. Read them carefully and apply the solution(s) where necessary.
1. Your site is new
A newly launched site takes some time to start appearing on search engines. To get your site listed more quickly, submit a sitemap to your preferred search engines for indexing.
For a site with few pages, submitting a sitemap is a way of informing Google, Bing / Yahoo, Ask, Yandex and others that you are present on the web. This makes it quicker for the engines to crawl and index your site.
To understand the process of submitting a sitemap, check out this page.
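For reference, a minimal sitemap is a plain XML file, usually saved as sitemap.xml in the root directory of your site, that lists the URLs you want indexed. The URLs and date below are placeholders, not real addresses:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want the search engine to index -->
  <url>
    <loc>https://www.mydomain.com/</loc>
    <lastmod>2021-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.mydomain.com/about</loc>
  </url>
</urlset>
```

Once the file is in place, you submit its URL to each search engine (for example through Google Search Console).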
Solution: exercise some patience
Yes, you read that correctly. You have to exercise some patience if your site is new. Wait a few weeks after the sitemap has been submitted to allow the search engines to index your web pages.
Without submitting a sitemap, it may take months before search engine bots find your site. This depends on the popularity of your site: the more popular it is, the faster the bots will notice it.
While submitting a sitemap is recommended, it does not mean your site will be indexed right away. The process takes time, as many other sites are being submitted, so you have to wait your turn.
2. Robots.txt is blocking the search engine bots
Search engine bots (also called crawlers or spiders) access the root directory of your site scouting for content. When the bots find your site, they index its pages, and whenever a query matches content on your web pages, the search engine presents them in its results.
The robots.txt file is used to block bots from accessing directories on your site. If the directives in the file have been wrongly specified, they have to be corrected.
Solution: modify robots.txt file
Go through the robots.txt file in the root directory of your site; its directives may be blocking crawlers from accessing your site, and if so, modify them.
Read this well-explained article to learn more about robots.txt and how it works. It provides clear guidance on using the file to give instructions to web crawlers for your site.
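As an illustration (the directives below are examples, not your actual file), a robots.txt that blocks every bot from the whole site looks like this, and removing or narrowing the Disallow rule lifts the block:

```text
# This blocks ALL bots from the ENTIRE site -- a common cause of non-indexing:
User-agent: *
Disallow: /

# A corrected version that allows crawling while still hiding one directory:
User-agent: *
Disallow: /private/
```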
3. You have ‘noindex’ meta tag specified
‘noindex’ is an alternative to the robots.txt file. It is used to prevent search engines from indexing particular pages on a site.
The difference between ‘noindex’ and robots.txt is that the ‘noindex’ instruction is specified in the head section of the HTML markup, while robots.txt is created as a file and saved in the root directory of the site.
Solution: check for ‘noindex’ tag
Check the head section of your HTML markup to ensure that no such meta tag is specified in it:
<title>Site Not Displaying on Search Engines</title>
<meta name="robots" content="noindex">
<meta name="googlebot" content="noindex">
‘robots’ applies to all search engine bots, while ‘googlebot’ applies to Google’s bot only. Deleting these tags gives search engine bots permission to crawl and index your site.
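Note that a ‘noindex’ instruction can also be sent as an X-Robots-Tag HTTP header rather than a meta tag, so it is worth checking your server configuration too. A minimal sketch for an Apache .htaccess file, assuming Apache with mod_headers enabled:

```text
# .htaccess snippet (Apache, mod_headers) -- if present, this sends a
# noindex header with EVERY response, hiding the whole site from indexing:
Header set X-Robots-Tag "noindex"
```

If you find such a line and want your pages indexed, remove it.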
4. No external links pointing to your site
You have no links from other sites pointing to your web pages. When your site is not displaying on search engines, it may be due to its low popularity. Backlinks create credibility for your web pages and help them get noticed by bots.
Solution: create external links for your site
Start applying SEO link-building techniques to your site. This will boost its popularity and, in turn, get it noticed by search engine bots more quickly.
Build links by: creating social media account(s); engaging in off-site activities; engaging in offline activities.
5. Your site has different protocols
If your site is served over more than one protocol (e.g. http:// and https://), it may be indexed under a different URL from the one you are searching for. For instance, you may be looking up your site as http://www.mydomain.com on a search engine, whereas the search engine has indexed your web pages under https://www.mydomain.com.
Solution: add all site protocols to search engine
Submit each of your site’s protocols separately to Google Search Console if you have more than one. You will then be able to monitor them separately. Do the same on other search engine platforms if needed.
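If you prefer the search engine to index only one protocol, you can also declare a canonical URL in the head section of each page. The domain below is a placeholder:

```html
<!-- Tells search engines that the https version is the preferred URL -->
<link rel="canonical" href="https://www.mydomain.com/">
```

This way, even if both versions are crawled, the https URL is the one that appears in search results.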
6. Your site has been blocked
Engaging in activities that violate a search engine’s policy is a punishable offence. That means your site may be penalised by being temporarily or permanently removed from the search engine’s results and index.
Your site will also not display if the domain name was previously used by someone else who was penalised.
Solution: remove site from block list
Contact the search engine platform where your site is blocked and explain your situation. If the reason for the block is not of your doing, your site should be unblocked within a reasonable period of time.
Additionally, if you are not implementing SEO strategies yourself, avoid the so-called ‘experts’ who use cloaking as part of their strategies.
The 6 points above are the most common reasons why some sites do not display on search engine result pages. Address whichever of these problems applies, and your site will perform better.
And as stated earlier in this article, do not mistake ‘site not displaying on search engines’ for ‘poor / low ranking result.’ These are two different issues that should be resolved with different techniques.
Meanwhile, read the tip below to help maintain your site and keep it free from unexpected errors or issues.
Tip: maintaining your site
Once all errors have been corrected on your site, it is highly recommended to keep maintaining it to avoid future errors. Doing this keeps your site competitive and up to date with the latest SEO techniques.
To be thorough about site performance, run your site through an SEO checklist, analyse it for errors and analyse its traffic. Do this at least twice a month. It will provide insight into any problems your site may have.
Use SEO checklist
This is one of the first things to do when implementing SEO strategies. Use this SEO Checklist to see which SEO strategies are still to be implemented on your site. Also, read these interesting articles from Search Engine Journal and HubSpot on how to stay up to date with SEO techniques.
Analyse for error
To resolve any issues with your site, test your pages for errors using at least two online tools. These tools give you detailed information on how to improve your site when errors are found. I personally recommend Google PageSpeed Insights and Varvy.
Analyse for traffic
Collect the reports provided by Google Analytics, Google Search Console and Bing Webmaster Tools about your site traffic. Use the reports to find out which channels attract the most visitors to your site, then pay more attention to those areas.
If you are interested in my SEO service, check out this page for more.