Many people want to stop search engines from crawling their WordPress websites. It may sound odd, but some site owners believe search engines should not crawl their sites until the sites are fully prepared. If you are an experienced website designer or web developer, stopping search engines from crawling your site is easy, and if your site runs on WordPress, keeping Google away from your web pages takes only a few changes. Some people prefer to work on live sites instead of setting up local environments, and others host their own projects in order to build a portfolio that attracts more clients. In either case they have no need to have the site indexed, because the work in progress or the portfolio matters more to them than search visibility.
Max Bell, the Customer Success Manager of Semalt Digital Services, offers here a practical guide to stopping search engines from crawling your website and explains the reasons for doing so.
Using the WordPress admin panel
The first and most important step is to use the WordPress admin panel to discourage search engines from indexing and crawling your web pages. If you are familiar with WordPress, you may know of plugins that can handle this as well, but no plugin is required. Go to Settings → Reading and scroll down to the Search Engine Visibility section, where you will find a checkbox labeled "Discourage search engines from indexing this site". Tick it to ask search engines to stay away from your WordPress site, and don't forget to click Save Changes before closing the window. Keep in mind that this setting is only a request: well-behaved crawlers will honor it, but it is up to each search engine to comply.
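Under the hood, ticking that checkbox sets the blog_public option to 0, and WordPress then prints a robots meta tag in the head of every page, roughly like this (the exact markup varies between WordPress versions):

```html
<meta name='robots' content='noindex, nofollow' />
```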
Add the meta tags to the header section manually
It is also possible to add a robots meta tag to the head section of your WordPress site to keep search engines out. There are two options: you can either use a WordPress plugin to get this done or open your theme files and insert the meta tag manually. The tag tells search engines not to crawl or index the pages it appears on. Many people get confused about how to edit theme files, but it is an easy and straightforward step.
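A minimal example of the tag, assuming you are editing the head section of your theme's header.php file (back up the file, or use a child theme, before editing):

```html
<!-- Ask crawlers not to index this page or follow its links -->
<meta name="robots" content="noindex, nofollow">
```

Any page served with this tag will be dropped from the indexes of compliant search engines.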
Edit the robots.txt file to block web crawlers
It is possible to edit the robots.txt file to prevent search engines from crawling your site. This file, placed in the root directory of your site, tells web crawlers which pages they may visit. You can specify which pages should be crawled and which should be blocked, and it is also possible to disallow all bots at once with a single rule. Edit the file (or create it if it does not yet exist), add the appropriate directives, and save it before closing the window.
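For example, a robots.txt file in the root of your site that asks every crawler to stay away from all pages looks like this:

```text
User-agent: *
Disallow: /
```

Keep in mind that robots.txt is advisory: compliant crawlers obey it, but it is not an access control mechanism.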
Protecting with passwords
You can stop search engines from indexing your site by using password-protected directories. To do this, go to your site's cPanel and open the Directory Privacy tool (called Password Protect Directories in older cPanel versions). This opens a page listing your directories; locate the public_html directory, enable protection on it, and set a username and password. Without those credentials, neither visitors nor search engine crawlers can access your site, so nothing can be crawled or indexed.
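If you prefer to set this up by hand rather than through cPanel, the equivalent Apache configuration is an .htaccess file in the directory you want to protect. The file paths below are assumptions; adjust them for your own server:

```apacheconf
# .htaccess in the protected directory (e.g. public_html)
AuthType Basic
AuthName "Restricted area"
# Password file created beforehand with: htpasswd -c /home/youruser/.htpasswds/passwd yourname
AuthUserFile /home/youruser/.htpasswds/passwd
Require valid-user
```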
With the above methods in mind, almost any webmaster or blogger can stop search engines from indexing their website.