Maintenance Plugin – serves a 503 (Service Unavailable) status for your site so search engines know it is temporarily down and do not index it.
Add a few lines of code in robots.txt
Some WordPress files and folders do not need to be indexed. Using robots.txt, you can tell search engines not to access these folder paths or files.
Below are the recommended folder paths and files to block from search engines.
Code in robots.txt
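A typical set of rules looks like the sketch below. The exact paths are assumptions based on a standard WordPress install, not the author's exact list – adjust them to your own setup (the `Allow` line for admin-ajax.php keeps AJAX features working for crawlers that need it):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-content/plugins/
Disallow: /readme.html
Disallow: /xmlrpc.php
```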
Create XML sitemaps
XML sitemaps help search engines understand how your site is structured so they can crawl and index it faster. Use a plugin like Google XML Sitemaps (despite the name, the sitemaps it generates work for other search engines too, not just Google).
Add sitemaps location in robots.txt
Add the following line to robots.txt to tell search engines where your sitemaps are located. You can list other kinds of sitemaps (images and videos) as well.
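For example, assuming your sitemap lives at the site root (replace the URL with your own domain and sitemap path):

```
Sitemap: https://www.example.com/sitemap.xml
```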
This helps every search engine around the world (Yandex, Baidu, Naver, etc.), as well as Google and Bing, find your sitemaps.
Submit your sitemaps in both Google and Bing Webmaster Tools
You can tell Google and Bing where your sitemaps are located through their Webmaster Tools, and also monitor how many of your pages have been indexed.
Disable XML-RPC using a plugin
XML-RPC is used when you want to publish WordPress posts with a program like Windows Live Writer.
XML-RPC used to be disabled by default for security reasons, and up until WordPress 3.5 there was an option to enable it.
Now it is enabled by default and there is no built-in option to turn it off. Not to worry – you can still turn it off using this plugin.
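If you would rather not install a plugin, the same result can be achieved with a one-line filter in your theme's functions.php. This snippet is an alternative to the plugin mentioned above, not part of the original post; `xmlrpc_enabled` is a standard WordPress filter:

```php
// Disable XML-RPC requests by forcing the xmlrpc_enabled filter to false.
add_filter( 'xmlrpc_enabled', '__return_false' );
```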
I am an SEO consultant who happens to be interested in affiliate marketing.
Unlike general SEO, websites built for affiliate marketing require a completely different mindset and strategy.
Join my journey to generate passive income using SEO.