Sunday, February 26, 2017

Google can't crawl my site because it is unable to access the site's robots.txt

I am having this issue: “Google couldn’t crawl your site because we were unable to access your site’s robots.txt file.” My site is www.pufmag.com.

I have already tried all the suggestions provided in related threads, but nothing seems to work. I am using a plugin that puts my site in maintenance mode. Could that be the issue?

I have created a robots.txt file and uploaded it to the root directory, but the problem remains. Any help is highly appreciated. Thank you!
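For reference, here is a minimal sketch (assuming Python 3 and that the file is meant to be served at the site root, as in the URL below) that checks whether the robots.txt is actually reachable over HTTP, which is roughly what the crawler needs:

```python
# Minimal sketch: check whether the site's robots.txt responds over plain HTTP.
# The URL is taken from the post; expecting a 200 status is an assumption about
# what "accessible" means here (a maintenance-mode plugin often returns 503).
import urllib.request

url = "http://www.pufmag.com/robots.txt"
try:
    with urllib.request.urlopen(url, timeout=10) as resp:
        print("Status:", resp.status)
        print(resp.read().decode("utf-8", errors="replace"))
except Exception as exc:
    print(f"robots.txt not reachable: {exc}")
```

If this prints anything other than a 200 status (or fails entirely), the file is not being served to crawlers even though it exists on disk.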
