Hello,
I’m writing because I noticed that a very high number of pages on my website myproroofing.com are currently flagged in Google Search Console as not indexed because robots.txt is blocking them. An example is https://www.myproroofing.com/roof-replacement/, for which I ran a live test directly in GSC as well as in other third-party validator tools and got a message that the URL is not available to Google.
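For context, this is the kind of robots.txt rule that would cause that symptom. I’m not sure this is what my actual file contains; it’s just an illustration of a blanket rule that would block every page from Googlebot:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content -- NOT necessarily the real file
# served by myproroofing.com. "Disallow: /" blocks all paths for
# every user agent, which matches the behavior GSC is reporting.
robots_txt = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# With the rule above, Googlebot is denied the example URL from this ticket.
url = "https://www.myproroofing.com/roof-replacement/"
print(parser.can_fetch("Googlebot", url))  # -> False
```

If something like this is being injected into my robots.txt by the platform, that would explain why the pages are not available to Google.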
I tried to follow some of the guides in your knowledge base to troubleshoot the issue myself, but that does not seem to have solved it. I don’t know what else to do.