-
I have integrated my website with Ezoic. After approval, the sitemap of my website started returning a “Couldn’t fetch” error. I contacted the Ezoic support team and they told me the problem may be coming from the Rank Math settings. Please, I need your help.
-
Hello,
Thank you for reaching out and using Rank Math.
I took a look at your sitemap and it looks fine – but there is an issue with your site’s URL versions, which could be causing the problem.
Your site has four different URL versions: http, https, www, and non-www, but only the https/non-www version is returning your sitemap. Please confirm that this is the exact URL submitted to Google, i.e., the exact URL you added in the sensitive data section.
Alternatively, you can solve this by redirecting your other website versions to the correct one. You can follow this guide to learn how to do this: https://rankmath.com/kb/url-incorrect-version-redirection/
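On an Apache server, the redirect described in that guide is typically done in the .htaccess file. As a rough sketch (with example.com standing in for your actual non-www domain), it looks like this:

```apache
# Force every request onto the canonical https/non-www version.
# Replace example.com with your actual domain.
RewriteEngine On

# Match requests that are either not https or that use the www host...
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} ^www\. [NC]

# ...and permanently (301) redirect them to the https non-www URL.
RewriteRule ^(.*)$ https://example.com/$1 [L,R=301]
```

With a rule like this in place, all four URL versions of the sitemap should resolve to a single 200 response at the https/non-www address, with the other three returning 301.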
I hope this helps to clarify the issue. If you have any questions, please let me know.
Regards,
Hello, this is what the Ezoic support team said about the error occurring with the sitemap: “That’s precisely what we’re saying – you have a rule that only allows the Googlebot IP to crawl the site.
The Googlebot isn’t going to be accessing your site from that IP once integrated with Ezoic; it would be using an Ezoic IP address. It’s something you have set up in your robots.txt OR within the plugin you’re using to generate the sitemap. It’s not something Ezoic can fix for you; it’s related to your own crawler access rules that you’d need to change at the origin.” Please help.
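For reference, robots.txt on its own cannot restrict crawlers by IP address – IP-based rules come from the server, a firewall, or a security plugin such as Wordfence. A robots.txt that places no restrictions on any crawler looks roughly like this (the sitemap URL is an example; Rank Math’s default sitemap index is usually sitemap_index.xml on your own domain):

```
User-agent: *
Disallow:

Sitemap: https://example.com/sitemap_index.xml
```

If your robots.txt matches this pattern, the IP restriction Ezoic mentions is being applied elsewhere at the origin.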
Hello Rank Math, I did what you said, but it did not work. I need more help, please.
Hello Rank Math, I’m still having the same problem and need your help.
This is what the Ezoic support team said after they looked at the plugins on my website: “There are several ‘indexing’ related plugins, as well as Rank Math SEO and Wordfence security, that could all be restricting crawler access. We’d suggest testing by looking into the settings of each of those.
In particular, making sure Rank Math doesn’t have any robots.txt or crawler access restrictions set up.” I kindly need your help.
Hello,
We’ve checked your robots.txt file and there is no issue with your robots.txt. However, we can see that all the variations of your sitemap URLs are still returning 200. Please check the screenshot in the sensitive data section for your reference.
Please note that only one of the 4 variations should return a 200 status code, and the 3 remaining ones should return a 301 redirect to it.
Once done, please clear your website cache and submit the sitemap again to your Google Search Console.
Let us know how it goes. Looking forward to helping you.
Thank you.
What should I do to make the remaining 3 show 301?
I have added the code to the .htaccess file, but it did not solve the problem.
Hello,
We’ve checked your sitemap again with the https://httpstatus.io/ tool, and we can see that all the variations of your sitemap URLs are still returning 200. Please refer to the screenshot in the sensitive data section.
It looks like your website uses the non-www version. In this case, you can follow this guide to redirect www URLs to non-www: https://rankmath.com/kb/how-to-redirect-www-urls-to-non-www/
Meanwhile, you can remove the previously submitted sitemap and try to submit the following sitemap URL and see if that works for you:
https://yourdomain.com/?sitemap=1
Please replace yourdomain.com with the actual domain of your website.
Let us know how it goes. Looking forward to helping you.
Thank you.
Thanks for your help guys.
Hello,
Glad that helped.
Please feel free to reach out to us again in case you need any other assistance.
We are here to help.
Thank you.
The ticket ‘sitemap couldn’t fetch error’ is closed to new replies.