Hi all,
So glad I found this forum.
I am having trouble getting Google to access my Google News sitemap (a news sitemap, not a regular sitemap).
The error I get in Google Webmaster Tools is:
-----------------------
We encountered an error while trying to access your Sitemap. Please ensure your Sitemap follows our guidelines and can be accessed at the location you provided and then resubmit.
-----------------------
I am using the WordPress Google News Sitemap generator plugin to create my google-new-sitemap.xml file.
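Before blaming Google, it may be worth checking that the generated file is actually well-formed. Here is a minimal, hedged sketch (not part of the plugin; the checks and function name are my own) that parses a news sitemap and flags obvious structural problems:

```python
# Quick local sanity check for a Google News sitemap, run before
# resubmitting in Webmaster Tools. This is a generic sketch using the
# standard library, not anything shipped with the WordPress plugin.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
NEWS_NS = "http://www.google.com/schemas/sitemap-news/0.9"

def check_news_sitemap(xml_text):
    """Return a list of problems found; an empty list means it looks OK."""
    problems = []
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError as e:
        return ["XML parse error: %s" % e]
    if root.tag != "{%s}urlset" % SITEMAP_NS:
        problems.append("root element is not a sitemap <urlset>")
    urls = root.findall("{%s}url" % SITEMAP_NS)
    if not urls:
        problems.append("no <url> entries found")
    for url in urls:
        if url.find("{%s}news" % NEWS_NS) is None:
            problems.append("a <url> entry is missing its <news:news> block")
    return problems
```

For example, `check_news_sitemap(open("google-new-sitemap.xml").read())` should return an empty list if the file parses and every entry carries the news extension.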
My robots.txt file is this:
User-agent: *
Disallow:
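That robots.txt allows everything, which matches the "Allowed" test result. One optional addition (a suggestion on my part, not something Google requires for submitted sitemaps): the sitemaps.org protocol lets you advertise the sitemap's location directly in robots.txt with a Sitemap directive. The domain below is a placeholder; the filename is the one the plugin generates:

```
User-agent: *
Disallow:

Sitemap: http://www.example.com/google-new-sitemap.xml
```

The Sitemap line must use the full absolute URL, and it can sit anywhere in the file since it is not tied to a User-agent group.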
When I run the Crawler Access test in Google Webmaster Tools, I get this result for both Googlebot and Googlebot-Mobile:
Allowed by line 2: Disallow:
Detected as a directory; specific files may have different restrictions
This is an older domain that I bought from GoDaddy; it had not been used for a few months and has a clean history. The last time the robots.txt file was downloaded was on August 11, 2010, and the status was 400 (Bad Request).
Shouldn't that have been updated by now, or does it take a while?
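That stale 400 is suspicious, so one thing worth doing is fetching robots.txt yourself with a crawler-like User-Agent and checking the status code returned today. A small sketch (the helper name is mine, and the real site's domain would go in place of the placeholder URL):

```python
# Hedged sketch: fetch a robots.txt URL the way a crawler might and
# report the HTTP status code, even when the server returns an error.
import urllib.request
import urllib.error

def robots_status(url, user_agent="Googlebot"):
    """Return the HTTP status code for url (e.g. 200, 400, 404)."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        # 4xx/5xx responses arrive as HTTPError; the code is still useful.
        return e.code
```

If `robots_status("http://www.example.com/robots.txt")` still comes back 400 today, the problem is on the server (hosting or rewrite rules), not in Webmaster Tools' cache.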
Please help me debug this. What do you suggest I do first, second, third...?
Thank you in advance.