
I have been getting a DNS error for the last five days in Google Webmaster Tools. That's why I would like to confirm that my robots.txt file is set up correctly.

Is the robots.txt file below correct?

User-agent: *
Allow: /

Sitemap: http://example.com/sitemap.xml

Please help me resolve this.

ashutosh

2 Answers


To allow all:

User-agent: *
Disallow: 

To disallow all:

User-agent: *
Disallow: /

I know it's a bit confusing and unintuitive. As a simple way of generating robots.txt files, I tend to use a generator tool.
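For reference, a complete allow-all file that keeps the sitemap reference from your question (assuming http://example.com/sitemap.xml really is the location of your sitemap) would look like this:

User-agent: *
Disallow:

Sitemap: http://example.com/sitemap.xml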

sam

If you're getting a DNS error, it's not related to your robots.txt; it's related to your DNS settings. You can check for DNS errors with online tools such as DNS Health.

If you need additional help with that, I'd suggest taking a screenshot of your DNS settings from your DNS service provider and asking another question with that screenshot included (i.e., example).

By the way, the accepted answer regarding your other robots.txt question is correct. I would stick with that for what you were asking about previously - robots.txt directives are not the cause of any DNS errors.
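If you prefer to check from your own machine rather than an online tool, a quick resolution test will tell you whether the domain resolves at all. Here is a minimal sketch in Python; "example.com" is a placeholder for your own domain:

import socket

# Try to resolve a hostname. A failure here points to a DNS problem,
# not to anything in robots.txt.
def resolves(hostname):
    try:
        socket.getaddrinfo(hostname, None)
        return True
    except socket.gaierror as err:
        print("DNS lookup failed for %s: %s" % (hostname, err))
        return False

print(resolves("example.com"))

If this prints False while your registrar says the domain is configured, the problem is on the DNS side and that is where to keep digging.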

dan