What is the Robots.txt Generator?
The Robots.txt Generator creates a robots.txt file that tells search engine crawlers which pages they may crawl and which they should ignore.
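To see how a crawler actually consumes this file, Python's standard urllib.robotparser module can fetch and interpret a site's robots.txt. This is only an illustration of the protocol, not part of the tool; the example.com address below is a placeholder.

```python
from urllib.robotparser import RobotFileParser

# Point the parser at a site's robots.txt (placeholder domain).
rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # downloads and parses the file over the network

# Ask whether a crawler identifying itself as "*" may fetch a page.
print(rp.can_fetch("*", "https://example.com/some-page"))
```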
How to Use This Calculator
1. Specify the User Agent (* for all).
2. Enter paths to Allow.
3. Enter paths to Disallow.
4. Click 'Calculate' (see the sketch after these steps for what the tool assembles).
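As a rough sketch of what happens behind the 'Calculate' button, the generator essentially joins your answers into directives in order. The build_robots_txt helper and its example inputs below are hypothetical, not the tool's actual code.

```python
def build_robots_txt(user_agent="*", allow=None, disallow=None, sitemap=None):
    """Assemble a robots.txt string from the form inputs (hypothetical helper)."""
    lines = [f"User-agent: {user_agent}"]
    lines += [f"Allow: {path}" for path in (allow or [])]
    lines += [f"Disallow: {path}" for path in (disallow or [])]
    if sitemap:
        lines.append(f"Sitemap: {sitemap}")
    return "\n".join(lines) + "\n"


# Example inputs: block /admin for all crawlers, keep /blog explicitly open.
print(build_robots_txt(allow=["/blog"], disallow=["/admin"]))
```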
Example Calculation
Input: User Agent * (all crawlers) and a Disallow path of /admin.
Generated robots.txt:
User-agent: *
Disallow: /admin
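You can confirm that the generated rules behave as expected with Python's standard urllib.robotparser; the example.com URLs below are placeholders.

```python
from urllib.robotparser import RobotFileParser

# The output from the example above.
robots_txt = """\
User-agent: *
Disallow: /admin
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# /admin is blocked for all crawlers; other paths stay reachable.
print(rp.can_fetch("*", "https://example.com/admin"))  # False
print(rp.can_fetch("*", "https://example.com/about"))  # True
```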
Pro Tips
- Always link your sitemap in the robots.txt file with a Sitemap: directive.
- Be careful not to disallow your entire site (Disallow: /); the check sketched below flags both issues before you publish.
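A quick way to check both tips is a small script like the sketch below. The robots.txt filename and the warning messages are assumptions for illustration, not part of the tool.

```python
from pathlib import Path

# Read the generated file (assumed to be saved as robots.txt).
rules = Path("robots.txt").read_text().splitlines()
directives = [line.split("#")[0].strip() for line in rules]

# Tip 2: a bare "Disallow: /" blocks the entire site for that user agent.
if any(d.lower() == "disallow: /" for d in directives):
    print("Warning: 'Disallow: /' blocks the whole site.")

# Tip 1: make sure a Sitemap line is present,
# e.g. Sitemap: https://example.com/sitemap.xml (placeholder URL).
if not any(d.lower().startswith("sitemap:") for d in directives):
    print("Reminder: add a Sitemap: line pointing at your sitemap.")
```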