Private • Local • Fast
Create search engine-friendly instructions for web crawlers and control which parts of your site they are allowed to crawl.
User-agent: *
Allow: /
Disallow: /admin
Disallow: /private

Sitemap: https://yoursite.com/sitemap.xml
Misconfiguring `robots.txt` can accidentally de-index your entire website from Google. Always test your rules in Google Search Console before deploying.
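For example, this minimal file (a sketch, not something the generator produces by default) tells every crawler to stay out of the entire site, which is how accidental de-indexing usually happens:

User-agent: *
Disallow: /

A single stray Disallow: / under the wildcard group is enough, so make sure broad Disallow rules point at specific folders such as /admin rather than the site root.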
4.9/5 User Rating • 100k+ Files Processed • 256-Bit Encryption • 100% Free Forever
1. Add rules for specific user-agents (or * for all).
2. Set Disallow paths to block folders (e.g., /admin).
3. Add your Sitemap URL.
4. Download the robots.txt file and place it in your site's root directory (a sample of the result is shown below).
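Following these steps, a generated file might look something like the sketch below; the Bingbot group, the /search path, and the example.com domain are placeholders rather than output the tool is guaranteed to produce:

User-agent: *
Disallow: /admin

User-agent: Bingbot
Disallow: /search

Sitemap: https://www.example.com/sitemap.xml

Once downloaded, this file belongs at https://www.example.com/robots.txt so that crawlers can find it.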
Everything you need to know about the Robots.txt Generator
The User-agent line identifies which web crawler a group of rules applies to: '*' means all bots, while 'Googlebot' means only Google's crawler.
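As a rough illustration (the paths are placeholders): with the rules below, Googlebot follows only the group that names it and ignores the '*' group, so it is blocked from /drafts but may still crawl /private, while every other bot is blocked from /private only.

User-agent: *
Disallow: /private

User-agent: Googlebot
Disallow: /drafts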
The file must be named 'robots.txt' and placed at the top-level root of your domain (e.g., example.com/robots.txt); crawlers do not look for it anywhere else.
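For example, assuming your site is served from example.com:

https://example.com/robots.txt (valid location, read by crawlers)
https://example.com/blog/robots.txt (ignored; robots.txt is not honored in subdirectories)

Each subdomain, such as blog.example.com, needs its own robots.txt at its own root.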