robots.txt Generator

Build a custom robots.txt file visually. Control which bots can crawl which pages, with a live preview.

The generator covers four sections:

- Search Engine Bots (User-agent)
- Crawl Rules (Allow / Disallow)
- Sitemap URLs (Sitemap:)
- Extra Options:
  - Crawl Delay: seconds between requests for all bots
  - Block AI Training Bots: GPTBot, CCBot, Claude-Web, PerplexityBot
  - Block Bad SEO Scrapers: SemrushBot, AhrefsBot, MJ12bot, DotBot

What is robots.txt?

robots.txt is a plain text file placed at the root of your website (e.g. https://example.com/robots.txt) that tells search engine bots which pages they may and may not crawl. It follows the Robots Exclusion Protocol (REP) standard.
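A minimal robots.txt combining the common directives might look like this (the domain and paths are placeholders):

```
User-agent: *
Disallow: /admin/
Allow: /admin/public/

Sitemap: https://example.com/sitemap.xml
```

Blank lines separate groups; each group starts with one or more User-agent lines followed by its rules.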

Key robots.txt directives

- User-agent: selects which bot the rules below apply to (* matches all bots)
- Disallow: a path prefix the bot should not crawl
- Allow: an exception to a broader Disallow rule
- Sitemap: the absolute URL of your XML sitemap
- Crawl-delay: seconds to wait between requests (not supported by Googlebot)

Important notes

robots.txt is a request, not a restriction. Well-behaved bots (Googlebot, Bingbot) respect it. Malicious scrapers may ignore it. To truly block content from being indexed, use the noindex meta tag or X-Robots-Tag header instead.
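Well-behaved crawlers check the file before fetching. Python's standard library ships a REP parser, urllib.robotparser, which you can use to verify how your rules will be interpreted (the rule set and the "MyBot" agent below are made-up examples; note that Python applies rules in file order, so the Allow line is listed before the broader Disallow):

```python
from urllib.robotparser import RobotFileParser

# Parse an example rule set. Normally you would call
# set_url("https://example.com/robots.txt") and read() instead.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Allow: /private/press/",   # exception, listed first
    "Disallow: /private/",      # broader block
])

print(rp.can_fetch("MyBot", "/private/data"))    # False
print(rp.can_fetch("MyBot", "/private/press/"))  # True
print(rp.can_fetch("MyBot", "/blog/post"))       # True (no rule matches)
```

This is also a quick way to sanity-check a generated file before deploying it.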

Frequently Asked Questions

Does blocking a URL in robots.txt remove it from search results?
No. Disallowing a URL in robots.txt prevents crawling but not necessarily indexing. If other pages link to the blocked URL, Google may still index it (showing a URL without a description). To prevent indexing, use the noindex meta tag or X-Robots-Tag: noindex header.
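For reference, the two noindex mechanisms look like this; the meta tag goes in the page's <head>, while the header is set in the server response:

```
<meta name="robots" content="noindex">

X-Robots-Tag: noindex
```

Note that for either to take effect, the page must remain crawlable: a bot blocked by robots.txt never sees the noindex signal.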
Where do I put the robots.txt file?
The file must be placed at the root of your domain: https://yourdomain.com/robots.txt. On cPanel/shared hosting, upload it to your public_html folder. The filename must be lowercase: robots.txt.
Should I block /wp-admin/ in robots.txt?
Yes — add Disallow: /wp-admin/ to prevent bots from crawling your WordPress admin area. Keep Allow: /wp-admin/admin-ajax.php to ensure AJAX functionality works correctly. If no physical file exists, WordPress serves a default virtual robots.txt.
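The WordPress rules described above, as they would appear in robots.txt:

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```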
Should I block GPTBot and AI crawlers?
This is your choice. GPTBot (OpenAI), ClaudeBot (Anthropic), CCBot (Common Crawl) and others crawl web content for AI training. Blocking them prevents your content from being used for model training, but has no effect on SEO rankings.
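Blocking AI training bots takes one User-agent group per crawler, each with a site-wide Disallow:

```
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: CCBot
Disallow: /
```

Remember that this relies on the crawler honoring robots.txt; it is a request, not an enforcement mechanism.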