Robots.txt Tester
Inspect how mrseamiami.com controls crawler access: blocked paths, sitemap references, and AI crawler rules.
Robots.txt Status

Status: Present
Score: 100/100 · Strong
Robots.txt Content Preview
```
User-agent: *
Allow: /
Sitemap: https://mrseamiami.com/sitemap.xml

# Sitemaps
Sitemap: https://www.mrseamiami.com/sitemap.xml
Sitemap: https://www.mrseamiami.com/image-sitemap.xml

# Additional SEO directives
User-agent: *
Allow: /

# Disallow admin and API endpoints
Disallow: /admin/
Disallow: /api/
Disallow: /seo/

# Crawl-delay for respectful crawling
Crawl-delay: 1
```
User-agent Rules
| User-agent(s) | Allowed paths | Disallowed paths |
|---|---|---|
| * | / | No explicit Disallow rules. |
| * | / | /admin/, /api/, /seo/ |
Blocked and Allowed Paths
| Item | Value |
|---|---|
| Blocked paths | /admin/, /api/, /seo/ |
| Allowed paths | / |
| Crawl-delay | 1.0 seconds |
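The rules above can be verified programmatically. A minimal sketch using Python's standard-library `urllib.robotparser` (note two caveats: this parser matches rules in file order rather than by longest path, so the Disallow lines below are kept in a single group without a blanket `Allow: /` before them, and `site_maps()` requires Python 3.8+):

```python
from urllib.robotparser import RobotFileParser

# The report's rules, merged into a single group for simplicity.
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
Disallow: /api/
Disallow: /seo/
Crawl-delay: 1
Sitemap: https://mrseamiami.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

print(rp.can_fetch("*", "https://mrseamiami.com/"))         # True  (no rule matches)
print(rp.can_fetch("*", "https://mrseamiami.com/admin/x"))  # False (Disallow: /admin/)
print(rp.crawl_delay("*"))                                  # 1
print(rp.site_maps())                                       # ['https://mrseamiami.com/sitemap.xml']
```

The same parser can load the live file with `rp.set_url(...)` followed by `rp.read()`; `parse()` is used here so the check runs without a network fetch.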
Sitemaps Detected

- https://mrseamiami.com/sitemap.xml
- https://www.mrseamiami.com/sitemap.xml
- https://www.mrseamiami.com/image-sitemap.xml
AI Crawler Policy
No explicit blocks were detected for common AI crawlers (GPTBot, ChatGPT-User, ClaudeBot, PerplexityBot, Google-Extended).
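If the site later decides to restrict AI training crawlers, the policy can be stated with per-bot groups. A hypothetical fragment (the user-agent tokens shown are the ones these vendors publish; blocking everything with `Disallow: /` is just one possible policy):

```
# Example: opt out of AI training crawlers
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: ClaudeBot
Disallow: /
```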
Recommendations
- Document your AI crawler policy explicitly in robots.txt so future bots know how to treat your content.
- Ensure important pages, CSS and JavaScript assets are crawlable so search engines can fully render your site.
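The second recommendation can be spot-checked with the same standard-library parser. The asset URLs below are hypothetical placeholders, not paths confirmed on the site; substitute URLs taken from the rendered pages:

```python
from urllib.robotparser import RobotFileParser

# Current Disallow rules from the report, as a single group.
rp = RobotFileParser()
rp.parse("""\
User-agent: *
Disallow: /admin/
Disallow: /api/
Disallow: /seo/
""".splitlines())

# Hypothetical asset URLs -- replace with real ones from the site.
for url in [
    "https://mrseamiami.com/assets/site.css",
    "https://mrseamiami.com/assets/app.js",
    "https://mrseamiami.com/api/data.json",
]:
    status = "crawlable" if rp.can_fetch("Googlebot", url) else "BLOCKED"
    print(f"{status:9s} {url}")
```

With the current rules, CSS/JS under a path like `/assets/` would be crawlable, while anything under `/api/` is blocked; if rendering ever depends on an API endpoint, that block would hide content from search engines.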