Robots.txt Tester
Inspect how soyfira.com controls crawler access: blocked paths, sitemap references, and AI crawler rules.
Robots.txt Status
Present
Score: 84/100 · Strong
Robots.txt Content Preview
```
User-agent: *
Allow: /
Disallow: *?lightbox=

# Optimization for Google Ads Bot
User-agent: AdsBot-Google-Mobile
User-agent: AdsBot-Google
Disallow: /_partials*
Disallow: /pro-gallery-webapp/v1/galleries/*

# Block PetalBot
User-agent: PetalBot
Disallow: /

# Crawl delay for overly enthusiastic bots
User-agent: dotbot
Crawl-delay: 10

User-agent: AhrefsBot
Crawl-delay: 10

Sitemap: https://www.soyfira.com/sitemap.xml

# Auto generated, go to SEO Tools > Robots.txt Editor to change this
```
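One parsing detail that matters for the rule table below: under robots.txt grouping rules (RFC 9309), consecutive User-agent lines share the rule group that follows them, so both AdsBot agents inherit the same Disallow lines. A minimal illustration from the file above:

```
# Consecutive User-agent lines form a single group; the rules that
# follow apply to every agent named above them.
User-agent: AdsBot-Google-Mobile
User-agent: AdsBot-Google
Disallow: /_partials*
Disallow: /pro-gallery-webapp/v1/galleries/*
```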
User-agent Rules
| User-agent(s) | Allowed paths | Disallowed paths |
|---|---|---|
| * | / | *?lightbox= |
| adsbot-google-mobile | No explicit Allow rules. | /_partials*, /pro-gallery-webapp/v1/galleries/* |
| adsbot-google | No explicit Allow rules. | /_partials*, /pro-gallery-webapp/v1/galleries/* |
| petalbot | No explicit Allow rules. | / (entire site) |
| dotbot | No explicit Allow rules. | No explicit Disallow rules. |
| ahrefsbot | No explicit Allow rules. | No explicit Disallow rules. |
Blocked and Allowed Paths
| Category | Value |
|---|---|
| Blocked paths | *?lightbox=, /_partials*, /pro-gallery-webapp/v1/galleries/*, / (PetalBot only) |
| Allowed paths | / |
| Crawl-delay | 10.0 seconds (dotbot, AhrefsBot) |
Sitemaps Detected
- https://www.soyfira.com/sitemap.xml
AI Crawler Policy
No explicit blocks were detected for common AI crawlers (GPTBot, ChatGPT-User, ClaudeBot, PerplexityBot, Google-Extended).
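For owners who want that policy spelled out, here is a minimal sketch of explicit per-crawler directives, using the crawler names listed above. Whether each crawler should be allowed or blocked is a site-owner decision; the choices below are illustrative only:

```
# Illustrative AI-crawler policy; adjust Allow/Disallow per crawler.
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: PerplexityBot
Disallow: /

User-agent: Google-Extended
Allow: /
```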
Issues Found
- At least one user-agent (PetalBot) has Disallow: /, which blocks the entire site for that crawler.
- Wildcard (*) patterns should typically be anchored to an explicit path, e.g. /path/*; see the sketch below.
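To make the second issue concrete, here is the pattern currently in the file next to a path-anchored form; the /gallery/ prefix is hypothetical and shown only to illustrate anchoring:

```
# Current: the leading wildcard matches ?lightbox= on any URL site-wide.
Disallow: *?lightbox=

# Anchored alternative (hypothetical /gallery/ prefix, for illustration only):
Disallow: /gallery/*?lightbox=
```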
Recommendations
- Avoid blocking the entire site (Disallow: /); restrict only sensitive or low-value paths instead.
- Document your AI crawler policy explicitly in robots.txt so future bots know how to treat your content.
- Ensure important pages, CSS, and JavaScript assets are crawlable so search engines can fully render your site; a sketch follows below.
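A sketch of that last recommendation, assuming CSS and JavaScript files are served with their standard extensions; the $ end-anchor is defined in RFC 9309 and supported by major crawlers such as Googlebot:

```
# Keep render-critical assets crawlable for all crawlers.
User-agent: *
Allow: /*.css$
Allow: /*.js$
```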