Robots.txt Tester
Inspect how legard.com.ar controls crawler access: blocked paths, sitemap references, and AI crawler rules.
Robots.txt Status
Present
Score: 84/100 · Strong
Robots.txt Content Preview
```
User-agent: *
Disallow:
Crawl-delay: 60                 # 60 seconds between each request
Visit-time: 0300-1200           # 00:00 AM to 09:00 AM (America/Argentina/Buenos_Aires)
Request-rate: 6/60m             # allows indexing 6 documents every 60 minutes
Request-rate: 3/60m 1200-0300   # allows indexing 3 documents every 60 minutes between 09:00 AM and 00:00 AM (America/Argentina/Buenos_Aires)
Disallow: /cgi-bin/
Disallow: /admin/
Disallow: /*.sql$
# Block dynamic URLs
Disallow: /*?
# Block searches
Disallow: /?s=
Disallow: /search
# Block trackbacks
Disallow: /trackback
Disallow: /*trackback
Disallow: /*trackback*
Disallow: /*/trackback

# Allow Google Webmaster Tools
User-agent: Googlebot
Allow: /*.js$
Allow: /*.css$

# WordPress
Disallow: /xmlrpc.php
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/cache/
Disallow: /wp-content/themes/
Disallow: /*/xmlrpc.php
Disallow: /*/wp-admin/
Disallow: /*/wp-includes/
Disallow: /*/wp-content/plugins/
Disallow: /*/wp-content/cache/
Disallow: /*/wp-content/themes/

# Joomla
Disallow: /administrator/
Disallow: /bin/
Disallow: /cache/
Disallow: /cli/
Disallow: /components/
Disallow: /images/
Disallow: /includes/
Disallow: /installation/
Disallow: /language/
Disallow: /layouts/
Disallow: /libraries/
Disallow: /logs/
Disallow: /media/
Disallow: /modules/
Disallow: /plugins/
Disallow: /templates/
Disallow: /tmp/
Disallow: /*/administrator/
Disallow: /*/bin/
Disallow: /*/cache/
Disallow: /*/cli/
Disallow: /*/components/
Disallow: /*/images/
Disallow: /*/includes/
Disallow: /*/installation/
Disallow: /*/language/
Disallow: /*/layouts/
Disallow: /*/libraries/
Disallow: /*/logs/
Disallow: /*/media/
Disallow: /*/modules/
Disallow: /*/plugins/
Disallow: /*/templates/
Disallow: /*/tmp/
```
User-agent Rules
| User-agent(s) | Allowed paths | Disallowed paths |
|---|---|---|
| * | No explicit Allow rules. | /cgi-bin/, /admin/, /*.sql$, /*?, /?s=, /search, /trackback, /*trackback, /*trackback*, /*/trackback |
| Googlebot | /*.js$, /*.css$ | WordPress paths (/xmlrpc.php, /wp-admin/, /wp-includes/, /wp-content/plugins/, /wp-content/cache/, /wp-content/themes/) and Joomla paths (/administrator/, /bin/, /cache/, /cli/, /components/, /images/, /includes/, /installation/, /language/, /layouts/, /libraries/, /logs/, /media/, /modules/, /plugins/, /templates/, /tmp/), each also with a /*/ prefixed variant |
Blocked and Allowed Paths
| Path rules | Details |
|---|---|
| Blocked paths | /cgi-bin/, /admin/, /*.sql$, query-string, search, and trackback URLs (all agents); WordPress and Joomla core directories (Googlebot) |
| Allowed paths | /*.js$ and /*.css$ (Googlebot only) |
| Crawl-delay | 60 seconds between requests (User-agent: *) |
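Blocked-path checks like the table above can be reproduced with Python's standard `urllib.robotparser`. A minimal sketch against a shortened, in-memory excerpt of the rules (wildcard patterns such as `/*?` are left out because `robotparser` only matches path prefixes):

```python
from urllib import robotparser

# Shortened excerpt of the rules shown in the preview above.
rules = """\
User-agent: *
Crawl-delay: 60
Disallow: /cgi-bin/
Disallow: /admin/
Disallow: /search
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# The homepage matches no Disallow rule; /admin/ is a blocked prefix.
print(rp.can_fetch("*", "https://legard.com.ar/"))             # True
print(rp.can_fetch("*", "https://legard.com.ar/admin/login"))  # False
print(rp.crawl_delay("*"))                                     # 60
```

Fetching the live file instead is a one-liner swap: `rp.set_url("https://legard.com.ar/robots.txt"); rp.read()`.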
Sitemaps Detected
No Sitemap directives found in robots.txt.
AI Crawler Policy
No explicit blocks were detected for common AI crawlers (GPTBot, ChatGPT-User, ClaudeBot, PerplexityBot, Google-Extended).
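If the site owner later decides to opt out of AI training crawlers, explicit per-agent groups can be appended to robots.txt. A sketch using the crawler names listed above (each group fully disallows that agent; compliance is voluntary on the crawler's side):

```
User-agent: GPTBot
User-agent: ChatGPT-User
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: PerplexityBot
Disallow: /

User-agent: Google-Extended
Disallow: /
```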
Issues Found
- robots.txt does not reference any sitemap URLs.
- The Disallow rules for /wp-includes/ and /wp-content/themes/ can block CSS and JavaScript assets stored there; under Google's most-specific-match precedence, those longer Disallow patterns beat the shorter Allow: /*.js$ and Allow: /*.css$ rules, which may prevent search engines from rendering pages correctly.
Recommendations
- Add a Sitemap directive in robots.txt pointing to your primary XML sitemap.
- Document your AI crawler policy explicitly in robots.txt so future bots know how to treat your content.
- Ensure important pages, CSS and JavaScript assets are crawlable so search engines can fully render your site.
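The sitemap and renderability recommendations translate to a few directives. A sketch, assuming the sitemap lives at the conventional /sitemap.xml path (verify the real URL before adding it); the long Allow patterns win over the broader theme/includes Disallow rules under most-specific-match precedence, keeping rendering assets crawlable:

```
Sitemap: https://legard.com.ar/sitemap.xml

User-agent: Googlebot
Allow: /wp-content/themes/*.css$
Allow: /wp-content/themes/*.js$
Allow: /wp-includes/*.js$
```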