# Specifications for robots:
# A Disallow field with no value indicates that all URLs
# can be retrieved

User-agent: *
Disallow: /cart
Disallow: /club
Disallow: /order
Disallow: /shop.php
Disallow: /cart.php
Disallow: /club.php
Disallow: /order.php
Crawl-delay: 10

# www.80legs.com made over 400 requests per minute (DDOS)
User-agent: 008
Disallow: /

User-agent: SemrushBot
Disallow: /

User-Agent: The Knowledge AI
Disallow: /

# Blexbot does not respect the Crawl-delay
User-agent: BLEXBot
Disallow: /

# does not respect Crawl-delay
User-agent: SeekportBot
Disallow: /

# multiple requests to the same pages from different IPs
User-agent: Bytespider
Disallow: /
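As a minimal sketch of how these rules behave, the snippet below parses an abbreviated excerpt of the file with Python's standard urllib.robotparser. The crawler name "MyCrawler" and the domain example.com are assumptions for illustration only; the expected outputs are shown as comments.

```python
# Sketch: checking the robots.txt rules above with Python's standard library.
# "MyCrawler" and example.com are assumed names; ROBOTS_TXT is an excerpt.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Disallow: /cart
Disallow: /club
Disallow: /order
Crawl-delay: 10

User-agent: Bytespider
Disallow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# A generic crawler falls under the wildcard group: /cart is blocked,
# other paths are allowed, and a 10-second crawl delay is requested.
print(parser.can_fetch("MyCrawler", "https://example.com/cart"))    # False
print(parser.can_fetch("MyCrawler", "https://example.com/about"))   # True
print(parser.crawl_delay("MyCrawler"))                              # 10

# Bytespider matches its own group and is barred from the entire site.
print(parser.can_fetch("Bytespider", "https://example.com/"))       # False
```

Crawl-delay is a non-standard extension: urllib.robotparser exposes it via crawl_delay(), but crawlers are not obliged to honour it, which is why several of the bots above are blocked outright instead.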
The following keywords were found. You can check the keyword optimization of this page for each keyword.