#
# robots.txt
#
# For more information about the robots.txt standard, see:
# http://www.robotstxt.org/wc/robots.html
#
# For syntax checking, see:
# http://www.sxw.org.uk/computing/robots/check.html

User-agent: GPTBot
Disallow: /

# Common Crawlers
User-agent: CCBot
Disallow: /

User-agent: LCC
Disallow: /

User-agent: DataForSeoBot
Disallow: /

User-agent: VelenPublicWebCrawler
Disallow: /

# Exclude the HTTrack download bot used for offline browsing
User-agent: HTTrack 3.0
Disallow: /

User-agent: BLEXBot
Disallow: /

User-agent: Zoominfobot
Disallow: /

User-agent: Exabot
Disallow: /

User-agent: Riddler
Disallow: /

User-agent: ltx71
Disallow: /

User-agent: SEOkicks-Robot
Disallow: /

User-agent: SEOkicks
Disallow: /

User-agent: Cliqzbot
Disallow: /

User-agent: seoscanners.net
Disallow: /

User-agent: SearchmetricsBot
Disallow: /

User-agent: AhrefsBot
Disallow: /

User-agent: sogou spider
Disallow: /

User-agent: Sogou web spider
Disallow: /

User-agent: Sogou
Disallow: /

User-agent: SemrushBot-SA
Disallow: /

User-agent: BUbiNG
Disallow: /

User-agent: Barkrowler
Disallow: /

User-agent: Screaming Frog SEO Spider
Disallow: /

User-agent: SeznamBot
Disallow: /

User-agent: grapeshot
Disallow: /

User-agent: MJ12bot
Disallow: /

User-agent: Seekport Crawler
Disallow: /

User-agent: Seekbot
Disallow: /

User-agent: SemrushBot-BA
Disallow: /

User-agent: SemrushBot-SI
Disallow: /

User-agent: SemrushBot-SWA
Disallow: /

User-agent: SemrushBot-CT
Disallow: /

User-agent: SemrushBot-BM
Disallow: /

User-agent: SemrushBot-SEOAB
Disallow: /