# robots.txt for https://www.nordhausen-wiki.de
# first created on 05.02.2014
# last updated on 17.01.2024

# Please note: There are a lot of pages on this site, and there are
# some misbehaved spiders out there that go _way_ too fast. If you're
# irresponsible, your access to the site may be blocked.

User-agent: *
Disallow: /wiki/Spezial:Suche
Disallow: /wiki/Special:Suche
Disallow: /wiki/Spezial:Zufällige_Seite
Disallow: /wiki/Spezial:Zuf%C3%A4llige_Seite
Disallow: /wiki/Special:Zufällige_Seite
Disallow: /wiki/Special:Zuf%C3%A4llige_Seite
Disallow: /wiki/lexikon/

Sitemap: https://nordhausen-wiki.de/sitemap.xml

# Some bots are known to be trouble, particularly those designed to copy
# entire sites. Please obey robots.txt. Thank you!!! :-)

User-agent: dotbot
Disallow: /

User-agent: barkrowler
Disallow: /

User-agent: SeekportBot
Disallow: /

User-agent: BLEXBot
Disallow: /

User-agent: SemrushBot
Disallow: /

User-agent: SemrushBot-SA
Disallow: /

User-agent: PetalBot
Disallow: /

User-agent: UbiCrawler
Disallow: /

User-agent: DOC
Disallow: /

User-agent: Zao
Disallow: /

User-agent: sitecheck.internetseer.com
Disallow: /

User-agent: Zealbot
Disallow: /

User-agent: MSIECrawler
Disallow: /

User-agent: SiteSnagger
Disallow: /

User-agent: WebStripper
Disallow: /

User-agent: WebCopier
Disallow: /

User-agent: Fetch
Disallow: /

User-agent: Offline Explorer
Disallow: /

User-agent: Teleport
Disallow: /

User-agent: TeleportPro
Disallow: /

User-agent: WebZIP
Disallow: /

User-agent: linko
Disallow: /

User-agent: Microsoft.URL.Control
Disallow: /

User-agent: Xenu
Disallow: /

User-agent: larbin
Disallow: /

User-agent: libwww
Disallow: /

User-agent: ZyBORG
Disallow: /

User-agent: Download Ninja
Disallow: /

User-agent: Flamingo_SearchEngine
Disallow: /

# ADDITIONS

# Observed spamming large volumes of requests and ignoring 429 rate-limit
# responses; claims to respect robots.txt:
# http://mj12bot.com/
User-agent: MJ12bot
Disallow: /

# Block Ahrefs
User-agent: AhrefsBot
Disallow: /

# Block SEOkicks
User-agent: SEOkicks-Robot
Disallow: /

# Block Sistrix
User-agent: sistrix
User-agent: SISTRIX Crawler
User-agent: SISTRIX
Disallow: /

# Bot of the Leipzig-based Unister Holding GmbH
User-agent: UnisterBot
Disallow: /

# Block Uptime robot
User-agent: UptimeRobot/2.0
Disallow: /

User-agent: 008
Disallow: /

# Block Ezooms Robot
User-agent: Ezooms Robot
Disallow: /

# Block Perl LWP
User-agent: Perl LWP
Disallow: /

# Block BlexBot
User-agent: BLEXBot
Disallow: /

# Block netEstate NE Crawler (+http://www.website-datenbank.de/)
User-agent: netEstate NE Crawler (+http://www.website-datenbank.de/)
Disallow: /

# Block WiseGuys Robot
User-agent: WiseGuys Robot
Disallow: /

# Block Turnitin Robot
User-agent: Turnitin Robot
Disallow: /

User-agent: TurnitinBot
Disallow: /

User-agent: Turnitin Bot
Disallow: /

User-agent: TurnitinBot/3.0 (http://www.turnitin.com/robot/crawlerinfo.html)
Disallow: /

User-agent: TurnitinBot/3.0
Disallow: /

# Block Heritrix
User-agent: Heritrix
Disallow: /

# Block pricepi
User-agent: pimonster
Disallow: /
User-agent: Pimonster
Disallow: /

# Block Searchmetrics Bot
#User-agent: SearchmetricsBot
#Disallow: /

# Block Eniro
User-agent: ECCP/1.0 ([email protected])
Disallow: /

# Block Baidu
User-agent: Baiduspider
User-agent: Baiduspider-video
User-agent: Baiduspider-image
User-agent: Mozilla/5.0 (compatible; Baiduspider/2.0; +http://www.baidu.com/search/spider.html)
User-agent: Mozilla/5.0 (compatible; Baiduspider/3.0; +http://www.baidu.com/search/spider.html)
User-agent: Mozilla/5.0 (compatible; Baiduspider/4.0; +http://www.baidu.com/search/spider.html)
User-agent: Mozilla/5.0 (compatible; Baiduspider/5.0; +http://www.baidu.com/search/spider.html)
User-agent: Baiduspider/2.0
User-agent: Baiduspider/3.0
User-agent: Baiduspider/4.0
User-agent: Baiduspider/5.0
Disallow: /

# Block SoGou
User-agent: Sogou Spider
Disallow: /

# Block Youdao
User-agent: YoudaoBot
Disallow: /

# Block Nikon JP Crawler
User-agent: gsa-crawler (Enterprise; T4-KNHH62CDKC2W3; [email protected])
Disallow: /

# Block MegaIndex.ru
User-agent: MegaIndex.ru/2.0
Disallow: /

User-agent: MegaIndex.ru
Disallow: /

User-agent: megaIndex.ru
Disallow: /

# Misbehaving: requests much too fast:
User-agent: fast
Disallow: /

# Sorry, wget in its recursive mode is a frequent problem.
# Please read the man page and use it properly; there is a
# --wait option you can use to set the delay between hits,
# for instance.
#
User-agent: wget
Disallow: /

#
# The 'grub' distributed client has been *very* poorly behaved.
#
User-agent: grub-client
Disallow: /

#
# Doesn't follow robots.txt anyway, but...
#
User-agent: k2spider
Disallow: /

#
# Hits many times per second, not acceptable
# http://www.nameprotect.com/botinfo.html
User-agent: NPBot
Disallow: /

# A capture bot, downloads gazillions of pages with no public benefit
# http://www.webreaper.net/
User-agent: WebReaper
Disallow: /

#
# Friendly, low-speed bots are welcome viewing article pages, but not
# dynamically-generated pages please.
#
# Inktomi's "Slurp" can read a minimum delay between hits; if your
# bot supports such a thing using the 'Crawl-delay' or another
# instruction, please let us know.
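#
# For illustration only (commented out, not an active rule of this file):
# a crawler that honours the non-standard Crawl-delay extension could be
# asked to wait a few seconds between requests with a group like this,
# where the user-agent name and the delay value are placeholders:
#
# User-agent: Slurp
# Crawl-delay: 5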