User-agent: *
Disallow: /cache/
Disallow: /index.php/*/search/authors
Disallow: /index.php/*/search/titles
Disallow: /index.php/*/search/*
Disallow: /index.php/index/search/*

##############################

User-agent: bingbot
Crawl-delay: 5

User-agent: AhrefsBot
Crawl-delay: 10

User-agent: barkrowler
Crawl-delay: 10

User-agent: BLEXBot
Crawl-delay: 10

User-agent: ClaudeBot
Crawl-delay: 10

User-agent: MJ12Bot
Crawl-delay: 10

User-agent: Amazonbot
Disallow: /

User-agent: meta-externalagent
Disallow: /

User-agent: UT-Dorkbot
Disallow: /

User-agent: PetalBot
Disallow: /

User-agent: Bytespider
Disallow: /

### ChatGPT
User-agent: OAI-SearchBot
Crawl-delay: 60

User-agent: ChatGPT-User
Crawl-delay: 30

User-agent: GPTBot
Crawl-delay: 10

### Semrush
User-agent: SemrushBot-SA
Disallow: /

# To block SemrushBot from crawling your site for different SEO and technical issues:
User-agent: SiteAuditBot
Disallow: /

# To block SemrushBot from crawling your site for the Backlink Audit tool:
User-agent: SemrushBot-BA
Disallow: /

# To block SemrushBot from crawling your site for the On Page SEO Checker tool and similar tools:
User-agent: SemrushBot-SI
Disallow: /

# To block SemrushBot from checking URLs on your site for the SWA tool:
User-agent: SemrushBot-SWA
Disallow: /

# To block SplitSignalBot from crawling your site for the SplitSignal tool:
User-agent: SplitSignalBot
Disallow: /

# To block SemrushBot-OCOB from crawling your site for the ContentShake AI tool:
User-agent: SemrushBot-OCOB
Disallow: /

# To block SemrushBot-FT from crawling your site for the Plagiarism Checker and similar tools:
User-agent: SemrushBot-FT
Disallow: /

### Perplexity
User-agent: PerplexityBot
Disallow: /

User-agent: Perplexity-User
Disallow: /

### Turnitin
User-agent: TurnitinBot
Disallow: /

### Yandex
User-agent: YandexCalendar
Disallow: /

User-agent: YandexMobileBot
Disallow: /

User-agent: YandexImages
Disallow: /

### Owler
#User-agent: Owler
#Disallow: /

User-agent: GenAI
Disallow: /

##############################
# Section for misbehaving bots
# The following directives to block specific robots were borrowed from Wikipedia's robots.txt
##############################

# advertising-related bots:
User-agent: Mediapartners-Google*
Disallow: /

# Crawlers that are kind enough to obey, but which we'd rather not have
# unless they're feeding search engines.
User-agent: UbiCrawler
Disallow: /

User-agent: DOC
Disallow: /

User-agent: Zao
Disallow: /

# Some bots are known to be trouble, particularly those designed to copy
# entire sites. Please obey robots.txt.
User-agent: sitecheck.internetseer.com
Disallow: /

User-agent: Zealbot
Disallow: /

User-agent: MSIECrawler
Disallow: /

User-agent: SiteSnagger
Disallow: /

User-agent: WebStripper
Disallow: /

User-agent: WebCopier
Disallow: /

User-agent: Fetch
Disallow: /

User-agent: Offline Explorer
Disallow: /

User-agent: Teleport
Disallow: /

User-agent: TeleportPro
Disallow: /

User-agent: WebZIP
Disallow: /

User-agent: linko
Disallow: /

User-agent: HTTrack
Disallow: /

User-agent: Microsoft.URL.Control
Disallow: /

User-agent: Xenu
Disallow: /

User-agent: larbin
Disallow: /

User-agent: libwww
Disallow: /

User-agent: ZyBORG
Disallow: /

User-agent: Download Ninja
Disallow: /

# Misbehaving: requests much too fast:
User-agent: fast
Disallow: /

#
# If your site is going down because of someone using recursive wget,
# you can activate the following rule.
#
# If your own faculty is bringing down your DSpace with recursive wget,
# you can advise them to use the --wait option to set the delay between hits.
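#
# For example (a sketch only; the URL is a placeholder), a recursive mirror
# that pauses 2 seconds between requests could be run as:
#   wget --wait=2 --recursive https://example.org/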
#
#User-agent: wget
#Disallow: /

#
# The 'grub' distributed client has been *very* poorly behaved.
#
User-agent: grub-client
Disallow: /

#
# Doesn't follow robots.txt anyway, but...
#
User-agent: k2spider
Disallow: /

#
# Hits many times per second, not acceptable
# http://www.nameprotect.com/botinfo.html
User-agent: NPBot
Disallow: /

# A capture bot, downloads gazillions of pages with no public benefit
# http://www.webreaper.net/
User-agent: WebReaper
Disallow: /