This site cannot be crawled by the Seobility bot: the robots.txt file restricts access for our crawler. No URL can be crawled whose robots.txt rules deny access to the user agent Seobility or to the wildcard *.
Problem URL: http://wirmachenbunt.de
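The denial can be reproduced locally. Below is a minimal Python sketch (standard library only, assuming the file quoted below is live at http://wirmachenbunt.de/robots.txt) that fetches the robots.txt and asks whether a few user agents may crawl the start page. Note that urllib.robotparser matches Disallow paths as plain prefixes and does not implement Google-style * wildcards; that does not matter here, because the decisive rules are the bare "Disallow: /" entries.

    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser()
    rp.set_url("http://wirmachenbunt.de/robots.txt")
    rp.read()  # fetch and parse the live robots.txt

    for agent in ("Seobility", "Googlebot", "GPTBot"):
        allowed = rp.can_fetch(agent, "http://wirmachenbunt.de/")
        print(f"{agent}: {'allowed' if allowed else 'blocked'}")

With the file shown below, Seobility matches no named group, falls through to the catch-all "User-agent: *" block, and is blocked; Googlebot is allowed (Block 2) and GPTBot is blocked (Block 1).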
# Dynamically generated robots.txt for improved security and control of bot traffic
# Recommended way to serve a robots.txt with custom content:
# 1. Create an empty robots.txt in the domain's DocumentRoot
# 2. Insert the contents of this dynamic robots.txt together with your individual adjustments

# Block 1: crawling is prohibited for the bots listed in Block 1
User-agent: AI2Bot
User-agent: Ai2Bot-Dolma
User-agent: aiHitBot
User-agent: Amazonbot
User-agent: Applebot-Extended
User-agent: Brightbot 1.0
User-agent: Bytespider
User-agent: ClaudeBot
User-agent: cohere-ai
User-agent: cohere-training-data-crawler
User-agent: Cotoyogi
User-agent: Crawlspace
User-agent: Diffbot
User-agent: FacebookBot
User-agent: Factset_spyderbot
User-agent: FirecrawlAgent
User-agent: FriendlyCrawler
User-agent: Google-Extended
User-agent: GoogleOther
User-agent: GoogleOther-Image
User-agent: GoogleOther-Video
User-agent: GPTBot
User-agent: iaskspider/2.0
User-agent: ICC-Crawler
User-agent: ImagesiftBot
User-agent: img2dataset
User-agent: imgproxy
User-agent: ISSCyberRiskCrawler
User-agent: Kangaroo Bot
User-agent: Meta-ExternalAgent
User-agent: Meta-ExternalFetcher
User-agent: NovaAct
User-agent: omgili
User-agent: omgilibot
User-agent: Operator
User-agent: PanguBot
User-agent: Perplexity-User
User-agent: PerplexityBot
User-agent: PetalBot
User-agent: Scrapy
User-agent: SemrushBot-OCOB
User-agent: SemrushBot-SWA
User-agent: Sidetrade indexer bot
User-agent: TikTokSpider
User-agent: Timpibot
User-agent: VelenPublicWebCrawler
User-agent: Webzio-Extended
User-agent: YouBot
Disallow: /

# Block 2: these bots may crawl under the following conditions
User-agent: AhrefsBot
User-agent: Applebot
User-agent: Bingbot
User-agent: BraveBot
User-agent: CCBot
User-agent: ChatGPT-User
User-agent: Claude-SearchBot
User-agent: Claude-User
User-agent: DuckAssistBot
User-agent: DuckDuckBot
User-agent: Ecosia
User-agent: Googlebot
User-agent: ia_archiver
User-agent: KagiBot
User-agent: MistralAI-User
User-agent: MJ12bot
User-agent: OAI-SearchBot
User-agent: Qwantify
User-agent: RyteBot
User-agent: SemrushBot
User-agent: SISTRIX
User-agent: StartpageBot
Crawl-delay: 20
Disallow: /*.0
Disallow: /*.1
Disallow: /*.2
Disallow: /*.3
Disallow: /*.4
Disallow: /*.5
Disallow: /*.6
Disallow: /*.7
Disallow: /*.7z
Disallow: /*.8
Disallow: /*.9
Disallow: /*.app
Disallow: /*.application
Disallow: /*.backup
Disallow: /*.bak
Disallow: /*.bin
Disallow: /*.bz2
Disallow: /*.cfg
Disallow: /*.cgi
Disallow: /*.conf
Disallow: /*.config
Disallow: /*.crt
Disallow: /*.csr
Disallow: /*.css
Disallow: /*.csv
Disallow: /*.dat
Disallow: /*.db
Disallow: /*.dev
Disallow: /*.disabled
Disallow: /*.dist
Disallow: /*.doc
Disallow: /*.docx
Disallow: /*.env
Disallow: /*.example
Disallow: /*.exe
Disallow: /*.feed
Disallow: /*.gz
Disallow: /*.ics
Disallow: /*.ini
Disallow: /*.js
Disallow: /*.json
Disallow: /*.kdbx
Disallow: /*.key
Disallow: /*.local
Disallow: /*.lock
Disallow: /*.log
Disallow: /*.md
Disallow: /*.mjs
Disallow: /*.mp4
Disallow: /*.new
Disallow: /*.numbers
Disallow: /*.odp
Disallow: /*.ods
Disallow: /*.odt
Disallow: /*.old
Disallow: /*.orig
Disallow: /*.original
Disallow: /*.pages
Disallow: /*.pem
Disallow: /*.php7
Disallow: /*.pl
Disallow: /*.ppt
Disallow: /*.pptx
Disallow: /*.prod
Disallow: /*.production
Disallow: /*.properties
Disallow: /*.psd
Disallow: /*.py
Disallow: /*.rar
Disallow: /*.rb
Disallow: /*.rtf
Disallow: /*.save
Disallow: /*.sh
Disallow: /*.sql
Disallow: /*.sqlite
Disallow: /*.sqlite3
Disallow: /*.staging
Disallow: /*.temp
Disallow: /*.testing
Disallow: /*.tgz
Disallow: /*.tmp
Disallow: /*.tsv
Disallow: /*.txt
Disallow: /*.vcf
Disallow: /*.woff
Disallow: /*.woff2
Disallow: /*.xls
Disallow: /*.xlsx
Disallow: /*.xz
Disallow: /*.yaml
Disallow: /*.yml
Disallow: /*.zip
Disallow: /.env
Disallow: /.env.local
Disallow: /.env.production
Disallow: /.bzr/
Disallow: /.git/
Disallow: /.hg/
Disallow: /.pki/
Disallow: /.ssh/
Disallow: /.svn/
Disallow: /3rdparty/
Disallow: /admin/
Disallow: /administrator/
Disallow: /assets/
Disallow: /backups/
Disallow: /bin/
Disallow: /cache/
Disallow: /cfg/
Disallow: /cgi-bin/
Disallow: /classes/
Disallow: /conf/
Disallow: /config/
Disallow: /core/
Disallow: /dist/
Disallow: /docs/
Disallow: /export/
Disallow: /extensions/
Disallow: /fonts/
Disallow: /git/
Disallow: /includes/
Disallow: /install/
Disallow: /installer/
Disallow: /js/
Disallow: /layouts/
Disallow: /lib/
Disallow: /libraries/
Disallow: /log/
Disallow: /logs/
Disallow: /maintenance/
Disallow: /modules/
Disallow: /node_modules/
Disallow: /plugins/
Disallow: /scripts/
Disallow: /settings/
Disallow: /setup/
Disallow: /skins/
Disallow: /src/
Disallow: /templates/
Disallow: /templates_c/
Disallow: /themes/
Disallow: /tmp/
Disallow: /update/
Disallow: /updater/
Disallow: /var/
Disallow: /vendor/
Disallow: /wp-admin/
Disallow: /wp-includes/

# Block 3: crawling is prohibited for bots not named in Blocks 1 and 2
User-agent: *
Disallow: /

# Insert the URL of your sitemap.xml here and remove the hash at the start of the line
# Sitemap: https://beispiel.de/sitemap.xml
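To let the Seobility bot crawl again, the site owner would have to name it in a group that takes precedence over the catch-all Block 3, for example as a group of its own. A minimal sketch in the file's own format; the Crawl-delay value merely mirrors Block 2 and is an assumption, not part of the original file:

    # Allow the Seobility crawler (an empty Disallow permits everything for this group)
    User-agent: Seobility
    Crawl-delay: 20
    Disallow:

Alternatively, "User-agent: Seobility" could simply be appended to the Block 2 list, which would apply that block's Crawl-delay and Disallow rules to it as well.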