This page cannot be retrieved

This page is not reachable for the Seobility bot: the bot is excluded by the robots.txt file. The Seobility bot does not crawl anything that excludes Seobility or *.

Problematic URL: http://www.landeskirche-anhalts.de
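The exclusion reported above can be reproduced with Python's standard-library robots.txt parser. A minimal sketch, assuming only a small excerpt of the rules below; the `MyBot` agent name is an illustrative assumption, not a real crawler:

```python
from urllib.robotparser import RobotFileParser

# Illustrative excerpt of the rules below: a default group plus the
# Seobility-specific group that blocks the whole site.
rules = """\
User-agent: *
Disallow: /assets/images/

User-agent: Seobility
Disallow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Seobility matches its own group and is disallowed everywhere, which is
# exactly why the checker reports the page as unreachable.
print(rp.can_fetch("Seobility", "http://www.landeskirche-anhalts.de/"))                 # False
# Other bots fall back to the "*" group: blocked under /assets/images/,
# allowed elsewhere (in this excerpt).
print(rp.can_fetch("MyBot", "http://www.landeskirche-anhalts.de/assets/images/a.jpg"))  # False
print(rp.can_fetch("MyBot", "http://www.landeskirche-anhalts.de/kontakt/"))             # True
```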

User-agent: *
Disallow: /assets/bausteine/
Disallow: /assets/cache/
Disallow: /assets/docs/
Disallow: /assets/drgalleries/
Disallow: /assets/export/
Disallow: /assets/files/erwachsenenbildung_2013-1.pdf
Disallow: /assets/flash/
Disallow: /assets/images/
Disallow: /assets/import/
Disallow: /assets/js/
Disallow: /assets/media/
Disallow: /assets/modules/
Disallow: /assets/plugins/
Disallow: /assets/site/
Disallow: /assets/snippets/
Disallow: /assets/templates/
Disallow: /client/grafik/
Disallow: /client/scripts/
Disallow: /frontend/
Disallow: /logs/
Disallow: /manager/
Disallow: /usage/
Disallow: /aktuell/block
Disallow: /bausteine/
Disallow: /gemeinden/ballenstedt/nachrichten
Disallow: /gemeinden/bernburg/nachrichten
Disallow: /gemeinden/dessau/nachrichten
Disallow: /gemeinden/koethen/nachrichten
Disallow: /gemeinden/zerbst/nachrichten
Disallow: /gemeinden/ballenstedt/termine
Disallow: /gemeinden/bernburg/termine
Disallow: /gemeinden/dessau/termine
Disallow: /gemeinden/koethen/termine
Disallow: /gemeinden/zerbst/termine
Disallow: /gemeinden/gemeindesuche
Disallow: /landeskirche/nachricht.php
Disallow: /intern/
Disallow: /kontakt/
Disallow: /name
Disallow: /personen/
Disallow: /piwik/

Sitemap: http://www.landeskirche-anhalts.de/sitemap.xml

## Version 1.2

##### Search engines

User-agent: Baiduspider
Disallow: /

User-agent: Exabot
Disallow: /

User-agent: Gigabot
Disallow: /

User-agent: moget
Disallow: /

User-agent: ichiro
Disallow: /

User-agent: HaosouSpider
Disallow: /

User-agent: NaverBot
Disallow: /

User-agent: SeznamBot
Disallow: /

User-agent: Yeti
Disallow: /

User-agent: sogou spider
Disallow: /

## ->: 2.1.17: 21 visits (User-agent changed on that day)
User-agent: YandexBot
Disallow: /

####### Spiders and various crawlers that presumably obey robots.txt #####

User-agent: 008
Disallow: /

User-agent: 360Spider
Disallow: /

User-agent: backlink-check.de
Disallow: /

User-agent: AhrefsBot
Disallow: /

User-agent: Bilbo
Disallow: /

User-agent: BLEXBot
Disallow: /

User-agent: BoogleBot
Disallow: /

User-agent: careerbot
Disallow: /

User-agent: CCBot
Disallow: /

## CheckMarkNetwork spider info: http://www.checkmarknetwork.com/spider.html
User-agent: CheckMarkNetwork
Disallow: /

User-agent: coccoc
Disallow: /

User-agent: Cliqzbot
Disallow: /

User-agent: Dataprovider
Disallow: /

User-agent: DittoSpyder
Disallow: /

User-agent: DOC
Disallow: /

User-agent: doczz_com_br
Disallow: /

User-agent: Domain Re-Animator Bot
Disallow: /

User-agent: DomainStatsBot
Disallow: /

User-agent: dotbot
Disallow: /

User-agent: Download Ninja
Disallow: /

User-agent: ExtractorPro
Disallow: /

User-agent: eZ Publish Link Validator
Disallow: /

User-agent: fast
Disallow: /

User-agent: Fasterfox
Disallow: /

User-agent: Fetch
Disallow: /

User-agent: fr-crawler
Disallow: /

User-agent: Gluten Free Crawler
Disallow: /

User-agent: GarlikCrawler
Disallow: /

User-agent: HTTrack
Disallow: /

User-agent: IsraBot
Disallow: /

User-agent: ICCrawler
Disallow: /

User-agent: JobboerseBot
Disallow: /

User-agent: jobs.de-Robot
Disallow: /

User-agent: k2spider
Disallow: /

User-agent: larbin
Disallow: /

User-agent: libwww
Disallow: /

User-agent: linkdex
Disallow: /

User-agent: LinkextractorPro
Disallow: /

User-agent: linko
Disallow: /

User-agent: LinkWalker
Disallow: /

User-agent: Lipperhey-Kaus-Australis
Disallow: /

User-agent: ltx71
Disallow: /

User-agent: magpie-crawler
Disallow: /

User-agent: meanpathbot
Disallow: /

User-agent: MegaIndex.ru
Disallow: /

User-agent: megaindex.com
Disallow: /

User-agent: mindUpBot
Disallow: /

User-agent: metajobbot
Disallow: /

User-agent: Microsoft.URL.Control
Disallow: /

User-agent: MJ12bot
Disallow: /

User-agent: MSIECrawler
Disallow: /

User-agent: MojeekBot
Disallow: /

User-agent: MovableType
Disallow: /

User-agent: netEstate NE Crawler
Disallow: /

User-agent: NPBot
Disallow: /

User-agent: oBot
Disallow: /

User-agent: Offline Explorer
Disallow: /

User-agent: Openbot
Disallow: /

User-agent: OpenHoseBot
Disallow: /

User-agent: Orthogaffe
Disallow: /

User-agent: plukkie
Disallow: /

User-agent: psbot
Disallow: /

User-agent: R6_CommentReader
Disallow: /

User-agent: rogerbot
Disallow: /

User-agent: SafeDNSBot
Disallow: /

User-agent: SafeSearch
Disallow: /

User-agent: ScoutJet
Disallow: /

User-agent: Screaming Frog SEO Spider
Disallow: /

User-agent: SearchmetricsBot
Disallow: /

User-agent: searchpreview
Disallow: /

User-agent: semantic-vision.com
Disallow: /

User-agent: SemrushBot
Disallow: /

User-agent: Seobility
Disallow: /

User-agent: SEODAT
Disallow: /

User-agent: SEOdiver
Disallow: /

User-agent: SEOENGBot
Disallow: /

User-agent: sg-Orbiter
Disallow: /

User-agent: SMTBot
Disallow: /

User-agent: SurveyBot
Disallow: /

User-agent: Shareaza
Disallow: /

User-agent: SiteSnagger
Disallow: /

User-agent: sistrix
Disallow: /

User-agent: spbot
Disallow: /

User-agent: Teleport
Disallow: /

User-agent: TeleportPro
Disallow: /

User-agent: ThumbSniper
Disallow: /

User-agent: trendictionbot
Disallow: /

User-agent: True_Robot
Disallow: /

User-agent: turnitinbot
Disallow: /

User-agent: UbiCrawler
Disallow: /

User-agent: UnisterBot
Disallow: /

User-agent: URL Control
Disallow: /

User-agent: URL_Spider_Pro
Disallow: /

User-agent: Vagabondo
Disallow: /

User-agent: vebidoobot
Disallow: /

User-agent: voltron
Disallow: /

User-agent: vscooter
Disallow: /

User-agent: Wappalyzer
Disallow: /

User-agent: WebStripper
Disallow: /

User-agent: WebCopier
Disallow: /

User-agent: WebReaper
Disallow: /

User-agent: WebZIP
Disallow: /

User-agent: wotbox
Disallow: /

User-agent: Xenu
Disallow: /

User-agent: xovi
Disallow: /

User-agent: Zao
Disallow: /

User-agent: Zealbot
Disallow: /

User-agent: ZoomBot
Disallow: /

User-agent: ZyBORG
Disallow: /

################# robots.txt is most likely ignored ##################

## 6.1. 244 visitors -> htaccess
User-agent: BacklinkCrawler
Disallow: /

## 6.1. 64 visits - htaccess (domainappender)
User-agent: DomainAppender
Disallow: /

## 7.1. 137 visits - htaccess (kraken), 8.1. 156 (kraken\/0.1)
User-agent: Kraken
Disallow: /

## 6.1. 17 visits htaccess (linkdexbot)
User-agent: linkdexbot
Disallow: /

## 7.1. 32 visits 10.1. 65 (mixrankbot)
User-agent: MixrankBot
Disallow: /

## 6.1. 32 visits - htaccess (seokicks-robot)
User-agent: SEOkicks-Robot
Disallow: /

## 5.1. 9 visitors
User-agent: semantic-visions.com
Disallow: /

## 6.1. 145 htaccess (seoscanners), 8.1. 157 visitors (seoscanners.net)
User-agent: seoscanners
Disallow: /

## 7.1. 141 visitors htaccess (sophora)
User-agent: Sophora
Disallow: /

## 7.1. 52 visitors, 8.1. 62
User-agent: um-IC
Disallow: /
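The comments in the last section note that bots which ignore robots.txt were additionally blocked via .htaccess. A hedged sketch of what such a server-side block might look like, assuming Apache 2.4 with mod_setenvif; the directive set is an assumption for illustration, not the site's actual configuration:

```apache
# Hypothetical .htaccess sketch: tag requests whose User-Agent matches one
# of the bots commented above, then deny them. Requires mod_setenvif and
# Apache 2.4+ (mod_authz_core).
BrowserMatchNoCase "BacklinkCrawler|DomainAppender|kraken|linkdexbot|MixrankBot|SEOkicks-Robot|seoscanners|Sophora|um-IC" bad_bot

<RequireAll>
    Require all granted
    Require not env bad_bot
</RequireAll>
```

Unlike robots.txt, this enforces the block at the server, so it also covers crawlers that never read the exclusion rules.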