#
# robots.txt
# Created: Mon, 10 Oct 2009 13:02:03 GMT
#
# Please note: There are a lot of pages on this site, and there are
# some misbehaved spiders out there. If you're
# irresponsible, your access to the site may be blocked.
#

User-agent: psbot
Disallow: /

User-agent: WebReaper
Disallow: /

User-agent: ScoutJet
Disallow: /

User-agent: wget
Disallow: /

# http://eurobot.ayell.eu/
User-agent: Eurobot
Disallow: /

# http://www.cuill.com/twiceler/robot.html
User-agent: twiceler
Disallow: /

User-agent: Gaisbot
Disallow: /

User-agent: dotbot
Disallow: /

User-agent: WWW-Mechanize
Disallow: /

User-agent: MJ12bot
Disallow: /

User-agent: gonzo
Disallow: /

User-agent: gonzo*
Disallow: /

# aka Nutch
User-agent: SapphireWebCrawler
Disallow: /

# http://www.amfibi.com/cabot/
User-agent: Cabot
Disallow: /

# http://spider.acont.de/
User-agent: ACONTBOT
Disallow: /

User-agent: TurnitinBot
Disallow: /

User-agent: CatchBot
Disallow: /

User-agent: WebRankSpider
Disallow: /

User-agent: yacy
Disallow: /

User-agent: yacybot
Disallow: /

User-agent: Mail.Ru
Disallow: /

User-agent: SurveyBot
Disallow: /

User-agent: SurveyBot_IgnoreIP
Disallow: /

User-agent: Yanga WorldSearch Bot
Disallow: /

# http://www.setooz.com/oozbot.html
User-agent: OOZBOT
Disallow: /

# http://www.botje.com/plukkie.htm
User-agent: plukkie
Disallow: /

User-agent: http://www.uni-koblenz.de/~flocke/robot-info.txt
Disallow: /

User-agent: NaverBot
Disallow: /

User-agent: Yeti
Disallow: /

User-Agent: iis3bot
Disallow: /

User-agent: Gigabot
Disallow: /

# http://www.mojeek.com/bot.html
User-agent: MojeekBot
Disallow: /

User-agent: citenikbot
Disallow: /

User-Agent: Charlotte
Disallow: /

User-agent: Exabot
Disallow: /

User-agent: VEDENSBOT
Disallow: /

User-agent: Lexxebot
Disallow: /

User-agent: ViolaBot
Disallow: /

User-agent: Sosospider
Disallow: /

User-agent: Tagoobot
Disallow: /

User-agent: cityreview
Disallow: /

User-agent: Euripbot
Disallow: /

User-Agent: Butterfly
Disallow: /

User-agent: isara-search
Disallow: /

User-agent: Jyxobot
Disallow: /

User-agent: 008
Disallow: /

# specialists -------------------

User-agent: MLBot
Disallow: /

User-agent: libwww-perl
Disallow: /

User-agent: Nutch
Disallow: /

User-agent: nutch-agent
Disallow: /

User-agent: panscient.com
Disallow: /

User-agent: BotOnParade
Disallow: /

User-agent: Yandex
Disallow: /

User-agent: jobs.de-Robot
Disallow: /

User-agent: *
Disallow: /netz/
Disallow: /controls/
Disallow: /App_Themes/
Disallow: /WebResource.axd
Disallow: /Login.aspx
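# Illustration only (not part of the original file): a minimal sketch of how a
# well-behaved crawler could evaluate rules like the ones above before fetching
# a URL, using Python's standard urllib.robotparser. The rule excerpt is taken
# from the file above; the example bot name "SomeOtherBot" and the test paths
# are hypothetical. A real crawler would fetch the full robots.txt from the site.

from urllib.robotparser import RobotFileParser

rules = """\
User-agent: psbot
Disallow: /

User-agent: *
Disallow: /netz/
Disallow: /controls/
Disallow: /App_Themes/
Disallow: /WebResource.axd
Disallow: /Login.aspx
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# psbot is banned from the whole site; agents without their own record fall
# back to the "*" group and are only blocked from the listed paths.
print(parser.can_fetch("psbot", "/some/page"))          # False
print(parser.can_fetch("SomeOtherBot", "/netz/admin"))  # False
print(parser.can_fetch("SomeOtherBot", "/some/page"))   # True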