# robots.txt for www.unitracc.com

Sitemap: https://www.unitracc.com/sitemap.xml.gz

# Define access restrictions for robots/spiders:
# http://www.robotstxt.org/wc/norobots.html

# By default we allow robots to access all areas of our site
# that are already accessible to anonymous users.
User-agent: *
Disallow:

# AFAWK, only the Yandex robot supports the Clean-param directive:
User-agent: Yandex
# https://yandex.ru/support/webmaster/robot-workings/clean-param.html?lang=en
Clean-param: set_language /

# Add Googlebot-specific syntax extensions to exclude forms
# that are repeated for each piece of content on the site.
# The wildcard is only supported by Googlebot:
# http://www.google.com/support/webmasters/bin/answer.py?answer=40367&ctx=sibling
User-agent: Googlebot
Disallow: /*sendto_form$
Disallow: /*folder_factories$
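
# Illustrative note (added here, not part of the original policy): in
# Googlebot's pattern extensions, "*" matches any sequence of characters
# and "$" anchors the pattern at the end of the URL, so "/*sendto_form$"
# blocks e.g. /some/document/sendto_form (a hypothetical path) but not a
# URL with anything after "sendto_form". Likewise, "Clean-param:
# set_language /" tells Yandex that the set_language query parameter does
# not change page content, so URLs differing only in that parameter are
# treated as duplicates of one canonical URL.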