# See http://www.robotstxt.org/robotstxt.html for documentation on how to use the robots.txt file

# Allow crawling of all content
User-agent: *
Disallow: /sign_in
Disallow: /register

Sitemap: https://www.geodienste.ch/sitemap.xml
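The rules above permit all user agents to crawl the site except the /sign_in and /register paths. As a sketch (not part of the original file), the behavior a compliant crawler would derive from these rules can be checked with Python's standard-library `urllib.robotparser`:

```python
from urllib.robotparser import RobotFileParser

# The Disallow rules from the robots.txt above, inlined for illustration.
rules = """\
User-agent: *
Disallow: /sign_in
Disallow: /register
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Everything is crawlable except the two disallowed paths.
print(parser.can_fetch("*", "https://www.geodienste.ch/"))          # True
print(parser.can_fetch("*", "https://www.geodienste.ch/sign_in"))   # False
print(parser.can_fetch("*", "https://www.geodienste.ch/register"))  # False
```

Note that `Sitemap:` lines are not consulted by `can_fetch`; they only point crawlers at the sitemap URL.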