#
# robots.txt
#
# This file is to prevent the crawling and indexing of certain parts
# of your site by web crawlers and spiders run by sites like Yahoo!
# and Google. By telling these "robots" where not to go on your site,
# you save bandwidth and server resources.
#
# This file will be ignored unless it is at the root of your host:
# Used:    http://example.com/robots.txt
# Ignored: http://example.com/site/robots.txt
#
# For more information about the robots.txt standard, see:
# http://www.robotstxt.org/robotstxt.html

User-agent: *

# New Rules

# Broken URL Structure

# Search and Login Pages
Disallow: /global-search

# Paths (clean URLs)
Disallow: /bin/register
Disallow: /bin/password
Disallow: /bin/login
Disallow: /bin/forgot-password
Disallow: /bin/api/register
Disallow: /bin/api/newsletter/subscribe
Disallow: /bin/api/forgot-password
Disallow: /bin/api/password
Disallow: /bin/api/login
Disallow: /bin/api/documents

# Paths (no clean URLs)
Disallow: /?q=admin/
Disallow: /?q=comment/reply/
Disallow: /?q=filter/tips/
Disallow: /?q=node/add/
Disallow: /?q=search/
Disallow: /?q=user/password/
Disallow: /?q=user/register/
Disallow: /?q=user/login/
Disallow: /?q=user/logout/

Sitemap: https://www.molto.de/sitemap.xml
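
The rules above can be sanity-checked with Python's standard urllib.robotparser. This is only an illustrative sketch: the local filename "robots.txt" and the sample URLs are assumptions, with the host taken from the Sitemap line above.

from urllib.robotparser import RobotFileParser

# Parse a local copy of the file above (assumed to be saved as "robots.txt").
# Fetching it live would also work via rp.set_url("https://www.molto.de/robots.txt") and rp.read().
rp = RobotFileParser()
with open("robots.txt", encoding="utf-8") as f:
    rp.parse(f.read().splitlines())

# Paths listed under Disallow should be blocked for every crawler ("*")...
print(rp.can_fetch("*", "https://www.molto.de/bin/login"))      # expected: False
print(rp.can_fetch("*", "https://www.molto.de/global-search"))  # expected: False

# ...while ordinary pages stay crawlable.
print(rp.can_fetch("*", "https://www.molto.de/"))               # expected: True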