#****************************************************************************
# robots.txt
# : Robots, spiders, and search engines use this file to determine which
#   content they should *not* crawl while indexing your website.
# : This system is called "The Robots Exclusion Standard."
#****************************************************************************

User-agent: *
Disallow: /cgi-bin/
Disallow: /en/register
Disallow: /en/login
Disallow: /fr/register
Disallow: /fr/login
Disallow: /de/register
Disallow: /de/login

User-agent: Googlebot
Allow: /

User-agent: Googlebot-Image
Allow: /_nuxt/img/

User-agent: Mediapartners-Google
Allow: /

User-agent: AdsBot-Google
Allow: /

User-agent: AdsBot-Google-Mobile
Allow: /

User-agent: Bingbot
Allow: /

User-agent: Msnbot
Allow: /

User-agent: msnbot-media
Allow: /_nuxt/img/

User-agent: Applebot
Allow: /

User-agent: Yandex
Allow: /

User-agent: YandexImages
Allow: /_nuxt/img/

User-agent: Slurp
Allow: /

User-agent: DuckDuckBot
Allow: /

User-agent: Qwantify
Allow: /

Sitemap: https://newco.ch/sitemap.xml

User-agent: *
Disallow: