# This robots.txt file controls crawling of URLs under https://www.flexispot.es/.
# Googlebot and all other crawlers receive the same set of rules: the checkout,
# order and promotional paths and URL patterns listed below are disallowed;
# everything else may be crawled.

User-agent: Googlebot
Disallow: /buyer/
Disallow: /checkout/
Disallow: /bbs/
Disallow: /order-tracking
Disallow: /point
Disallow: /blog/Workplace-Health/*
Disallow: *?utm_source=awin*
Disallow: */7th-anniversary
Disallow: /game-center/

User-agent: *
Disallow: /buyer/
Disallow: /checkout/
Disallow: /bbs/
Disallow: /order-tracking
Disallow: /point
Disallow: /blog/Workplace-Health/*
Disallow: *?utm_source=awin*
Disallow: */7th-anniversary
Disallow: /game-center/

Sitemap: https://www.flexispot.es/sitemap.xml
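For reference, below is a minimal sketch of how these rules can be checked programmatically with Python's standard-library urllib.robotparser. Note that this parser implements the original robots exclusion protocol and does not expand Googlebot-style wildcards (* and $), so wildcard rules such as *?utm_source=awin* are matched literally; only the plain path-prefix rules are evaluated reliably here. The example URLs (e.g. /products/desk) are illustrative, not taken from the site.

from urllib import robotparser

# The rules above, pasted verbatim (blank lines separate the two groups).
ROBOTS_TXT = """\
User-agent: Googlebot
Disallow: /buyer/
Disallow: /checkout/
Disallow: /bbs/
Disallow: /order-tracking
Disallow: /point
Disallow: /blog/Workplace-Health/*
Disallow: *?utm_source=awin*
Disallow: */7th-anniversary
Disallow: /game-center/

User-agent: *
Disallow: /buyer/
Disallow: /checkout/
Disallow: /bbs/
Disallow: /order-tracking
Disallow: /point
Disallow: /blog/Workplace-Health/*
Disallow: *?utm_source=awin*
Disallow: */7th-anniversary
Disallow: /game-center/

Sitemap: https://www.flexispot.es/sitemap.xml
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Path-prefix rules behave as expected.
print(parser.can_fetch("Googlebot", "https://www.flexispot.es/checkout/cart"))  # False
print(parser.can_fetch("*", "https://www.flexispot.es/order-tracking"))         # False
print(parser.can_fetch("Googlebot", "https://www.flexispot.es/products/desk"))  # True (hypothetical URL)

# The Sitemap directive is also exposed (Python 3.8+).
print(parser.site_maps())  # ['https://www.flexispot.es/sitemap.xml']

Since directives are prefix matches by default, a trailing * (as in /blog/Workplace-Health/*) is redundant under Google's documented matching rules, though it is harmless.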
The following keywords were found. Check whether this page is well optimized for each specific keyword.
(Desirable)