# See http://www.robotstxt.org/wc/norobots.html for documentation on how to use the robots.txt file
#
# To ban all spiders from the entire site uncomment the next two lines:
User-agent: *
Disallow: /checkout
Disallow: /cashier
Disallow: /cart
Disallow: /orders
Disallow: /ahoy
Disallow: /states
Disallow: /users
Disallow: /account
Disallow: /user_diet_preferences
Disallow: /api/frontend/notifications
Disallow: /api/farmy/delivery_slots/nearest_delivery_date
Disallow: /api/frontend/hubs/delivery_terms
Disallow: /hubs/promo_type_from_zipcode
Disallow: /ng/templates/content_promos/content_promo_dock
Disallow: /api/frontend/orders/current_free_products
Disallow: /api/frontend/orders/incomplete
Disallow: /ng/templates/partners/partner_return_to_btn

# Slow down Bingbot
User-agent: bingbot
Crawl-delay: 3
Disallow: /checkout
Disallow: /cashier
Disallow: /cart
Disallow: /orders
Disallow: /ahoy
Disallow: /states
Disallow: /users
Disallow: /account
Disallow: /api

# The Audisto Full Crawler
User-agent: audisto
Disallow: /checkout
Disallow: /cashier
Disallow: /orders
Disallow: /ahoy
Disallow: /states
Disallow: /users

# Semrush
User-agent: SemrushBot
Disallow: /

User-agent: SemrushBot-SA
Disallow: /