# See http://www.robotstxt.org/robotstxt.html for documentation on how to use
# the robots.txt file

User-agent: *

# Disallow search pages from being indexed because we'll be penalized for
# duplicate content
Disallow: /*search?q=

# Disallow URLs that add tour items to the cart
Disallow: /*/shopping_cart_items/new

# These pages require login
Disallow: /cart$