# See http://www.robotstxt.org/robotstxt.html for documentation on how to use the robots.txt file
User-agent: *
Sitemap: https://shopify.dev/sitemap.xml
Disallow: /*?*shpxid=*
Disallow: /beta/
Disallow: /workshops/
Disallow: /api/shipping-partner-platform/
Disallow: /docs/api/shipping-partner-platform/

# disallow Common Crawl bot in an effort to prevent being added to the Common Crawl dataset (used in GPT training)
User-agent: CCBot
Disallow: /apps/default_app_home

# disallow ChatGPT plugins from accessing certain routes
User-agent: ChatGPT-User
Disallow: /apps/default_app_home