# See https://www.robotstxt.org/robotstxt.html for documentation on how to use the robots.txt file

User-agent: *
Disallow: /works? # cruel but efficient
Disallow: /autocomplete/
Disallow: /downloads/
Disallow: /external_works/
# disallow indexing of search results
Disallow: /bookmarks/search?
Disallow: /people/search?
Disallow: /tags/search?
Disallow: /works/search?

User-agent: Googlebot
Disallow: /autocomplete/
Disallow: /downloads/
Disallow: /external_works/
# Googlebot is smart and knows pattern matching
Disallow: /works/*?
Disallow: /*search?
Disallow: /*?*query=
Disallow: /*?*sort_
Disallow: /*?*selected_tags
Disallow: /*?*view_adult
Disallow: /*?*tag_id
Disallow: /*?*pseud_id
Disallow: /*?*user_id
Disallow: /*?*pseud=

User-agent: CCBot
Disallow: /

User-agent: GPTBot
Disallow: /

User-agent: ChatGPT-User
Disallow: /

User-agent: Slurp
Crawl-delay: 30
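
# Notes on the bot-specific groups above: CCBot (Common Crawl), GPTBot and
# ChatGPT-User (both OpenAI) are blocked from the entire site. Slurp
# (Yahoo's crawler) is asked to wait at least 30 seconds between requests;
# Crawl-delay is a non-standard directive, and Googlebot ignores it.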