# See http://www.robotstxt.org/wc/norobots.html for documentation on how to use the robots.txt file

User-agent: *
Disallow: /calendar/
Disallow: /observations?
Disallow: /observations/?
Disallow: /observations.csv
Disallow: /observations.csv?
Disallow: /taxa/search
Disallow: /taxa/search?
Disallow: /*?
Disallow: /taxa/*/description$
Disallow: /taxa/*/map_layers$
Disallow: /listed_taxa/*
Disallow: /lifelists/*
Disallow: *.atom*
Disallow: *.csv*
Disallow: *.json*
Disallow: *page=*

User-agent: ia_archiver
Disallow:

User-agent: dotbot
Disallow: /

User-agent: Twitterbot
Disallow:
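To sanity-check how crawlers will read rules like these, Python's standard-library `urllib.robotparser` can evaluate a robots.txt against candidate URLs. A minimal sketch, with two caveats: the host `example.org` is a placeholder, and `urllib.robotparser` does not understand the `*` and `$` wildcards used in some of the rules above (it matches paths by plain prefix), so only prefix-style directives are exercised here.

```python
from urllib.robotparser import RobotFileParser

# Trimmed, prefix-only subset of the rules above; wildcard directives
# such as "Disallow: /*?" are omitted because this parser ignores them.
rules = """\
User-agent: *
Disallow: /calendar/
Disallow: /taxa/search

User-agent: ia_archiver
Disallow:

User-agent: dotbot
Disallow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Generic crawlers: blocked from /calendar/, allowed elsewhere.
print(rp.can_fetch("*", "https://example.org/calendar/2024"))        # False
print(rp.can_fetch("*", "https://example.org/observations/123"))     # True

# dotbot is blocked from everything; an empty Disallow allows everything.
print(rp.can_fetch("dotbot", "https://example.org/observations"))    # False
print(rp.can_fetch("ia_archiver", "https://example.org/calendar/"))  # True
```

An empty `Disallow:` (as for `ia_archiver` and `Twitterbot`) means "allow all", while `Disallow: /` blocks the entire site, which is why the two bots above get opposite answers for the same paths.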