#robots.txt for http://dhcsupplies.com

##############################################################################################################
#This file allows you to control web crawlers' access to specific pages on your site.  Web crawlers are
#programs that search engines run to analyze your site and index its content.  Common crawlers include
#Googlebot and bingbot.  These are the default rules defined for your site; they cover pages and
#directories that crawlers do not need to access.
##############################################################################################################
#

User-agent: *						#These rules apply to all crawlers
Disallow: /store/adm				#Crawlers do not need access to your console, so this rule disallows all console pages for crawlers
Disallow: /store/shopCa			#Crawlers cannot add items to a shopping cart, so prevent them from viewing the cart page
Disallow: /store/WriteR		#Crawlers cannot submit product reviews, so there is no need for them to view this page.  The content of product reviews is on a separate page that is allowed
Disallow: /store/addtoc			#This page is used in embedded commerce and allows items to be added to the cart.  It contains no content, so crawlers do not need to see it.
Disallow: /store/OnePageCh	#The checkout page generally has no SEO value, and crawlers cannot add items to a cart to reach it, so it is disallowed
Disallow: /store/checko		#This page contains no content
Disallow: /*Attrib=					#If you use attributes, these rules help you avoid duplicate-content warnings when customers filter by specific attributes
Disallow: /*?Attrib=				#See above
Disallow: /*attribs=				#See above
Disallow: /*Attribs=				#See above

#All crawlers not named below are blocked entirely.  Note that most parsers merge
#this group with the "User-agent: *" group above and apply the longest matching
#rule, so "Disallow: /" takes effect for any crawler without its own group
User-agent: *
Disallow: /

User-agent: Googlebot
Disallow: 
Crawl-delay: 240			#Note: Googlebot ignores Crawl-delay; Google's crawl rate is managed through Search Console instead

User-agent: Bingbot
Disallow: 

User-agent: Slurp
Disallow: 

User-agent: AdsBot-Google
Disallow: 

User-agent: Googlebot-Image
Disallow: 

User-agent: Googlebot-Mobile
Disallow: 

User-agent: Mediapartners-Google
Disallow: 

User-agent: Googlebot-News
Disallow: 

User-agent: Googlebot-Video
Disallow: 

User-agent: Google-Structured-Data-Testing-Tool
Disallow: 

User-agent: Google-Site-Verification
Disallow:
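The rules above can be sanity-checked locally with Python's standard urllib.robotparser. This is a minimal sketch: the agent name "SomeBot" and the sample URLs are made up for illustration, and note that robotparser does plain prefix matching (it does not support "*" wildcards inside paths) and honors only the first "User-agent: *" group, so its answers can differ from how Googlebot itself interprets the file.

```python
# Sketch: check a subset of the rules above with the standard library.
# "SomeBot" and the example URLs are hypothetical, for illustration only.
import urllib.robotparser

rules = """\
User-agent: *
Disallow: /store/adm
Disallow: /store/shopCa

User-agent: Googlebot
Disallow:
Crawl-delay: 240
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# /store/adm is a prefix rule, so any URL beginning with it is blocked
print(rp.can_fetch("SomeBot", "http://dhcsupplies.com/store/admin/login"))    # False
print(rp.can_fetch("SomeBot", "http://dhcsupplies.com/store/product1.html"))  # True
# Googlebot has its own group with an empty Disallow, i.e. allow everything
print(rp.can_fetch("Googlebot", "http://dhcsupplies.com/store/admin/login"))  # True
print(rp.crawl_delay("Googlebot"))                                            # 240
```

Because a named group completely replaces the wildcard group for that agent, Googlebot here is governed only by its own (empty) Disallow list, not by the `/store/...` rules.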
