Error ErrCrawlBlockedByRobotsTxt


Some links were not visited because they were blocked by a robots.txt file.


A robots.txt file is a digital "keep out" sign placed on a web server by the server administrator. Adding the following entry to the top of the robots.txt file, before any Disallow: directives, exempts PowerMapper from blocks intended for other web crawlers:

 User-agent: PowerMapper
 Allow: /
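
The effect of the edited file can be checked locally with Python's standard urllib.robotparser module, which applies the same user-agent group and allow/disallow matching. The sketch below is illustrative only; the /private/ path and example.com URLs are placeholder values, not part of the product.

 # Check which user agents a robots.txt file allows, using Python's
 # standard library parser. The rules and URLs below are placeholders.
 from urllib.robotparser import RobotFileParser

 lines = [
     "User-agent: PowerMapper",   # entry added for PowerMapper
     "Allow: /",                  # allow it to fetch everything
     "",
     "User-agent: *",             # all other crawlers...
     "Disallow: /private/",       # ...stay blocked from /private/
 ]

 rp = RobotFileParser()
 rp.parse(lines)

 # PowerMapper matches its own group, so the Allow rule applies.
 print(rp.can_fetch("PowerMapper", "https://example.com/private/a.html"))   # True

 # Other crawlers fall through to the wildcard group and remain blocked.
 print(rp.can_fetch("SomeOtherBot", "https://example.com/private/a.html"))  # False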


This page describes a web site issue detected by SortSite Desktop and OnDemand Suite.

Rule ID: ErrCrawlBlockedByRobotsTxt