Links blocked by robots.txt Error

Description

Some links were not visited because they were blocked by a robots.txt file or robots meta tag.

Help

A robots.txt file is a digital “keep out” sign placed on a web server by the server administrator. Adding the following entries to the robots.txt file allows PowerMapper to crawl the entire site while leaving blocks intended for other web crawlers in place:

 User-agent: PowerMapper
 Allow: /
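To see how a crawler interprets such an entry, the effect can be sketched with Python's standard urllib.robotparser module. The robots.txt content below (including the Disallow rule for other crawlers and the "OtherBot" name) is a hypothetical example, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: PowerMapper is allowed everywhere,
# all other crawlers are blocked from /private/
robots_txt = """User-agent: PowerMapper
Allow: /

User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# The PowerMapper entry matches first, so the Allow rule applies
print(parser.can_fetch("PowerMapper", "https://example.com/private/page.html"))  # True

# Other crawlers fall through to the wildcard entry and are blocked
print(parser.can_fetch("OtherBot", "https://example.com/private/page.html"))  # False
```

This illustrates why the Allow entry bypasses blocks aimed at other crawlers: a robots.txt consumer applies the most specific matching User-agent group, so rules under "User-agent: *" no longer apply to PowerMapper once it has its own group.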

Applicable standards

Change history

  • 2.0 Dec 2007 Added.

This page describes a web site issue detected by SortSite Desktop and OnDemand Suite.

Rule ID: ErrCrawlBlockedByRobotsTxt