We crawl websites responsibly to provide the backlink status data that is essential for Monitor Backlinks.

If you block some crawlers and want to make sure ours can still access your site, you might want to whitelist it.

Alternatively, if you are not a Monitor Backlinks user, you might simply want to block our crawler to avoid any load on your site (we crawl responsibly, so any load would be minimal).

There are two ways to do this:

  1. Update your robots.txt to block all crawling. Note that this applies to every crawler, including Google.
  2. Block or whitelist our crawler's user agent, MBCrawler/1.0 (see the example below).
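
For reference, a robots.txt file at your site root could look like one of the sketches below. They are alternatives, not meant to be combined, and the second one assumes our crawler matches rules addressed to the "MBCrawler" product token taken from its MBCrawler/1.0 user agent string.

```
# Option 1: disallow all crawling. This applies to every crawler
# that honors robots.txt, including Google.
User-agent: *
Disallow: /

# Option 2 (robots.txt form): address our crawler specifically.
# Assumes rules for the "MBCrawler" token apply to MBCrawler/1.0.
User-agent: MBCrawler
Disallow: /
```

To whitelist our crawler instead, you can leave the `Disallow:` line empty under the `User-agent: MBCrawler` group while keeping stricter rules for other crawlers. You can also block or allow the MBCrawler/1.0 user agent at the web server level if you prefer to filter on the User-Agent header directly.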