We do our best to crawl sites comprehensively, but in some rare cases we don't crawl. These include:
 - websites that specifically block our bot in their robots.txt file
 - CDNs that block all bots except Google's from crawling
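For reference, a robots.txt block like the sketch below is what prevents a crawler from visiting a site. The bot name here is a placeholder, not our actual user agent; check our documentation for the real name if you want to allow or block our crawler.

```
# Hypothetical example: blocks a crawler named "ExampleBot"
# from the entire site. Replace "ExampleBot" with the actual
# user-agent string of the bot you want to block.
User-agent: ExampleBot
Disallow: /
```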

Again, these are very rare cases compared to the number of domains we crawl.

Did this answer your question?