Google encountered an error when trying to access this URL. We may have encountered a DNS error or timeout, for instance. Possible URL unreachable errors include:

5xx error: Your server may have been down or busy when we tried to access the page. Likely reasons for this error are an internal server error or a server busy error; see RFC 2616 for a complete list of these status codes. If the server is busy, it may have returned an overloaded status to ask Googlebot to crawl the site more slowly. In this case, we'll return again later to crawl additional pages.

DNS error: We couldn't communicate with the DNS server when we tried to access the page. This could be because your server is down, or there is an issue with the DNS routing to your domain. Make sure that your domain is resolving correctly and try again.

robots.txt unreachable: Before we crawled the pages of your site, we tried to check your robots.txt file to ensure we didn't crawl any pages that you had roboted out. However, your robots.txt file was unreachable. To make sure we didn't crawl any pages listed in that file, we postponed our crawl. When this happens, we return to your site later and crawl it once we can reach your robots.txt file. Note that this is different from a 404 response when looking for a robots.txt file: if we receive a 404, we assume that a robots.txt file does not exist, and we continue the crawl.

Network error: We encountered a network error when we tried to access the page.
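The robots.txt policy described above boils down to a small decision rule. Here is a minimal sketch of that rule, assuming we reduce each fetch attempt to either an HTTP status code or a network/DNS failure; the function name and return values are illustrative, not part of any real crawler's API.

```python
def robots_decision(status):
    """Map the outcome of a robots.txt fetch to a crawl decision.

    status: an HTTP status code (int), or None for a DNS error,
    timeout, or other network failure.
    Returns "crawl" or "postpone" per the policy described above.
    """
    if status is None:
        return "postpone"   # DNS error or timeout: come back later
    if status == 404:
        return "crawl"      # no robots.txt exists: continue the crawl
    if 500 <= status <= 599:
        return "postpone"   # server down or busy (e.g. 500, 503): retry later
    if 200 <= status <= 299:
        return "crawl"      # robots.txt fetched: crawl, obeying its rules
    return "postpone"       # ambiguous outcome: postpone to stay safe
```

The key asymmetry is that a 404 means "nothing is disallowed" and the crawl proceeds, while any failure that leaves the file's contents unknown (5xx, DNS, network) postpones the crawl so no disallowed page is fetched by accident.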