Google Improves Search Console Crawl Report

CAPTCHAFORUM

Administrator
Google has launched an updated version of the Search Console crawl report to give site owners better insight into how Googlebot crawls and processes their sites.

The report gains several new features:
  • Total number of crawl requests, grouped by response code, crawled file type, crawl purpose, and Googlebot type.
  • Detailed host status information.
  • Example URLs showing exactly where requests occurred on the site.
  • A combined overview for properties with multiple hosts, plus support for domain properties.
Timing charts
Charts that show changes over time in total crawl requests, total download size, and average response time.

[Image: over-time charts]

Grouped crawl data
In the new version of the report, crawl request data is broken down by response, file type, crawl purpose, and Googlebot agent. You can view example URLs for each group by clicking a row in the grouping table:
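The same kind of grouping can be approximated from your own server access logs. The sketch below (with hypothetical log lines and a simplified file-type mapping; a real analysis would read log files and verify Googlebot hits via reverse DNS) counts requests by response code and by file type:

```python
import re
from collections import Counter
from urllib.parse import urlparse

# Hypothetical access-log lines in combined log format.
LOG_LINES = [
    '66.249.66.1 - - [10/Dec/2020:10:00:00 +0000] "GET /index.html HTTP/1.1" 200 5120 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [10/Dec/2020:10:00:01 +0000] "GET /style.css HTTP/1.1" 200 2048 "-" "Googlebot/2.1"',
    '66.249.66.2 - - [10/Dec/2020:10:00:02 +0000] "GET /old-page HTTP/1.1" 404 512 "-" "Googlebot/2.1"',
]

# Extract method, path, and status code from each line.
LINE_RE = re.compile(r'"(?P<method>\w+) (?P<path>\S+) [^"]+" (?P<status>\d{3})')

def file_type(path: str) -> str:
    """Map a URL path to a coarse file-type bucket, similar to the report's grouping."""
    p = urlparse(path).path
    ext = p.rsplit('.', 1)[-1].lower() if '.' in p else ''
    return {'html': 'HTML', 'css': 'CSS', 'js': 'JavaScript',
            'png': 'Image', 'jpg': 'Image'}.get(ext, 'Other')

by_response = Counter()
by_type = Counter()
for line in LOG_LINES:
    m = LINE_RE.search(line)
    if not m:
        continue
    by_response[m.group('status')] += 1
    by_type[file_type(m.group('path'))] += 1

print(dict(by_response))  # {'200': 2, '404': 1}
print(dict(by_type))      # {'HTML': 1, 'CSS': 1, 'Other': 1}
```

Keeping the raw lines per group (instead of only counts) would also reproduce the report's per-row example URLs.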

[Image: grouped crawl data]

Details of Host Status Issues
Host status information lets you check your site's overall availability to Google over the last 90 days.

[Image: host status details]
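Host status in the report covers signals such as DNS resolution, server connectivity, and robots.txt availability. A rough self-check along the same lines can be scripted; this is a simplified sketch (Google's actual checks are more involved), using only the standard library:

```python
import socket
import urllib.error
import urllib.request

def check_host(host: str) -> dict:
    """Rough host availability check: DNS resolution, server
    connectivity, and the HTTP status of /robots.txt."""
    status = {'dns': False, 'connect': False, 'robots_txt': None}
    try:
        socket.gethostbyname(host)
        status['dns'] = True
    except socket.gaierror:
        return status  # DNS failed; nothing further to check
    try:
        with urllib.request.urlopen(f'https://{host}/robots.txt', timeout=10) as resp:
            status['connect'] = True
            status['robots_txt'] = resp.status
    except urllib.error.HTTPError as e:
        status['connect'] = True       # server answered, robots.txt missing or blocked
        status['robots_txt'] = e.code
    except (urllib.error.URLError, OSError):
        pass                           # connection failed
    return status

print(check_host('example.com'))
```

Running such a check periodically and logging the results would give you your own 90-day availability history to compare against the report.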

For domain properties with multiple hosts, you can check the status of each of the main hosts shown in the report. This lets you evaluate the availability of all hosts in a property in one place.

[Image: multi-host domain property]

Overall, the new crawl report gives a clearer picture of how Googlebot crawls a site. With it you can:
  • Review crawl history in charts that track changes over time;
  • See the types and sizes of files returned by the site;
  • Inspect the details of crawl requests in the example URL lists;
  • Track site availability issues.
More information about the updated report is available in the Search Console Help.