Google Search Console is the official Google tool that website owners use to monitor the status of their websites in the Google search index, covering both indexing and, in part, technical aspects. It provides insight into how web crawlers perceive and interpret a given website.
If a manual action is imposed on the website, Google Search Console is the only source of information about it. Alongside tools such as Ahrefs and Majestic, GSC is the most reliable source of information about a domain’s link profile. Thanks to the “Performance” tab (previously “Search Analytics”), you can monitor the visibility of the website in organic search results.
The “Performance” tab in the new version of Google Search Console allows you to analyze the website’s visibility in organic search results in terms of three basic metrics: the number of impressions, the number of clicks, and the average position.
Additionally, a fourth metric, CTR (click-through rate), is calculated from the number of impressions and clicks.
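As a quick illustration (the numbers below are made up, not taken from any real report), the relationship between these values can be expressed in a few lines of Python:

```python
# Hypothetical example values; the real numbers come from the performance report.
impressions = 1200   # how many times links to the site appeared in search results
clicks = 84          # how many times users clicked those links

# CTR is simply clicks divided by impressions, usually expressed as a percentage.
ctr = clicks / impressions
print(f"CTR: {ctr:.2%}")  # -> CTR: 7.00%
```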
The number of impressions is the number of times links to the website were displayed to users. For text search results the situation is simple: if a user is shown one link to the icea-group.com domain for the query “website positioning”, one impression is counted. If the link is visible only on the second page of results, which the user never reaches, the impression is not counted.
In the case of image search results, which do not require going to the next page (infinite scroll), an impression is counted only when the image is actually displayed on the user’s screen.
The number of clicks is a simple numerical value showing how many times users clicked a link to the given domain in the search results. Because of personalization and geolocation, search results may differ from user to user. User A, searching for the phrase “positioning” in Warsaw, will receive a different set of results than user B entering the same phrase in Poznań. With that in mind, the performance report provides the average position. This is nothing more than the arithmetic mean, calculated for the website’s highest result in organic searches.
For example, if one search for “poznań positioning” displays a link to the grup-icea.pl domain in 3rd position and another in 5th, the report will show the value (3 + 5) / 2 = 4. Since the actual position of a phrase depends on personalization and geolocation of the results, and is also subject to daily fluctuations, the average position is a safe and reliable metric for assessing the situation.
The performance report presents the data on a timeline and in the form of a dynamic table. Additionally, it offers many configuration options, such as filtering by query, page, device, country, or date range.
Data can also be exported to a CSV file or a Google Sheet. Below are some sample configurations:
1. the average position of the phrase “positioning” in the last six months,
2. the number of domain impressions over the past 12 months,
3. the number of impressions of links to the page on mobile devices compared to computers in the last three months,
4. the number of clicks in the last six months compared year over year,
5. the number of impressions of a link to the domain.com home page in the last year,
6. the list of phrases for which the domain.com home page is displayed.
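The configurations above use the web interface, but similar data can also be retrieved programmatically. The snippet below is only a minimal sketch of querying the Search Console API with the google-api-python-client library; the property URL, date range, and service-account key file are placeholders, and the service account must first be granted access to the property.

```python
# Minimal sketch: pulling performance data from the Search Console API.
# The site URL, dates and credentials file below are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
credentials = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES  # hypothetical key file
)
service = build("searchconsole", "v1", credentials=credentials)

# Roughly equivalent to configuration 2 above: impressions for the whole
# property over a 12-month period, broken down by query.
response = service.searchanalytics().query(
    siteUrl="https://example.com/",   # placeholder property
    body={
        "startDate": "2023-01-01",    # adjust to the desired 12-month range
        "endDate": "2023-12-31",
        "dimensions": ["query"],
        "rowLimit": 100,
    },
).execute()

for row in response.get("rows", []):
    print(row["keys"][0], row["impressions"], row["clicks"], row["position"])
```

Each returned row also contains a ctr field, so clicks, impressions, CTR and average position can all be processed further, just like the CSV or Google Sheets export.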
The URL Inspection Tool is one way to verify whether a given URL is included in the Google index. It reports possible errors in the process of crawling and indexing a subpage and also allows you to submit an individual request to index the address.
The “Test live URL” option available in the tool also allows you to verify whether a new or modified address will be read correctly by search engine robots.
The last option offered by the URL Inspection Tool is verifying the enhancements detected on a given subpage. These are primarily any structured data (microdata) found on the page, but also, among other things, how well the page is adapted to mobile devices such as smartphones and tablets. The Mobile Friendly Test, which checks how a website is adapted to mobile devices, is covered in a separate article.
The index coverage (crawl status) report helps you catch errors that Googlebot encountered while crawling and indexing your site. The standard version of the report covers all known subpages in a given domain, but if a sitemap has been submitted in Search Console, you can filter the list of messages, e.g. to those that apply only to addresses from the sitemap.
Web robots are not infallible, and many times the errors described in this report do not actually occur. Usually, it is enough to submit a request for re-verification of a given set of addresses. Nevertheless, the goal should be zero errors in this report.
[tip] A large number of excluded addresses is perfectly normal. Googlebot tries to crawl all subpages available on the website, many of which are, for example, redundant duplicates.
| Type | Recommended action if the address should be properly indexed |
|---|---|
| Submitted URL not found (404) | Verify whether the 404 error still occurs and fix it |
| Submitted URL contains the “noindex” tag | Change the meta robots tag value to “index, follow” |
| Submitted URL contains crawl errors | Verify the URL with the URL Inspection tool |
| Submitted URL blocked by robots.txt | Remove the directive blocking the subpage from the robots.txt file or add an exception |
| Server error (5xx) | Determine the cause of the 5xx error and fix it |
Messages for correctly indexed addresses
| Type | Explanation |
|---|---|
| Page submitted and indexed | URLs submitted manually (via a sitemap or the URL Inspection tool) and indexed |
| Page indexed but not submitted in sitemap | Addresses indexed after being found by Googlebot on its own |
Common reasons for excluding an address from the index
| Type | Should action be taken? |
|---|---|
| Alternate page with proper canonical tag | No |
| Duplicate, submitted URL not selected as canonical | Yes, it is worth verifying which subpage Google has selected as canonical for this address. This may indicate a larger scale of duplication on the site. |
| Duplicate without user-selected canonical | Yes, it is worth verifying why Google considered this subpage a duplicate. |
| Crawl anomaly | Yes, it is worth checking with the URL Inspection tool what caused the problem (a 4xx or 5xx response code?) |
| Not found (404) | No, 404 errors are a perfectly natural situation. However, it is worth verifying whether there is an appropriate subpage to which the non-existent address can be redirected. |
| Soft 404 | Yes, subpages that do not really exist should return the 404 Not Found response code; this message means the server is returning 200 OK |
| Page excluded by the “noindex” tag | No, assuming that “noindex” was implemented intentionally and knowingly |
| Discovered – currently not indexed | No |
| Blocked by robots.txt | Yes, if the page should not be indexed, it is worth replacing the block in robots.txt with the “noindex” tag |
| Page with redirect | No, assuming that all redirects on the website were implemented intentionally |
| Crawled – currently not indexed | No |
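Two of the rows above (blocking in robots.txt versus the “noindex” tag) can be checked by hand before reaching for the URL Inspection tool. The helper below is only an illustrative sketch for a hypothetical address; it uses Python’s standard robots.txt parser and a naive regular expression for the meta robots tag, so it is no substitute for what Search Console itself reports.

```python
# Illustrative helper (hypothetical URL): checks whether an address is blocked
# in robots.txt and whether its HTML carries a "noindex" meta robots tag.
import re
import urllib.request
import urllib.robotparser
from urllib.parse import urlparse

url = "https://example.com/some-subpage/"  # placeholder address

# 1. Is the address blocked for Googlebot in robots.txt?
parsed = urlparse(url)
robots = urllib.robotparser.RobotFileParser(f"{parsed.scheme}://{parsed.netloc}/robots.txt")
robots.read()
blocked = not robots.can_fetch("Googlebot", url)

# 2. Does the page itself ask not to be indexed via a meta robots tag?
html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
noindex = bool(re.search(r'<meta[^>]+name=["\']robots["\'][^>]*noindex', html, re.I))

print(f"Blocked by robots.txt: {blocked}")
print(f"Meta noindex present:  {noindex}")
```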
After taking corrective action on a given error or warning, it is worth reporting the fix: the error details view contains a “Validate Fix” button. Depending on the number of affected URLs, verification may take up to several days.