Google Search Console – Step by Step Guide


Google Search Console is the official Google tool used by website owners to monitor the status of their websites in the Google search index, both in terms of indexing and, partially, their technical condition. It provides insight into how web crawlers perceive and interpret a given website.

If Google takes manual action against the website, Google Search Console is the only source of information about it. Alongside tools such as Ahrefs and Majestic, GSC is the most reliable source of information about a domain’s link profile. Thanks to the “Performance” tab (previously “Search Analytics”), you can monitor the visibility of the website in organic search results.

"Effectiveness" - domain visibility monitoring

The “Performance” tab in the new version of Google Search Console allows you to analyze the website’s visibility in organic search results in terms of three basic values:

  • the number of clicks,
  • the number of impressions,
  • and the average position.

Additionally, a fourth metric is calculated from the number of impressions and clicks: CTR (Click-Through Rate), i.e., the percentage of impressions that resulted in a click.
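To make the relationship concrete, here is a minimal Python sketch of the CTR calculation; the click and impression counts below are hypothetical:

```python
# CTR is simply clicks divided by impressions.
clicks = 320        # hypothetical number of clicks
impressions = 8000  # hypothetical number of impressions

ctr = clicks / impressions
print(f"CTR: {ctr:.2%}")  # CTR: 4.00%
```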

The number of impressions is the number of times a link to the website has been displayed to users. For text search results, the situation is simple: if one user receives one link to the icea-group.com domain for the query “website positioning”, one impression is counted. If the link is visible only on the second page of results, which the user never reaches, the impression is not counted.

In the case of graphic search results, which do not require going to the next page (infinite scroll), an impression is counted only when the graphic is actually displayed on the user’s screen.

The number of clicks is a simple numerical value showing how many times users clicked a link to a given domain in the search results. Because of the personalization and geolocation of search results, each search engine response may differ. User A, searching for the phrase “positioning” in Warsaw, will receive a different set of results than user B entering the same phrase in Poznań. With that in mind, the performance report provides the average position. This is nothing more than the arithmetic mean calculated for the website’s highest result in organic searches.

For example, if one query for “poznań positioning” displays a link to the grup-icea.pl domain in the 3rd position and another in the 5th, the report will show the value (3 + 5) / 2 = 4. Since the actual position of a phrase depends on the personalization and geolocation of results and is also subject to daily fluctuations, the average position is a safe and reliable metric for assessing the situation.

The performance report presents the data on a timeline and in the form of a dynamic table. Additionally, it offers many configuration options, such as:

  • search type (web/ image/ video),
  • date,
  • query,
  • page,
  • country,
  • device.
Data can also be exported to a CSV file or a Google Sheet. Below are some sample configurations (the first one is also sketched in code after the list):

1. the average position of the phrase “positioning” in the last six months,

2. the number of domain impressions over the past 12 months,

3. the number of impressions of links to the page on mobile devices compared to computers in the last three months,

4. the number of clicks in the last 6 months, year over year,

5. the number of impressions of the link to the domain.com home page in the last year,

6. the list of phrases for which the domain.com home page is displayed.
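The same report data can also be pulled programmatically. Below is a minimal Python sketch of configuration 1, using the Search Analytics query method of the Search Console API; it assumes the google-api-python-client package, an authorized token.json with the webmasters.readonly scope, and placeholder dates and property (https://example.com/):

```python
import csv

from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

creds = Credentials.from_authorized_user_file("token.json")  # placeholder token file
service = build("searchconsole", "v1", credentials=creds)

# Configuration 1 above: average position of the phrase "positioning"
# over the last six months (dates are placeholders).
body = {
    "startDate": "2024-01-01",
    "endDate": "2024-06-30",
    "dimensions": ["query"],
    "dimensionFilterGroups": [{
        "filters": [{
            "dimension": "query",
            "operator": "equals",
            "expression": "positioning",
        }]
    }],
}

response = service.searchanalytics().query(
    siteUrl="https://example.com/",  # placeholder property
    body=body,
).execute()

# Mirror the report's CSV export option.
with open("performance.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["query", "clicks", "impressions", "ctr", "position"])
    for row in response.get("rows", []):
        writer.writerow([row["keys"][0], row["clicks"],
                         row["impressions"], row["ctr"], row["position"]])
```

Each returned row carries clicks, impressions, ctr, and position for its key, which maps directly onto the columns of the report’s CSV export.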

"URL checking" - analysis of individual subpages of the domain

The URL Inspection tool is one way to verify whether a given URL is included in the site’s index. It reports possible errors in the process of crawling and indexing a subpage and also allows you to submit an individual request to index the address.

Details that can be found in the Address Status Report:

  • how the crawler found the address (e.g., via a sitemap),
  • when the subpage was last crawled and whether the crawl was successful,
  • which Googlebot crawled the page (desktop or smartphone),
  • whether the subpage is blocked from indexing (e.g., in the robots.txt file or by the robots meta tag),
  • what the canonical address of the subpage is (in two variants: declared by the user and selected by Google).
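Most of these details are also exposed by the URL inspection endpoint of the Search Console API. A minimal Python sketch, assuming the same placeholder credentials and property as in the earlier example:

```python
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

creds = Credentials.from_authorized_user_file("token.json")  # placeholder token file
service = build("searchconsole", "v1", credentials=creds)

result = service.urlInspection().index().inspect(body={
    "inspectionUrl": "https://example.com/subpage",  # placeholder URL
    "siteUrl": "https://example.com/",               # placeholder property
}).execute()

status = result["inspectionResult"]["indexStatusResult"]
print(status.get("coverageState"))    # e.g. "Submitted and indexed"
print(status.get("lastCrawlTime"))    # when the last crawl took place
print(status.get("crawledAs"))        # desktop or smartphone Googlebot
print(status.get("robotsTxtState"))   # whether robots.txt allows crawling
print(status.get("googleCanonical"))  # canonical selected by Google
print(status.get("userCanonical"))    # canonical declared by the user
```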

The “Test Live URL” option also allows you to verify whether a new or modified address will be correctly read by search engine robots.

For each tested subpage, the report includes:

  • the HTML code retrieved by the crawler,
  • a screenshot of the page seen by Googlebot,
  • server response header,
  • a list of resources (including graphics and scripts) divided into those that have been successfully loaded and those that have not been loaded,
  • a list of JavaScript console messages.

The last option offered by the URL Inspection tool is verifying the enhancements detected on a given subpage. These are primarily all structured data found on the website, but also, among other things, the test of how well the website is adapted to mobile devices such as smartphones and tablets. The Mobile Friendly Test is covered in a separate article.


"Index> Status" - global domain verification

The index coverage report helps you catch errors that Googlebot encountered while crawling and indexing your site. The standard version of the report covers all known subpages of a given domain, but if a sitemap has been submitted to Search Console, you can filter the list of messages, e.g., to include only the addresses from the map.
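Submitting a sitemap, and checking which sitemaps Search Console already knows about, can also be done through the sitemaps methods of the Search Console API. A minimal Python sketch, again with placeholder credentials and property:

```python
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

creds = Credentials.from_authorized_user_file("token.json")  # placeholder token file
service = build("searchconsole", "v1", credentials=creds)

site = "https://example.com/"  # placeholder property

# Submit a sitemap so the coverage report can be filtered by its addresses.
service.sitemaps().submit(siteUrl=site, feedpath=site + "sitemap.xml").execute()

# List all sitemaps known to Search Console for this property.
for sitemap in service.sitemaps().list(siteUrl=site).execute().get("sitemap", []):
    print(sitemap["path"], sitemap.get("lastDownloaded", "not downloaded yet"))
```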

Web robots are not infallible, and the errors described in this report often do not actually occur. Usually, it is enough to submit a request for re-verification of a given set of addresses. Nevertheless, the target should be zero errors in this report.

Tip: a large number of excluded addresses is perfectly normal. Googlebot tries to crawl all subpages available on the website, many of which are, for example, redundant duplicates.

The most common errors in the coverage report

Type | Recommended action if the address should be properly indexed
Submitted URL not found (404) | Verify whether the 404 error still occurs and fix it
Submitted URL contains the “noindex” tag | Change the robots meta tag value to “index, follow”
Submitted URL contains crawl errors | Verify the URL with the URL Inspection tool
Submitted URL blocked by the robots.txt file | Remove the directive blocking indexation from the robots.txt file or add an exception
Server error (5xx) | Determine the cause of the 5xx error and fix it
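Before reporting a fix, the conditions behind most of these errors can be checked locally. A rough Python sketch using the requests library and the standard urllib.robotparser module, with a hypothetical address:

```python
import urllib.robotparser

import requests

url = "https://example.com/subpage"  # hypothetical address from the report

# Check the HTTP response code (catches 404 and 5xx errors)...
resp = requests.get(url, timeout=10)
print("Status code:", resp.status_code)

# ...the X-Robots-Tag header and the robots meta tag (a crude "noindex" check
# that simply searches the raw HTML)...
print("X-Robots-Tag:", resp.headers.get("X-Robots-Tag", "not set"))
print("noindex in HTML:", "noindex" in resp.text.lower())

# ...and whether robots.txt blocks Googlebot from crawling the address.
parser = urllib.robotparser.RobotFileParser("https://example.com/robots.txt")
parser.read()
print("Crawlable by Googlebot:", parser.can_fetch("Googlebot", url))
```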

Messages for correctly indexed addresses

Type | Explanation
Page submitted and indexed | URLs submitted manually (via a sitemap or the URL Inspection tool) and indexed
Page indexed, not submitted in sitemap | Addresses found and indexed by Googlebot on its own

Common reasons for excluding an address from the index

Type | Should action be taken?
Alternate page with a valid canonical tag | No
Duplicate, submitted URL not marked as canonical | Yes, it is worth verifying which subpage Google selected as canonical for this address; this may indicate a larger scale of duplication on the site
Duplicate, the user has not marked the page as canonical | Yes, it is worth verifying why Google considered this subpage a duplicate
Crawl anomaly | Yes, it is worth checking with the URL Inspection tool what caused the problem (a 4xx or 5xx code?)
Not found (404) | No, 404 errors are a perfectly natural situation; however, it is worth checking whether there is an appropriate subpage to which the non-existent address can be redirected
Soft 404 | Yes, subpages that do not really exist should return the 404 Not Found code; this message means the server returns 200 OK instead
Page excluded by the “noindex” tag | No, assuming that “noindex” was implemented intentionally and knowingly
Page discovered, currently not indexed | No
Page blocked by the robots.txt file | Yes, if the page should not be indexed, it is worth replacing the robots.txt block with the “noindex” tag
Page contains a redirect | No, assuming that all redirects on the website are intentional
Page crawled, currently not indexed | No
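Soft 404s in particular are easy to confirm yourself: fetch the address and compare the response code with what it should be. A minimal Python sketch with hypothetical URLs:

```python
import requests

# Hypothetical addresses that should no longer exist on the site.
removed_urls = [
    "https://example.com/old-product",
    "https://example.com/deleted-category",
]

for url in removed_urls:
    code = requests.get(url, timeout=10, allow_redirects=False).status_code
    if code == 200:
        # Soft 404 candidate: the page is gone, but the server says 200 OK.
        print(f"{url} -> 200 OK (should return 404 Not Found)")
    else:
        print(f"{url} -> {code}")
```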

After taking corrective action on a selected error or warning, it is worth reporting the fix. The details view of each issue contains a “Validate Fix” button. Depending on the number of affected URLs, verification may take up to several days.

